A Welcome Step Forward, but What About the Night?
On May 9, 2024, the U.S. National Highway Traffic Safety Administration (NHTSA) issued a final rule mandating that all passenger vehicles and light trucks sold in the United States after September 2029 be equipped with an automatic emergency braking (AEB) system. This is a significant step toward mainstreaming a technology that is already standard in new luxury vehicles and available as an enhanced safety upgrade in most mass-market models. But while NHTSA's decision is welcome news for driver, pedestrian, and cyclist safety, the effectiveness of these systems at night remains a concern, given the limitations of the cost-efficient sensors used in mass-market vehicles.
The automotive electronics development cycle is considerably longer than that of complex consumer electronics such as smartphones, a gap sometimes likened to dog years. The NHTSA 2029 mandate is expected to trigger a push among automotive OEMs to meet the new requirements economically within the next five years. AEB technology, first introduced by Volvo in 2010, has proven effective enough over time to become pervasive. The most advanced AEB systems combine a variety of sensor types (radar, camera, lidar, ultrasonic) with the silicon processing power needed to improve accuracy and reduce false positives, which can themselves cause the collisions the systems are designed to prevent. However, last month's mandate is bound to affect some OEMs, forcing them to balance accuracy, BOM cost, and system complexity for mass-market vehicles.
A critical issue for AEB systems is their performance in low-light conditions. Research supports the need for improved nighttime AEB performance. According to Jessica Cicchino's study in Accident Analysis & Prevention (AAP), "AEB with pedestrian detection was associated with significant reductions of 25%-27% in pedestrian crash risk and 29%-30% in pedestrian injury crash risk. However, there was no evidence that the system was effective in dark conditions without street lighting…" (Jessica Cicchino, Accident Analysis & Prevention, May 2022)
The effectiveness of the automotive CMOS image sensors commonly used in these systems diminishes after dark. This is particularly concerning because drivers, with limited visibility and reaction time, are most dependent on AEB and other ADAS systems at night. Pedestrian fatalities in the U.S. have nearly doubled since 2001, with over 7,500 deaths nationwide in 2021, and about 77% of those fatalities occur after dark. Although NHTSA's ruling is a positive move toward improving safety, the challenge of cost-effective nighttime performance remains for the nearly 80% of vehicle-on-pedestrian fatalities the mandate most needs to mitigate.
Fortunately, AI-based computational imaging offers a promising solution. By applying real-time denoising with neural networks running on embedded neural processing units (NPUs), the nighttime range and accuracy of automotive image sensors can be significantly enhanced. This AI denoising software runs on existing automotive SoCs with embedded NPUs and removes temporal and spatial noise from the RAW image feed before further processing, allowing analog gain and exposure time to be increased without the sensor-noise penalty that would normally accompany them.
This method does not require any modification or recalibration of the existing image signal processing (ISP) pipeline. In initial OEM road tests, AI denoising works with both high-cost low-light-capable sensors and mainstream automotive CMOS sensors, effectively putting them on "steroids" for better and more accurate night vision. That improved night vision translates into earlier and more accurate computer vision results, such as nighttime pedestrian detection in AEB systems.
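To make the pipeline placement concrete, here is a minimal sketch of where a RAW-domain AI denoiser sits relative to an unchanged ISP and downstream pedestrian detection. This is my own simplified illustration in Python/PyTorch, not Visionary.ai's implementation; the names RawDenoiser, isp_process, and detect_pedestrians are hypothetical placeholders.

```python
# Sketch only: illustrates RAW-domain denoising ahead of an unchanged ISP.
import torch
import torch.nn as nn

class RawDenoiser(nn.Module):
    """Toy temporal+spatial denoiser: the previous denoised frame is fed back
    as a second input channel so the network can average noise over time."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, raw_frame, prev_denoised):
        x = torch.cat([raw_frame, prev_denoised], dim=1)
        # Predict the noise residual and subtract it from the noisy frame.
        return raw_frame - self.net(x)

def process_frame(raw_frame, prev_denoised, denoiser, isp_process, detect_pedestrians):
    # 1. Denoise in the RAW domain, before the unchanged ISP. Because noise is
    #    removed here, analog gain / exposure can be raised at the sensor.
    denoised = denoiser(raw_frame, prev_denoised)
    # 2. Existing ISP pipeline runs as-is (no recalibration).
    rgb = isp_process(denoised)
    # 3. Downstream computer vision (e.g., pedestrian detection for AEB).
    return detect_pedestrians(rgb), denoised

# Example wiring with stand-in ISP and detector functions:
denoiser = RawDenoiser().eval()
prev = torch.zeros(1, 1, 720, 1280)       # temporal state (previous denoised frame)
raw = torch.rand(1, 1, 720, 1280)         # one normalized RAW frame
with torch.no_grad():
    detections, prev = process_frame(
        raw, prev, denoiser,
        isp_process=lambda x: x.repeat(1, 3, 1, 1),  # placeholder for the real ISP
        detect_pedestrians=lambda rgb: [],           # placeholder detector
    )
```

The key point the sketch illustrates is ordering: the denoiser consumes the sensor's RAW output and hands a cleaner RAW frame to the same ISP and detection stack already in the vehicle, which is why no ISP recalibration is needed.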
Since this is a software upgrade to existing and planned ECUs, leveraging current and roadmap Tier-2 fabless SoCs, the time required for integration, testing, and productization is much lower than for hardware-based alternatives.
I am proud to be part of a dynamic team of AI computational image scientists and software engineers who are changing the world by delivering technology that will potentially mitigate thousands of fatalities in the coming years.
For more information on how AI-based computational imaging can improve the nighttime performance and accuracy of ADAS, as well as human vision-assist systems, contact me via LinkedIn or consult one of our Tier-2 fabless partners about their adoption plans for AI-based computational imaging from Visionary.ai.
David Jarmon david.jarmon@visionary.ai