CES 2024 - Automotive Review: Hydrogen Expansion, AI-Driven Perception, and the Rise of Software-Defined Vehicles
The largest consumer electronics and automotive show, CES 2024, showcased a range of innovations, particularly in the automotive sector. AI-enabled everything dominated the headlines, with examples as quirky as a $10-per-month AI baby-cry translator that tells parents whether their baby is hungry or needs a diaper change. Beyond such novelties, however, the MarketsandMarkets team observed several significant trends in the automotive industry that we believe will have far-reaching implications. It is important to note that this report is not a comprehensive summary of all announcements; it focuses on those we believe will significantly shape the future in-car experience.
1. Hyundai is going all-in on expanding its hydrogen business: In the US, Hyundai and Kia have been successfully producing and selling appealing battery electric vehicles such as the Ioniq 5, Ioniq 6, EV6, and EV9, featuring cutting-edge 800V architecture and software-defined features. This raises the question: why did Hyundai devote so much of its CES presence to the hydrogen economy?
2. Diversification of the ADAS-to-L3 automated driving perception market: This CES highlighted practical advancements in the ADAS and automated driving ecosystem rather than purely autonomous systems. Several sensor suppliers, perception software suppliers, and system integrators focused on ADAS and L2 applications, including support for European GSR-mandated features such as Intelligent Speed Assistance (ISA), which takes effect in 2024, as well as more complex Level 2, 2+, 3, and 4 automated driving features. Key players that caught our attention include Mobileye, Luminar, StradVision, LeddarTech, and Owl Autonomous Imaging.
- At CES, Mobileye unveiled its comprehensive portfolio spanning ADAS to Level 4 AD solutions. The portfolio uses the EyeQ6 High SoC along with cameras, radar, and lidar as needed, depending on the level and complexity. Mobileye can support European GSR features like Intelligent Speed Assistance (ISA) using vision and its REM map (a simplified sketch of this kind of speed-limit logic follows these Mobileye points).
- Besides current OEMs such as Porsche, VW, and Zeekr, Mobileye also announced a large OEM contract covering 17 models that will utilize its SuperVision, Chauffeur, and Drive platforms, with SOP starting in 2026. Mobileye has always focused on the ADAS and semi-automated driving markets, and this roadmap makes clear that it is taking a vision-and-map-fused approach for lower-level ADAS and AD systems and a redundant-sensor approach, including lidar and radar, for higher levels of automation with an expanded ODD.
- Across its roadmap, Mobileye is largely continuing its black-box approach centered on the EyeQ chips and its proven perception software, now adding components such as the REM map and RSS rules. This strategy is expected to work for many OEMs, as evidenced by Mobileye's recently announced OEM wins.
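To make the ISA point concrete, below is a minimal, hypothetical sketch of how a vision-plus-map speed-limit function can be structured: reconcile the limit read from a camera-detected sign with the limit stored in the map, and warn when the vehicle exceeds the effective limit. All names and the override rule are our own illustrative assumptions, not Mobileye's implementation.

```python
# Hypothetical sketch of Intelligent Speed Assistance (ISA) logic.
# Function and field names are illustrative, not Mobileye's API.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SpeedLimitInputs:
    camera_limit_kph: Optional[float]  # limit read from a detected traffic sign, if any
    map_limit_kph: Optional[float]     # limit stored in the map for the current road segment
    vehicle_speed_kph: float


def effective_speed_limit(inputs: SpeedLimitInputs) -> Optional[float]:
    """A freshly observed sign wins; otherwise fall back to the map value."""
    if inputs.camera_limit_kph is not None:
        return inputs.camera_limit_kph
    return inputs.map_limit_kph


def isa_warning(inputs: SpeedLimitInputs, tolerance_kph: float = 3.0) -> bool:
    """Return True when the driver should be warned for exceeding the limit."""
    limit = effective_speed_limit(inputs)
    if limit is None:
        return False  # no limit known for this road segment
    return inputs.vehicle_speed_kph > limit + tolerance_kph


if __name__ == "__main__":
    sample = SpeedLimitInputs(camera_limit_kph=50.0, map_limit_kph=60.0, vehicle_speed_kph=57.0)
    print(isa_warning(sample))  # True: the sign-read 50 km/h limit overrides the map's 60 km/h
```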
- StradVision's solution, SVNet, is expected to be integrated into 1.5 million vehicles across 50 models in China and Europe by 2024. SVNet scales across L1 to L3 applications and supports various vision systems, ranging from basic front-vision ADAS meeting EU GSR requirements to SurroundVision (SVM) and Multivision, all providing support in the perception and path-planning stages.
- At CES, the company announced improvements to its 3D perception that allow for more accurate measurements. Given the increasing focus on camera-based systems for ADAS through L2, StradVision, with its scalable solution, will be a crucial player in the ADAS/AD ecosystem, partnering with Tier 1s and directly with OEMs.
- StradVision's key differentiator is that it directly challenges competitors' black-box model (silicon plus perception software plus map), which many OEMs have wanted to move away from. SVNet can be ported across compute platforms, giving OEMs the flexibility to choose based on the vehicle model, the compute budget they can afford, and the level of ADAS/AD system they are developing.
- At CES, LeddarTech showcased its low-level sensor fusion-based perception software, in which the AI model runs on fused raw data from camera, lidar, radar, HD map, and GNSS. The company claims that fusing the sensor data first and then running the perception model delivers better results than running perception on each sensor's object data and fusing the objects afterward (the contrast is sketched in code after these LeddarTech points).
- Like the other perception vendors, LeddarTech showed an architecture supporting ADAS to L2/L2+ solutions that can run on TI and other compute platforms. Its key differentiator is the ability to perform the sensor fusion and run its own perception model, since many OEMs do not favor vision-only systems and, at minimum, want an HD map and radar for redundancy as they scale to hands-free systems.
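The low-level versus object-level fusion distinction is easier to see in code. The toy sketch below contrasts the two pipelines on a shared bird's-eye-view grid; every function here is an invented stub for illustration and has nothing to do with LeddarTech's actual software, but it shows why fusing raw evidence first lets weak detections from several sensors reinforce each other before a single perception model runs.

```python
# Minimal, hypothetical contrast between object-level (late) fusion and
# low-level (early) fusion. All functions are illustrative stubs.
import numpy as np

GRID = (64, 64)  # toy bird's-eye-view occupancy grid


def rasterize(points: np.ndarray) -> np.ndarray:
    """Project raw sensor points (x, y in metres) onto the BEV grid."""
    grid = np.zeros(GRID, dtype=np.float32)
    idx = np.clip((points[:, :2] + 32).astype(int), 0, 63)
    grid[idx[:, 0], idx[:, 1]] = 1.0
    return grid


def detect(grid: np.ndarray) -> list[tuple[int, int]]:
    """Toy 'perception model': return occupied cells as detected objects."""
    return [tuple(c) for c in np.argwhere(grid > 0.5)]


def late_fusion(camera_pts, lidar_pts, radar_pts):
    """Run perception per sensor, then merge the per-sensor object lists."""
    per_sensor = [detect(rasterize(p)) for p in (camera_pts, lidar_pts, radar_pts)]
    return set().union(*map(set, per_sensor))  # naive object-level association


def early_fusion(camera_pts, lidar_pts, radar_pts):
    """Fuse raw data into one representation first, then run perception once,
    so weak evidence from several sensors can reinforce the same cell."""
    fused = sum(rasterize(p) for p in (camera_pts, lidar_pts, radar_pts)) / 3.0
    return detect(fused)


if __name__ == "__main__":
    cam = np.array([[5.0, 3.0]])
    lid = np.array([[5.2, 3.1], [10.0, -4.0]])
    rad = np.array([[10.1, -3.9]])
    print("late :", late_fusion(cam, lid, rad))
    print("early:", early_fusion(cam, lid, rad))
```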
- At CES, Luminar unveiled new use cases for its IRIS+ lidar, including automatic emergency steering (AES), which can brake and swerve the vehicle into the adjacent lane to avoid a last-moment collision. At city speeds, the solution reacted faster than a traditional camera- and radar-based system in the demo, avoiding a collision with a child dummy that appeared at the last second.
- We also had the chance to demo Luminar's HD mapping solution, which uses its proprietary fingerprint technology to generate an HD map on the go for a specific route. The demo, using Luminar's IRIS+ lidar, its HD map, and its full stack, learned the driver's route in a parking lot, created an HD map of it, and could then drive the same route hands-off. The system currently works at low speeds, and Luminar aspires to take it to highway speeds. Best of all, the data used to generate the HD map never leaves the car, so there are no data ownership or privacy concerns.
- This route-memory function and on-the-fly HD map generation are crucial differentiators for Luminar (a rough sketch of the general idea follows below). They allow OEMs to scale their Level 2, 2+, and 3 solutions to a larger ODD, including complex urban conditions, with localization and mapping already accounted for. Traditional HD map providers, by contrast, must map additional roads in advance, which takes time and investment and slows OEMs down.
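As a rough mental model of such a route-memory function, the sketch below records compact lidar "fingerprints" with their poses while the driver drives a route, then matches later scans against that stored route to localize and replay it. The fingerprinting metric and all class and field names are assumptions made for illustration; Luminar's proprietary technology is not public, and this is not it.

```python
# Hypothetical "learn a route, then drive it" memory function.
import numpy as np


class RouteMemory:
    def __init__(self, match_threshold: float = 0.2):
        self.keyframes: list[dict] = []  # stored on the vehicle only
        self.match_threshold = match_threshold

    @staticmethod
    def fingerprint(scan: np.ndarray) -> np.ndarray:
        """Compress a lidar scan (N x 3 points) into a small range histogram
        that is cheap to store and compare."""
        ranges = np.linalg.norm(scan, axis=1)
        hist, _ = np.histogram(ranges, bins=16, range=(0, 80), density=True)
        return hist

    def record(self, scan: np.ndarray, pose: tuple[float, float, float]) -> None:
        """While the driver drives the route manually, store fingerprint + pose."""
        self.keyframes.append({"fp": self.fingerprint(scan), "pose": pose})

    def localize(self, scan: np.ndarray):
        """On a later trip, match the live scan against the stored route and
        return the remembered pose to follow, or None if off the known route."""
        if not self.keyframes:
            return None
        fp = self.fingerprint(scan)
        dists = [np.abs(fp - kf["fp"]).sum() for kf in self.keyframes]
        best = int(np.argmin(dists))
        return self.keyframes[best]["pose"] if dists[best] < self.match_threshold else None


if __name__ == "__main__":
    memory = RouteMemory()
    scan = np.random.default_rng(0).uniform(-40, 40, size=(500, 3))
    memory.record(scan, pose=(12.0, 3.5, 0.1))
    print(memory.localize(scan))  # returns the remembered pose on the learned route
```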
3. Tier 1s are moving to practical innovations – seats and eCorner modules: Leading Tier 1 suppliers such as Magna and Hyundai Mobis focused on ADAS and SDVs and showcased practical innovations such as the long-rail seating system and eCorner drive modules.
4. Startups moving SDV efforts into the mass market: Sonatus, a key company in the SDV ecosystem, showcased use cases of its Sonatus Vehicle Platform that will enable mass-market OEMs to transition faster to software-defined vehicles and personalization.
5. AI-enabled assistants will enhance the in-car voice experience: With generative AI and large language models (LLMs), in-car voice assistants are expected to become more sophisticated. Voice assistants have improved gradually in the last few years with cloud-connected options like Amazon Alexa and Google Assistant, yet their ability to handle complex conversations remains limited. With Gen AI, these assistants are expected to handle complex queries, proactively engage with the user, and deliver human-like responses even in the harsh acoustic environment of a car cabin. Mercedes-Benz, BMW, and VW introduced next-generation digital assistants utilizing Gen AI capabilities; a minimal sketch of how such an assistant loop can be structured follows below.
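For readers curious what the step-change looks like architecturally, here is a minimal, vendor-neutral sketch of an LLM-backed assistant loop: the free-form utterance and current vehicle state go to a generative model, which returns both a natural-language reply and a whitelisted vehicle action. `call_llm`, the JSON schema, and the function names are placeholders we invented; none of this reflects the Mercedes-Benz, BMW, or VW implementations.

```python
# Hypothetical sketch of an LLM-backed in-car assistant loop. `call_llm` is a
# placeholder for whichever hosted or on-board model an OEM integrates.
import json


def call_llm(prompt: str) -> str:
    """Placeholder: a real system would invoke a generative model here.
    This stub returns a canned JSON action so the sketch runs end to end."""
    return json.dumps({"action": "set_cabin_temperature",
                       "arguments": {"celsius": 21},
                       "reply": "Sure, cooling the cabin to 21 degrees."})


# Whitelist of vehicle functions the assistant is allowed to trigger.
VEHICLE_FUNCTIONS = {
    "set_cabin_temperature": lambda celsius: print(f"[HVAC] target set to {celsius} C"),
}


def handle_utterance(utterance: str, vehicle_context: dict) -> str:
    """Send the free-form request plus vehicle state to the model, then execute
    whatever whitelisted vehicle function it selects."""
    prompt = (
        "You are an in-car assistant. Vehicle state: " + json.dumps(vehicle_context)
        + "\nDriver said: " + utterance
        + "\nRespond as JSON with keys action, arguments, reply."
    )
    decision = json.loads(call_llm(prompt))
    action = VEHICLE_FUNCTIONS.get(decision.get("action"))
    if action:
        action(**decision.get("arguments", {}))
    return decision.get("reply", "Sorry, I could not help with that.")


if __name__ == "__main__":
    context = {"cabin_temp_c": 27, "speed_kph": 110, "fuel_pct": 40}
    print(handle_utterance("I'm feeling a bit warm, can you fix that?", context))
```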
Conclusion - In summary, the automotive segment of CES 2024 focused on gradual innovation for production cars, including perception frameworks, Gen AI-enabled voice assistants, and next-gen seating systems. The MarketsandMarkets team, led by Praveen Chandrasekar (VP - Automotive), is actively researching and publishing studies on these technologies impacting the automotive industry. For more information, or to get in touch about collaborations, drop him a mail at [email protected]