LIDAR vs. Camera — Is Sensor Fusion the Future of Autonomous Vehicles?
Are cameras enough to achieve level 5 autonomy?

Can autonomous vehicles navigate using cameras alone, or should manufacturers move towards sensor fusion?

The question has long been a point of contention in the world of autonomous vehicles. In my opinion, the only way to unlock the full potential of autonomous driving and achieve higher levels of autonomy safely is to combine all critical sensors so that they complement each other.

Here’s why:

Let’s begin with #cameras. They are great at capturing colour and providing texture and contrast data. But cameras only work reliably when lighting is adequate and weather conditions are benign. Most importantly, a single camera can’t provide distance information – at least two cameras are needed to reconstruct a 3D image. The strengths and weaknesses of cameras are summarised in the graphic below.

Strengths and weaknesses of Camera vs LiDAR
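
To illustrate why a single camera cannot measure distance, here is a minimal stereo-depth sketch using the classic relationship depth = focal length × baseline / disparity. All sensor values below are hypothetical and only for illustration, not taken from any particular camera setup.

# Minimal sketch: depth from stereo disparity (all values hypothetical)
focal_length_px = 1000.0   # focal length in pixels (assumed)
baseline_m = 0.12          # distance between the two cameras in metres (assumed)
disparity_px = 8.0         # pixel offset of the same feature in the left/right images

depth_m = focal_length_px * baseline_m / disparity_px
print(f"Estimated depth: {depth_m:.1f} m")   # -> 15.0 m

A scene point seen by only one camera produces no disparity and therefore no measured depth, which is why mono-camera distance estimates have to rely on learned priors rather than direct measurement.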

#Radars can help circumvent some of these problems: they measure distance with great precision, directly measure the speed of objects, and keep working in adverse weather conditions. But due to their low resolution, they have a hard time differentiating and classifying objects – not a good thing if you are on the road and your car can’t decide whether the object ahead is a car changing lanes or just a stray shopping bag!

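As a small illustration of how radar gets speed almost for free, the sketch below converts a Doppler shift into a radial speed via v = Δf·c / (2·f0). The carrier frequency and the measured shift are assumed values for a typical 77 GHz automotive radar, not data from a real sensor.

# Minimal sketch: radial speed from a radar Doppler shift (values assumed)
SPEED_OF_LIGHT = 3.0e8      # m/s
carrier_hz = 77.0e9         # typical automotive radar carrier (77 GHz)
doppler_shift_hz = 5000.0   # measured Doppler shift (hypothetical)

radial_speed_mps = doppler_shift_hz * SPEED_OF_LIGHT / (2 * carrier_hz)
print(f"Radial speed: {radial_speed_mps:.1f} m/s")   # roughly 9.7 m/s, about 35 km/h

Note that this yields only the radial component of the velocity and says nothing about what the object actually is, which is exactly the classification gap described above.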

Ultrasonic sensors can detect objects regardless of material or colour. Still, with a very limited range of less than 10 m, they are only helpful for applications like parking assistance and blind-spot detection.

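For completeness, ultrasonic ranging is simply a time-of-flight measurement with sound instead of light. The echo time below is a hypothetical value; the point is that at roughly 343 m/s the usable range stays in the single-digit-metre region.

# Minimal sketch: ultrasonic distance from echo time of flight (echo time assumed)
SPEED_OF_SOUND = 343.0    # m/s in air at ~20 °C
echo_time_s = 0.012       # round-trip time of the echo (hypothetical)

distance_m = SPEED_OF_SOUND * echo_time_s / 2   # divide by 2: the pulse travels out and back
print(f"Obstacle at about {distance_m:.2f} m")  # about 2.06 m, comfortably within parking-assist range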

This is where #LiDARs can be crucial, as they operate at both short and long ranges. Not only can they create a high-resolution 3D map of the environment, they also help classify objects with great precision, largely independently of environmental factors.

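A LiDAR point cloud is built from individual returns, each of which is just a range plus two angles. The sketch below converts one such return into a 3D point; the coordinate convention (x forward, y left, z up) and the sample return are assumptions chosen for illustration.

import math

# Minimal sketch: one LiDAR return (range, azimuth, elevation) -> 3D point
def lidar_return_to_xyz(range_m, azimuth_deg, elevation_deg):
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)   # forward
    y = range_m * math.cos(el) * math.sin(az)   # left
    z = range_m * math.sin(el)                  # up
    return x, y, z

# Hypothetical return: 25 m away, 10 degrees to the left, 2 degrees below the sensor
print(lidar_return_to_xyz(25.0, 10.0, -2.0))

Repeat this for hundreds of thousands of returns per second and you get the high-resolution 3D map mentioned above.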

If we want to make safe and dependable Autonomous Vehicles a reality, sensor fusion incorporating the best elements of LiDAR and other sensors must be our go-to strategy.

Autonomous Vehicles Sensor Fusion
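
To make the idea of sensors complementing each other concrete, here is a deliberately tiny measurement-level fusion sketch: a noisy camera-based distance estimate and a precise radar range are combined by inverse-variance weighting. The numbers and variances are made up for illustration; production stacks use full Kalman or particle filters tracking many object states.

# Minimal sketch: inverse-variance fusion of two distance measurements (values assumed)
def fuse(measurements):
    # measurements: list of (value, variance) pairs -> (fused value, fused variance)
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * val for (val, _), w in zip(measurements, weights)) / sum(weights)
    return fused, 1.0 / sum(weights)

camera_distance = (32.0, 4.0)    # metres; high variance (mono depth is uncertain)
radar_distance  = (30.5, 0.25)   # metres; low variance (radar range is precise)
print(fuse([camera_distance, radar_distance]))   # ~30.6 m, dominated by the radar reading

The fused estimate is tighter than either sensor alone, and if one sensor drops out (glare, rain, occlusion) the other still carries the estimate, which is the whole argument for fusion.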

I dissected the topic in detail in the blog post below – do give it a read!

And I would love to hear your thoughts on the topic.

#autonomousvehicles #sensorfusion #selfdrivingcars #objecttracking #technology #innovation #autonomouscars

Simon Schwinger

Managing Director / CEO at Jenaer Antriebstechnik

3y

Interesting read, thanks for sharing Florian! At #Jabil #Optics we understand the underlying technologies and provide support on the way to #massproduction.

Daniel Ruiz

Non Exec Director | Strategy, Regulation, Safety, Transport, Growth

3y

There's another dimension: intelligent infrastructure. The video analytics work of companies like Vivacity Labs must surely de-risk and accelerate the deployment of vehicles equipped with any kind of on board system? Another consideration is that public acceptance and trust will be dramatically increased if there is redundancy in the systems providing situational awareness. #intelligentinfrastructure #videoanalytics #Centre for Connected and Autonomous Vehicles

Ankur Sabmbh

GIS / Sonar / Marine / Corporate Communicator at M&M

3y

Amazing. On the other hand, automotive giants are increasing their investments in LiDAR startups, and the LiDAR market is projected to be worth $2.8 billion by 2025: https://www.prnewswire.co.uk/news-releases/lidar-market-worth-2-8-billion-by-2025-exclusive-report-by-marketsandmarkets-tm--877786028.html

Paul Newman

Paul Newman CBE FREng FIEEE FIET is a creator, pioneer and innovator of autonomous vehicle (AV) technology.

3y

I reckon: use vision and radar, and maybe lidar. Try not to use GNSS as it will let you down. Fuse early and fuse late. Have independent sensing and independent algorithmic processing routes through the stack. Fuse in several ways – there are many to choose from.

Holger Loebel

Sensor fusion software for ADAS and AD.

3y

Thanks Florian. I appreciate that you mention the complexity that comes along with sensor fusion. We see many organizations starting their sensor fusion journey, and first results are typically quick to achieve thanks to the know-how that is available today. However, when the system grows and has to be hardened for formal requirements like ISO 26262 and ASIL, the effort explodes if the software is not constructed right. We at BASELABS introduced a standardized development library to help organizations build their safety-critical sensor fusion software in a flexible and scalable way for production. Our CEO Robin Schubert just started a series of articles shedding light on this sourcing strategy. You are invited to read the first one here: https://www.dhirubhai.net/pulse/underestimated-factor-your-automated-driving-software-robin-schubert
