Eyes, Ears and Nose of Autonomous Vehicles
Angad Singi
Do any of you remember the Lexus 2054 from the movie ‘Minority Report’ (2002)?
In the film, the Lexus 2054 is part of a networked transit system that encompasses all cars on the road. With a colour-changing paint job, the Lexus works as an autonomous car with a manual-override system that allows the driver to take back control of the vehicle, seemingly an excellent compromise for sharing duty between human and machine.
In 2002 this seemed like pure science-fiction imagination, yet today we are closer than ever to self-driving autonomous vehicles. Some companies and enthusiasts envision Level 5 autonomous vehicles becoming widespread on the road by 2025.
I am no expert or fortune teller who can precisely predict the timeline, but I believe that autonomous vehicles are coming whether we like it or not. Driving licences will soon become a thing of the past, and we will be left only with hobbyist or passionate drivers. The future of personal transport will be very safe, with very, very few accidents.
So I am taking this opportunity to write a series of articles explaining the working of autonomous vehicles and the different tech stacks and models that multiple companies are working with.
But before diving deep into the working of autonomous vehicles, let’s explore in this article how a machine senses its environment. Excited?
Just as humans have eyes to see, a nose to smell, ears to hear, a tongue to taste and skin to feel texture, machines need sensors to sense what is going on around them.
Currently, we humans do the sensing and decision-making and pass commands to the vehicle by pressing a pedal or steering the wheel.
So what does a vehicle or a machine use to sense the environment? Cameras, LiDAR and radar, along with a bunch of other sensors. These are the primary sensory organs of all such machines.
Cameras
Cameras are a widely used and mature technology.
If machines are to drive like humans, they ought to see like humans, so visual recognition of objects is the way to go.
Cameras provide video, which is broken down into individual frames. These frames are then analysed by a CNN (Convolutional Neural Network), a deep learning algorithm. The CNN identifies the objects in each image and feeds information about the environment to the vehicle's decision-making module while it is driving. This helps the car avoid collisions, make lane changes safely, and even read the text on signboards.
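To make that pipeline concrete, here is a minimal sketch of single-frame object detection using a pretrained CNN from PyTorch/torchvision. The model choice, the 0.8 confidence threshold and the file name frame.jpg are all my own illustrative assumptions, not what any particular carmaker actually runs.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Pretrained Faster R-CNN detector (COCO classes include cars, people,
# traffic lights and stop signs).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# "frame.jpg" stands in for one frame extracted from the camera's video stream.
frame = to_tensor(Image.open("frame.jpg").convert("RGB"))

with torch.no_grad():
    detections = model([frame])[0]  # dict with "boxes", "labels", "scores"

# Keep only confident detections; a real stack would hand these to the planner.
for box, label, score in zip(detections["boxes"], detections["labels"], detections["scores"]):
    if score > 0.8:
        print(f"class {label.item()} at {[round(v, 1) for v in box.tolist()]} ({score:.2f})")
```

In a real vehicle this runs continuously on every frame, usually on dedicated hardware, but the structure is the same: frame in, labelled boxes out.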
Pros of Cameras
- Reliable and cheap to produce.
- Cameras can detect RGB information, i.e. they can sense colours.
- With infrared lighting, they can be used at night as well.
Cons of Cameras
- Glare from direct sunlight and bad weather conditions like heavy rain and snowfall can blind the cameras.
- Cameras lack precision in depth and range sensing. They can see an object but cannot precisely tell how far away or how big it is.
LiDAR
LiDAR stands for Light Detection and Ranging. It uses pulses of light to detect objects, much as radar uses radio waves. These pulses determine the distance and range of an object, providing much-needed data to self-driving cars. A LiDAR system sends thousands of pulses every second to build a 3D point-cloud map of its surroundings.
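As a back-of-the-envelope illustration of the two ideas above (assumed numbers, not real sensor firmware): ranging from a pulse's round-trip time, and turning one range-plus-beam-angle measurement into a point of the 3D cloud.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_pulse(round_trip_s: float) -> float:
    """Time-of-flight ranging: half the round-trip time multiplied by c."""
    return C * round_trip_s / 2.0

def to_point(r: float, azimuth_deg: float, elevation_deg: float):
    """One (range, beam direction) measurement -> Cartesian (x, y, z)."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (r * math.cos(el) * math.cos(az),
            r * math.cos(el) * math.sin(az),
            r * math.sin(el))

# A pulse that returns after ~667 ns hit something roughly 100 m away;
# with the beam aimed 30 degrees left and 2 degrees up, that is one cloud point.
r = distance_from_pulse(667e-9)
print(f"{r:.1f} m ->", tuple(round(v, 1) for v in to_point(r, 30.0, 2.0)))
```

Repeat this hundreds of thousands of times per second across a spinning or scanning beam, and you get the point-cloud map the paragraph describes.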
Pros of LiDAR
- LiDAR can spot objects up to 100 meters or so away and measure distances with an accuracy of up to 2cm.
- LiDAR works in any lighting condition and handles adverse weather such as wind, rain and snow better than cameras do.
Cons of LiDAR
- It takes a considerable amount of processing power to interpret up to a million measurements every second and then translate them into actionable data.
- LiDAR sensors are also complex, with many relying on moving parts, which makes them more vulnerable to damage.
- LiDAR is affected by wavelength stability and detector sensitivity. The laser’s wavelength can drift with variations in temperature, while a poor SNR (Signal-to-Noise Ratio) reduces the sensitivity of the LiDAR detector.
- LiDAR is expensive, bulky and not aesthetically pleasing.
- LiDAR cannot visually recognise an object and is therefore always used together with cameras.
Radars
RADAR stands for Radio Detection And Ranging. It works by emitting electromagnetic (EM) waves that reflect back when they meet an obstacle. Since it uses radio waves rather than visible light, it can operate in almost any condition.
FMCW (Frequency-Modulated Continuous Wave) radar is one of the most popular types and is used abundantly in vehicles.
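In FMCW radar, the sensor transmits a frequency "chirp" and mixes the returning echo with what it is currently transmitting; the resulting beat frequency is proportional to range. Here is a toy version of that arithmetic, with a made-up chirp bandwidth and duration rather than the parameters of any real automotive sensor:

```python
C = 299_792_458.0   # speed of light, m/s
BANDWIDTH = 1e9     # chirp sweep bandwidth: 1 GHz (assumed)
CHIRP_TIME = 50e-6  # chirp duration: 50 microseconds (assumed)

def range_from_beat(beat_hz: float) -> float:
    """R = c * f_beat / (2 * slope), where slope = bandwidth / chirp time."""
    slope = BANDWIDTH / CHIRP_TIME
    return C * beat_hz / (2.0 * slope)

# With these parameters, a 2 MHz beat frequency means a target ~15 m ahead.
print(f"{range_from_beat(2e6):.1f} m")
```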
Pros of Radars
- Radar is impervious to adverse weather conditions, working reliably in dark, rainy, or even foggy conditions.
- Radar is cheap and can be installed discreetly.
Cons of Radars
- Current 24GHz sensors offer only limited resolution: they let the car ‘see’ the world, but the picture painted is somewhat blurry and noisy.
- Radar also cannot visually recognise an object and is therefore always used with cameras.
The development of more accurate 77GHz ‘mmWave’ radar sensors should help reduce the blur, measuring distance and detecting speed changes with more precision than 24GHz sensors can. 77GHz sensors are also comparatively smaller.
Conclusion
With each sensor having its advantages and disadvantages, autonomous vehicles are unlikely to rely on just one system to sense the environment. I believe it has to be an intelligent combination of these and other sensors that lets the vehicle view and navigate the world.
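A minimal sketch of what that "intelligent combination" can mean in practice: fusing two noisy range estimates, say one from radar and one from LiDAR, with an inverse-variance weighted average. This is the simplest relative of the Kalman-filter fusion real stacks use, and the noise figures below are assumptions for illustration only.

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Inverse-variance weighted average; returns (fused estimate, fused variance)."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Radar reads 50.4 m with ~0.5 m noise; LiDAR reads 50.02 m with ~2 cm noise.
dist, var = fuse(50.4, 0.5 ** 2, 50.02, 0.02 ** 2)
print(f"fused distance: {dist:.2f} m")  # dominated by the more precise LiDAR
```

The fused estimate leans on whichever sensor is currently more trustworthy, which is exactly why sun glare on a camera or a noisy radar return need not blind the whole system.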
In the following article, I shall explain the working of LiDAR-based autonomous vehicles, i.e. those of all the companies except Tesla.