5G NR rel-16 high band automotive localization and implementations for Smart Cities
Barak Rosenberg
I am an accomplished AI specialist, founder, entrepreneur, manager, programmer, and architect, with a rich background in the hi-tech industry, management, architecture, programming, and AI/ML (including LLM edge/cloud).
There are two dramatically new revolutions in the automotive field:
- The Electric Vehicle (EV) revolution, which strives to replace all Internal Combustion Engine (ICE) vehicles with quieter, more effective, and more efficient electric vehicles. This revolution has already started and grows every day.
- The safety revolution, which strives to make transportation safer, smarter, more effective, more reliable, and more convenient, with less congestion.
We are going to write only about the safety revolution: the purpose of adding visual sensors to a vehicle is to understand the vehicle's surroundings, and we will show how this can be done in a totally new, revolutionary way using Smart City infrastructure that includes localization, HD maps, and V2X to gain this Situational Awareness, meaning an understanding of the vehicle's surroundings, or what we see here:
What we can see in this film is SpearEye's live display of a dynamic HD map with real lane coordinates, consisting of ~27,000 lane segments and ~900 links/roads in Berlin, Germany. The map includes lanes with several moving vehicles, each of which calculates its own location (which is called localization) and broadcasts it to other vehicles in its vicinity so they can understand the environment around them in real time. The system can be thought of as an upgraded Google Maps/Waze/HERE WeGo with lanes and multiple dynamic objects on top of it, updated in real time every 50 milliseconds. It can run in each and every vehicle to give it the Situational Awareness needed for the next stages of safety applications.
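To make the broadcast idea concrete, here is a minimal sketch of the kind of per-vehicle localization message that could be sent every 50 milliseconds (20 Hz). The field names and JSON encoding are purely illustrative assumptions, not taken from SpearEye or any V2X standard:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical per-vehicle localization broadcast, sent every 50 ms.
# Field names are illustrative, not from any V2X message standard.
@dataclass
class LocalizationMsg:
    vehicle_id: str
    lon: float          # WGS84 longitude, decimal degrees
    lat: float          # WGS84 latitude, decimal degrees
    speed_mps: float    # speed in m/s
    heading_deg: float  # azimuth from north, degrees
    timestamp_ms: int   # milliseconds since epoch

def encode(msg: LocalizationMsg) -> bytes:
    """Serialize a broadcast message to JSON bytes."""
    return json.dumps(asdict(msg)).encode()

def decode(payload: bytes) -> LocalizationMsg:
    """Rebuild the message a receiving vehicle would see."""
    return LocalizationMsg(**json.loads(payload))

msg = LocalizationMsg("veh-42", 13.347624, 52.530670, 13.9, 87.5, 1_700_000_000_000)
assert decode(encode(msg)) == msg
```

Each receiving vehicle can place such messages directly onto its HD-map lanes, which is what produces the multi-vehicle display described above.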
The current method of gaining safety uses visual sensors (cameras, radars, lidars) with AI capabilities to build location awareness around the vehicle:
AI uses neural networks, which consist of layers of "neurons". Data is supplied to the network as input, and the "weights" in every "neuron" are fitted to give the desired result using an algorithm called backpropagation (other algorithms work the same way: supply a defined input to the network and "fix" the weights to match the desired results). The problem with these algorithms is that they have no intuition and are trained only for specific inputs, giving roughly 70-80% accuracy in "recognizing" general objects like a "car", without any notion of what a real car or a human is in every size, every kind of light, and every color, the way a human brain has. Please read: https://en.wikipedia.org/wiki/Backpropagation . This is called training a network, not learning, and such a network cannot recognize lanes that simply do not exist on the road (for example in junctions), see beyond its line of sight, or cope with difficult lighting or bad weather. In any case, a general recognition rate of 70-80% is not enough in the automotive field.
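As a toy illustration of the "fit the weights to the desired output" loop described above, here is a single sigmoid neuron trained by gradient descent on the OR truth table. This is only a sketch of the mechanism, nowhere near a real object detector:

```python
import numpy as np

# Minimal backpropagation-style training loop: one sigmoid neuron
# fitted to the OR function. Illustrative only.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])  # OR truth table (desired outputs)

w = rng.normal(size=2)  # the "weights" that get fitted
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    out = sigmoid(X @ w + b)            # forward pass
    grad = (out - y) * out * (1 - out)  # error * sigmoid derivative
    w -= 0.5 * X.T @ grad               # propagate error back to weights
    b -= 0.5 * grad.sum()

print(np.round(sigmoid(X @ w + b)))  # → [0. 1. 1. 1.]
```

The loop "fixes" the weights until the outputs match the targets; the network has learned nothing about what OR means, only how to reproduce these four examples, which mirrors the training-versus-learning distinction above.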
The visual sensors that are used and tested are:
- Cameras: use AI to recognize objects; work badly in direct sunlight, lack of light, or bad weather; have problems estimating distance to the objects they recognize; require special hardware; work best in static scenes and good lighting conditions; and several cameras are needed to cover the vehicle's whole surroundings.
- Radars: don't require AI and work in a high frequency band, typically 76-81 GHz. They are great for exact measurement of distance to the object ahead, work in every weather, and can be used for collision detection or keeping distance from the vehicle ahead, as in adaptive cruise control. Several different radars are required to cover the vehicle's whole surroundings.
- Ultrasonic: don't require AI, mostly use 40-70 kHz frequencies, and are used in short proximity to the vehicle for exact distance from obstacles with accuracy of ~1 cm. They are mostly used for parking, measuring very close proximity of up to 11 meters.
- Lidars: use precise laser technology with AI, sending laser beams and calculating accurate distances to specific points. They have accuracy problems in bad weather such as rain, are currently expensive, and must be mounted outside the vehicle to perceive the environment in 360 degrees, which makes them even more vulnerable to bad weather, dirt, and vandalism. They can be used in good weather to create HD maps.
Example: an integration of radars and lidar that uses AI and a GPU, depends on good weather, and works only within line of sight.
Another example: a configuration that relies on visual sensors with localization from GPS. The lidar depends on weather and needs to be cleaned, and several cameras are needed to try to understand the vehicle's surroundings. We can see the cost and sophistication of such a system, done the "old" way.
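The ranging principle shared by the radar, ultrasonic, and lidar sensors above is time of flight: distance is recovered from the round-trip time of a pulse. A minimal sketch, with illustrative numbers:

```python
# Time-of-flight ranging: distance = propagation speed * round-trip
# time / 2 (the pulse travels out and back). Values are illustrative.
C_LIGHT = 299_792_458.0  # radar/lidar: speed of light, m/s
C_SOUND = 343.0          # ultrasonic: speed of sound in air, m/s

def range_from_round_trip(t_seconds: float, speed: float = C_LIGHT) -> float:
    """Target distance in metres from a round-trip echo time."""
    return speed * t_seconds / 2.0

# A 1 microsecond radar round trip corresponds to ~150 m:
print(round(range_from_round_trip(1e-6), 1))  # → 149.9
# The same range by ultrasound would take ~0.87 s, which is why
# ultrasonic sensors are limited to very close proximity:
print(round(range_from_round_trip(0.064, C_SOUND), 1))  # → 11.0
```

The huge difference in echo times also explains the division of labor above: ultrasonic for centimetre-accurate parking ranges, radar for fast long-range measurement.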
The new method for gaining safety uses:
1. Localization: 5G high-band NR ground antennas (26 GHz in EMEA/US, 28 GHz in APAC) that work in any weather, without depending on cloud cover, and give location accuracy of <1 inch (1-2 cm).
2. Location Awareness: understanding the vehicle's surroundings using HD-map tiles together with the broadcasting capabilities of a C-V2X modem, helped by radars and ultrasonic sensors (which don't depend on weather) to avoid collisions with close objects that don't have a V2X modem (in mixed environments).
3. Safety applications: collision detection, tracking the speed and azimuth of all vehicles around the host vehicle in a ~1 km radius, autonomy, and platooning.
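As a hedged sketch of the collision-detection idea in item 3, here is a basic closest-approach check between the host vehicle and a remote vehicle whose position, speed, and heading arrive over broadcast. It works in a flat local east/north frame in metres; all numbers and thresholds are illustrative:

```python
import math

# Illustrative collision-warning check using broadcast speed/heading.
def velocity_en(speed_mps: float, heading_deg: float):
    """East/north velocity from speed and azimuth (0 deg = north)."""
    h = math.radians(heading_deg)
    return (speed_mps * math.sin(h), speed_mps * math.cos(h))

def time_to_closest_approach(p_rel, v_rel):
    """Seconds until the two vehicles are closest, given relative
    position and velocity; None if they are already separating."""
    vv = v_rel[0] ** 2 + v_rel[1] ** 2
    if vv == 0:
        return None
    t = -(p_rel[0] * v_rel[0] + p_rel[1] * v_rel[1]) / vv
    return t if t > 0 else None

# Host at origin heading north at 20 m/s; remote 100 m ahead heading
# south at 20 m/s: head-on, closing at 40 m/s.
v_host = velocity_en(20, 0)
v_remote = velocity_en(20, 180)
p_rel = (0.0, 100.0)
v_rel = (v_remote[0] - v_host[0], v_remote[1] - v_host[1])
t = time_to_closest_approach(p_rel, v_rel)
print(round(t, 2))  # → 2.5
```

With 50 ms updates and centimetre-level positions, such a check can be re-evaluated against every broadcasting vehicle in the ~1 km radius on each update.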
1. Localization:
By localization we mean calculating the most accurate positioning of the host vehicle on earth: longitude, latitude, speed in m/s, and heading (azimuth relative to the north pole). GPS systems use the WGS84 system; please read: https://en.wikipedia.org/wiki/Geographic_coordinate_system . On the accuracy of localization, please read: https://wiki.gis.com/wiki/index.php/Decimal_degrees .
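The decimal-degrees page linked above boils down to simple arithmetic: one degree of latitude is roughly 111.32 km, so each extra decimal place is worth a factor of ten in ground distance, and longitude shrinks by the cosine of the latitude. A small sketch of that arithmetic (constants approximate, for illustration only):

```python
import math

# Approximate ground size of one unit in the n-th decimal place of a
# WGS84 coordinate. ~111.32 km per degree of latitude; a degree of
# longitude shrinks with cos(latitude). Illustrative arithmetic only.
M_PER_DEG_LAT = 111_320.0

def decimal_place_metres(places: int, lat_deg: float = 0.0,
                         longitude: bool = False) -> float:
    scale = math.cos(math.radians(lat_deg)) if longitude else 1.0
    return 10 ** -places * M_PER_DEG_LAT * scale

# 6th decimal place of latitude: ~11 cm of ground distance.
print(round(decimal_place_metres(6), 3))               # → 0.111
# 6th decimal place of longitude at Berlin's latitude (~52.53 N):
print(round(decimal_place_metres(6, 52.53, True), 3))  # → 0.068
```

So resolving the 6th decimal digit places a coordinate within roughly a decimetre on the ground, which is the scale of accuracy this article is aiming at.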
5G high-band 26/28 GHz NR technology for automotive localization uses 5G ground-antenna base stations and, depending on antenna density, gives 1-2 cm (<1 inch) of accuracy in any weather.
The purpose of localization is to use the 5G high-band spectrum, in the 26/28 GHz range with up to 800 MHz of bandwidth. The distance between base stations should be <1,500 feet (450 meters), preferably 150-250 meters on roads, to bring <10 ms latency (preferably ~1 ms) and 6th-decimal-digit accuracy in WGS84 format, for example longitude 13.347624, latitude 52.530670 in Berlin, Germany, which represents the desired location accuracy. This band consumes minimal electricity and is much safer, since the waves are weak and fragile.
High-frequency 5G is installed as base stations, mostly on light poles. It works in every weather but has trouble penetrating walls and glass into houses and buildings. It can also serve homes for high-definition video conferencing, VR, and 4K/8K video games with ~1 ms latency, without any need for Wi-Fi routers at home or fiber-optic cables into homes.
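The "weak and fragile" character of the 26/28 GHz waves, and hence the dense base-station spacing, can be illustrated with the standard free-space path loss formula. This is a textbook free-space figure only; real links also lose power to walls, rain, and foliage:

```python
import math

# Free-space path loss: FSPL(dB) = 20*log10(d) + 20*log10(f) - 147.55
# with d in metres and f in Hz. Illustrative comparison of mmWave
# against a familiar 2.4 GHz signal over a 250 m pole-to-vehicle link.
def fspl_db(distance_m: float, freq_hz: float) -> float:
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

print(round(fspl_db(250, 26e9), 1))   # 26 GHz high band  → 108.7
print(round(fspl_db(250, 2.4e9), 1))  # 2.4 GHz baseline  → 88.0
```

The ~21 dB gap means the 26 GHz signal arrives over a hundred times weaker than a 2.4 GHz signal over the same 250 m, which is consistent with the preferred 150-250 m pole spacing and the poor wall penetration described above.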
The technologies for geometrical localization are triangulation and trilateration, using the time of arrival of packets or RSS (received signal strength).
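A minimal 2D trilateration sketch: given three base-station positions and their distances to the vehicle (for example, time of arrival multiplied by the speed of light), subtracting the circle equations pairwise yields a linear system for the position. Coordinates are in metres in a local frame and the anchor layout is purely illustrative:

```python
import math

# Solve 2D trilateration by linearizing three circle equations
# (x-xi)^2 + (y-yi)^2 = ri^2. Illustrative sketch, no noise handling.
def trilaterate(anchors, dists):
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = dists
    # Subtract circle 1 from circles 2 and 3 -> two linear equations.
    A = 2 * (x2 - x1); B = 2 * (y2 - y1)
    C = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    D = 2 * (x3 - x1); E = 2 * (y3 - y1)
    F = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = A * E - B * D  # zero if the anchors are collinear
    return ((C * E - B * F) / det, (A * F - C * D) / det)

# Three light-pole anchors and a vehicle at a known test position:
anchors = [(0.0, 0.0), (200.0, 0.0), (0.0, 200.0)]
target = (120.0, 80.0)
dists = [math.dist(a, target) for a in anchors]
x, y = trilaterate(anchors, dists)
print(round(x, 2), round(y, 2))  # → 120.0 80.0
```

Real deployments solve an over-determined, noisy version of this (more anchors, least squares), but the geometry is the same.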
5G NR high-band 26/28 GHz localization is the only technology that works in any weather, is not obstructed by clouds, and is already part of 5G communication. SLAM technology, which requires HD maps with lidar and sensors to integrate localization with HD-map coordinates to move the vehicle, depends on sophisticated AI and good weather, and is very far from the capabilities of 5G localization with HD-map edge computing for the purposes of safety, autonomy, and platooning.
Once we use the 5G high band for localization, the real "magic" begins: integrating localization with broadcasting capabilities and accurate HD-map lanes.
The next post will examine the automotive Location Awareness and safety-application capabilities developed by SpearEye using 5G localization and C-V2X.