Revolutionizing Airport Operations with Autonomous Technology

A New Partnership: AI + Machines and the Dawn of the Autonomous Era

The partnership between AI and machines has ushered in the autonomous era. The aviation sector offers numerous opportunities for innovation and serves as a significant proving ground for new technologies. Two notable technologies in this field are Autonomous Vehicles (AVs) and Autonomous Mobile Robots (AMRs).

We are already seeing early deployments of Autonomous Mobile Robots on the terminal side, with quite a few airports around the world testing AMR-delivered services designed to enhance the passenger experience. Autonomous passenger shuttles can transport passengers within airport premises, including between terminals, parking lots, and other facilities. These shuttles can operate continuously, providing a reliable and efficient means of transportation that further enhances the passenger experience.

Similarly, a few airports are testing autonomous vehicles on the airside for use cases such as perimeter monitoring, runway inspection, and ground support (e.g., baggage handling).

This progression reflects broader industry efforts to optimize airport operations, reduce costs, and improve passenger experiences.

Throughout this series, we will explore the transformative potential of AVs in aviation, examining their benefits and future prospects as well as the challenges that must be overcome to fully realize that potential, including regulatory hurdles, infrastructure integration, and public acceptance.


Evolution of Autonomous Technology

Before getting into further detail on the benefits, challenges, and future potential of AVs in the aviation sector, it is worth looking at the technology itself and how it has evolved over time.

An autonomous robot or vehicle can be seen as an intelligent machine capable of performing tasks and operating in an environment independently, without human control or intervention.

A truly autonomous machine should be able to perceive its environment, make decisions accordingly, and move or interact within that environment. These actions could involve starting, stopping, or navigating around obstacles, whether stationary objects or people moving through locations such as airport terminal buildings or the airside.

One thing to consider here is that autonomy should not be confused with the robots on assembly lines in an industrial plant: those machines are pre-programmed to perform a repetitive movement and cannot react to the situation around them. Imagine that a robot responsible for pouring ketchup into bottles in a bottling plant encounters a situation where the bottle does not arrive in its slot. What do you think would happen? Most likely, the machine would continue to perform its task and spill ketchup over the assembly line. A truly autonomous machine, by contrast, would know to stop when no bottle is available, as the sketch below illustrates.
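
As a minimal illustration of that difference, consider the control logic below. It is a sketch only; the bottle sensor and dispenser functions are hypothetical stand-ins, not part of any real bottling or robotics system.

```python
# Minimal sketch: pre-programmed routine vs. autonomous (sensing) routine.
# bottle_present() and pour_ketchup() are hypothetical stand-ins for a
# presence sensor and an actuator command.
import random


def bottle_present() -> bool:
    """Simulated presence sensor: True if a bottle is in the slot."""
    return random.random() > 0.2  # roughly 1 in 5 cycles the bottle is missing


def pour_ketchup() -> None:
    print("Pouring ketchup")


def pre_programmed_cycle() -> None:
    # Pours on every cycle regardless of the environment -- spills if no bottle arrived.
    pour_ketchup()


def autonomous_cycle() -> None:
    # Perceive the environment, decide, then act.
    if bottle_present():
        pour_ketchup()
    else:
        print("No bottle detected -- holding position")


if __name__ == "__main__":
    for _ in range(5):
        autonomous_cycle()
```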

In terms of autonomous vehicles, there are six levels of driving automation, ranging from Level 0 (fully manual) to Level 5 (fully autonomous), as defined by the Society of Automotive Engineers (SAE).

These levels differ in the degree of human control required to operate the vehicle, going from fully manual to full automation. At each stage, the role of technology becomes more critical as it moves from basic driver assistance to complete autonomy.



6 Levels of Autonomous Vehicles
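
For quick reference, the six SAE levels can be captured in a small lookup; the short descriptions below are paraphrased summaries of SAE J3016, not official wording.

```python
# SAE J3016 driving-automation levels, summarized as a simple lookup.
SAE_LEVELS: dict[int, str] = {
    0: "No automation -- the human driver performs all driving tasks",
    1: "Driver assistance -- a single assisted function, e.g. adaptive cruise control",
    2: "Partial automation -- combined steering and acceleration, driver still supervises",
    3: "Conditional automation -- system drives in defined conditions, driver must take over on request",
    4: "High automation -- no driver attention needed within a limited operational domain",
    5: "Full automation -- the vehicle can drive itself anywhere, under all conditions",
}

if __name__ == "__main__":
    for level, description in SAE_LEVELS.items():
        print(f"Level {level}: {description}")
```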


The Technology behind Autonomous Machines

Modern autonomous vehicles are powered by multiple technology advancements, including sensors, actuators, complex algorithms, AI/ML systems, and powerful processors capable of running such complex models and making decisions in fractions of a second.

The V2X protocol (Vehicle-to-Everything) encompasses a set of automotive connectivity and communication capabilities that sense and exchange data in order to make quick, error-free automated decisions and improve safety on the roads.

V2X can be further divided into two sub-categories:

  1. Vehicle-to-Vehicle (V2V) allows moving vehicles to exchange and share information, typically using short- and medium-range communication technologies. Applications include maneuvers such as overtaking.
  2. Vehicle-to-Infrastructure (V2I) enables connected vehicles to exchange data with surrounding infrastructure. This advancing technology has the potential to coordinate driving speeds with traffic-light signals, optimize fuel efficiency, and mitigate hazardous road conditions (see the sketch after this list).
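
To make the idea of V2X data exchange concrete, here is a minimal sketch of a V2V-style broadcast. The message fields and the broadcast function are illustrative assumptions and do not reflect any real V2X stack such as DSRC or C-V2X.

```python
# Minimal V2V-style message exchange (illustrative only).
from dataclasses import dataclass


@dataclass
class V2VMessage:
    sender_id: str
    latitude: float
    longitude: float
    speed_mps: float
    heading_deg: float
    intent: str  # e.g. "overtaking", "braking"


def broadcast(message: V2VMessage, nearby_vehicles: list[str]) -> None:
    """Pretend to send the message to every vehicle in radio range."""
    for vehicle in nearby_vehicles:
        print(f"{vehicle} received from {message.sender_id}: "
              f"{message.intent} at {message.speed_mps:.1f} m/s")


if __name__ == "__main__":
    msg = V2VMessage("shuttle-07", 51.47, -0.45, 8.3, 90.0, "overtaking")
    broadcast(msg, ["tug-12", "shuttle-03"])
```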


Source: ACI – Autonomous Vehicles and Systems at Airports

An autonomous machine (a vehicle or a robot) would use a mix of some or all of the technologies listed below (a simple sensor-fusion sketch follows the list):

  • LIDAR sensors: LIDAR stands for Light Detection and Ranging. These sensors measure distances and detect objects in real time by bouncing laser beams off the surroundings to create detailed 3D maps: a laser pulse is emitted, and objects are detected from how the reflections return to the sensor. Most modern vehicles have 4-8 LIDAR sensors.
  • Radar Sensors: Like the radar guns used to measure vehicle speed, these sensors use radio waves rather than sound waves to detect objects. Modern vehicles have 4-16 radars.
  • Vision Cameras: Modern-day vehicles can have around 10 vision cameras used to detect traffic lights, read road signs, track other vehicles, and look for pedestrians.
  • GPS: Helps to track the location, speed and provides navigation.
  • Actuators: Control acceleration, braking, and steering.
  • Laser Doppler Vectors: These use infrared beams on the same principle as LIDAR sensors. They can provide a higher-resolution image than radar sensors, which makes them helpful in detecting pedestrians and cyclists.
  • Ultrasonic Sensors: They detect objects using ultrasound rather than electromagnetic radiation. These are not widely used in vehicles but can be seen in some cases in wheels to detect curbs and other vehicles when parking.
  • Infrared Sensors: These sensors emit infrared light rather than radio waves.
  • Microphones: These devices are commonly used in voice-activated navigation devices.
  • Sophisticated Processors: These run the software that processes all of the above sensory input, plots a path, and sends instructions to the vehicle's actuators.
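
As a simple illustration of how input from several of these sensors might be combined before a decision is made, the sketch below fuses distance estimates conservatively; the readings and the stopping threshold are made up for the example.

```python
# Minimal sensor-fusion sketch: combine distance estimates from several
# sensors into a single obstacle decision. All values are illustrative.

def fuse_distances(readings_m: dict[str, float]) -> float:
    """Conservative fusion: trust the closest reported obstacle."""
    return min(readings_m.values())


def plan_action(obstacle_distance_m: float, stop_threshold_m: float = 5.0) -> str:
    return "brake" if obstacle_distance_m < stop_threshold_m else "continue"


if __name__ == "__main__":
    readings = {"lidar": 7.2, "radar": 6.9, "camera": 8.0, "ultrasonic": 6.5}
    distance = fuse_distances(readings)
    print(f"Nearest obstacle at {distance:.1f} m -> {plan_action(distance)}")
```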



To better illustrate this concept, we can draw an analogy between the hardware components of an autonomous vehicle and the parts of the human body, which enable interaction with external stimuli.

These components allow AVs to perform tasks such as perception (through sensors like LIDAR, radar, cameras, GPS, etc.), communication (via V2X technology), decision-making (through sophisticated processors, akin to the human brain), and actuation or movement (using actuators), as sketched below.
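
Putting the analogy together, a single control cycle can be expressed as a perceive-communicate-decide-act loop. The class and method names below are illustrative only, not part of any real AV software stack, and the sensor values are stubbed.

```python
# Illustrative perceive -> communicate -> decide -> act cycle.

class AutonomousVehicle:
    def perceive(self) -> dict[str, float]:
        """Eyes and ears: gather sensor readings (stubbed here)."""
        return {"obstacle_distance_m": 12.0, "speed_mps": 4.0}

    def communicate(self, state: dict[str, float]) -> None:
        """Voice: share state with nearby vehicles and infrastructure via V2X."""
        print(f"Broadcasting state: {state}")

    def decide(self, state: dict[str, float]) -> str:
        """Brain: choose an action based on the perceived state."""
        return "slow_down" if state["obstacle_distance_m"] < 15.0 else "maintain_speed"

    def act(self, action: str) -> None:
        """Muscles: send the chosen action to the actuators."""
        print(f"Actuators executing: {action}")

    def step(self) -> None:
        state = self.perceive()
        self.communicate(state)
        self.act(self.decide(state))


if __name__ == "__main__":
    AutonomousVehicle().step()
```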

The advancements in these technologies promise exciting times ahead, both for how passengers will engage with AVs and AMRs and for how airport operations, inside the terminal as well as on the airside, will be affected.

In the next article, we will explore these potential use cases in depth, along with examples of what some airports around the world have already tried.
