Distinguish Knowledge from Data when Making Decisions
In Chapter 6 of Trial, Error, and Success, we analyze an incident during Qantas flight 72 from Singapore to Perth on October 7, 2008, to illustrate the difference between data-based and knowledge-based decisions. The Airbus 330 was cruising at 37,000 feet in fine and clear weather until the flight-control computers received the following data for the angle of attack (the angle between the airflow and the aircraft’s wings) from the aircraft’s sensor units: 2.1 degrees from the first unit and 50.625 degrees from the second.[1] The computers determined the angle of attack as the average value of 26 degrees—well above the maximum safe value of around 10 degrees—and automatically initiated the maximum nose-down pitch of the aircraft. Many passengers and cabin crew hit the ceiling as the aircraft dropped 150 feet in 2 seconds. Of the 303 passengers and 12 crew, 119 people suffered injuries to their spines, necks, heads, arms, and legs.
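To make the arithmetic concrete, here is a minimal Python sketch of what fusing the two readings by simple averaging looks like. The function name and the threshold constant are illustrative assumptions based only on the numbers quoted above; this is not the actual Airbus flight-control code.

```python
# Illustrative sketch only: not the actual Airbus flight-control code.
# It shows how averaging two angle-of-attack readings hides the fact that
# one of them is a spurious spike rather than a plausible measurement.

MAX_SAFE_AOA_DEG = 10.0  # approximate maximum safe angle of attack cited above


def averaged_aoa(sensor_1_deg: float, sensor_2_deg: float) -> float:
    """Naive fusion: treat both sensor readings as equally trustworthy."""
    return (sensor_1_deg + sensor_2_deg) / 2.0


aoa = averaged_aoa(2.1, 50.625)
print(f"fused angle of attack: {aoa:.1f} degrees")  # about 26 degrees
if aoa > MAX_SAFE_AOA_DEG:
    print("automatic nose-down correction commanded")
```

The average lands at about 26 degrees, even though one reading is perfectly normal and the other is physically implausible at cruise.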
Two minutes before the pitch-down event, the flight-control computers disconnected the autopilot because they received inconsistent data about the aircraft’s altitude and speed. However, the computers did not give the pilots unfettered control of the flight. Instead, they started sending them brief but frequent over-speed and stall warnings. The captain was Kevin Sullivan, a former US Navy fighter pilot. He realized that something was wrong with the computers because the aircraft could not be flying at both maximum and minimum speeds at the same time. Although Sullivan spotted this inconsistency, the flight computers did not surrender their control, and the aircraft continued to fly according to Airbus’s normal control law. This meant that the computers allowed the crew to fly the aircraft only within a safe envelope of flight parameters. The angle of attack is one of the parameters that the computers monitor and automatically correct if it exceeds the safe envelope. Two minutes after the flight-control computers disconnected the autopilot, they automatically initiated the maximum nose-down pitch because the calculated average of 26 degrees was well above the maximum safe angle of attack. Captain Sullivan instinctively pulled back his control stick to counter the rapid descent, but the computers blocked his action for 2 seconds as the aircraft dropped 150 feet and injured a third of its occupants.
The aircraft descended an additional 540 feet over the next 20 seconds before the pilots regained control and returned it to the scheduled altitude of 37,000 feet several minutes later. As they began to respond to numerous warning messages, the aircraft dropped again. According to the analysis by the Australian Transport Safety Bureau, the calculated value of the angle of attack that triggered this second pitch-down event was 16.9 degrees.[2] Their analysis also estimated that the flight crew’s control sticks would have had no influence on the aircraft’s pitch for about 2.8 seconds. This time the aircraft descended 400 feet over 15 seconds before the flight crew was able to return it to the scheduled altitude.
Three seconds after the second pitch-down, the computers disabled the automatic angle-of-attack correction by switching from the normal to the alternate control law. The alternate control law still did not give the flight crew full control of the aircraft. For that to happen, the flight-control computers would have to decide to switch to Airbus’s direct law.
Considering the option of an emergency landing at the nearby Royal Australian Air Force base, the captain worried that he would not be able to stop a pitch-down if it happened during landing. However, continuing toward Perth with injured passengers was even riskier. Employing a strategy that Sullivan had practiced in fighter jets, the flight crew landed the Airbus 330 at the Air Force base 50 minutes after the first dive.
Sullivan’s life and career changed forever. He took eight months off, but when he returned, he could not shake the concern about another potential loss of control. A year after he decided to retire, Sullivan opened up in an interview with the Sydney Morning Herald. Among other statements, he warned against the dark side of automation:
It is easy to blame the pilots. With all this automation now, “Well, it can't be the aeroplane—it must be the pilot.” And in a lot of times it is the pilot, because they’re confused. I was certainly confused—we were all confused on that flight. It’s a caution sign on the highway of automation to say, “Hey, can you completely remove the human input?”[3]
Airbus’s safety philosophy was not to remove the pilots completely, but to liberate them from uncertainty and, therefore, to enhance safety.[4] Airbus’s management introduced this philosophy in the late 1980s, when they decided to take the lead in flight automation. Behind this safety philosophy is the assumption that the source of uncertainty lies in the human nature of pilots, whereas the automatic control of aircraft is completely predictable. To liberate pilots from the uncertainty of their actions, Airbus introduced a computer-controlled safety envelope of critical flight parameters. That way, pilots don’t need to worry about stalling the aircraft by hauling back on the control stick, because the Flight Envelope Protection System will not allow the aircraft to exceed the maximum angle of attack.
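In essence, envelope protection caps what the pilot’s input can demand. The following minimal Python sketch illustrates the idea with the roughly 10-degree limit mentioned earlier; it is only an illustration of the concept, not Airbus’s Flight Envelope Protection System.

```python
# Minimal sketch of the envelope-protection idea, using the roughly
# 10-degree limit mentioned in the text; this illustrates the concept only,
# not Airbus's Flight Envelope Protection System.

MAX_AOA_DEG = 10.0  # hypothetical protection limit


def protected_aoa(requested_aoa_deg: float) -> float:
    """Clamp the angle of attack demanded by the pilot's input to the envelope."""
    return min(requested_aoa_deg, MAX_AOA_DEG)


print(protected_aoa(14.0))  # hauling back hard still yields at most 10.0
print(protected_aoa(3.0))   # normal demands pass through unchanged
```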
Most of the time, the angle-of-attack values and all other flight parameters are correct, and the protection system works. Consequently, it has made commercial flying easier and a lot safer. This improvement in flight control has been achieved by developing better electronic sensors and by using computers that process volumes of data at speeds and with a precision unimaginable for humans. Yet Sullivan’s question remains relevant. As soon as he saw the warning messages about the aircraft stalling and over-speeding at the same time, he knew that something was wrong with the data rather than with the actual flight. However, computers do not interpret data to make sense of what they mean in reality. In this context, it is insightful to think about the decision of Airbus’s management not to allow pilots to take full control of the aircraft until the flight-control computers surrender their envelope protection by switching to the direct control law. This decision implies a much stronger belief in objective data than in humans’ ability to interpret reality.
To be fair to Airbus’s engineers and management, they did not ignore the fact that measured data are not perfect. That is why they use multiple independent sensors, computers, and software. They also develop algorithms to filter out measurement noise. A forensic analysis by the Australian Transport Safety Bureau showed that Airbus’s algorithms did filter out many spikes in the angle-of-attack data during Qantas flight 72. These spikes were rapid, short-lived changes from about 2 degrees to the false 50.625 degrees and back to the normal value of about 2 degrees, and they were limited to one of the sensor units. If the pilots had been able to see these spikes, they would undoubtedly have interpreted them all as noise. Nevertheless, two specific combinations of these spikes passed through the flight-control algorithm, causing the automatic pitch-down events.
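To illustrate how a filter can reject obvious spikes yet still be fooled by a particular combination of them, here is a minimal Python sketch of a generic spike-rejection scheme. The jump limit, hold window, and logic are invented for illustration; this is not the Airbus algorithm analyzed by the ATSB.

```python
# Minimal sketch of a generic spike-rejection filter with made-up parameters.
# It is NOT the Airbus algorithm analyzed by the ATSB; it only illustrates how
# a filter that rejects a single brief spike can still pass a particular
# combination of spikes.

JUMP_LIMIT_DEG = 5.0  # hypothetical: treat larger jumps as suspicious
HOLD_SAMPLES = 3      # hypothetical: distrust a jump for at most this many samples


def filter_aoa(samples):
    """Replace suspicious jumps with the last accepted value for a limited
    number of samples; after that, the new value is accepted as genuine."""
    filtered = []
    last_good = samples[0]
    held = 0
    for value in samples:
        if abs(value - last_good) > JUMP_LIMIT_DEG and held < HOLD_SAMPLES:
            filtered.append(last_good)  # treat the jump as noise
            held += 1
        else:
            last_good = value           # accept the sample
            held = 0
            filtered.append(value)
    return filtered


# A single short spike is filtered out ...
print(filter_aoa([2.1, 50.625, 2.0, 2.1]))
# ... but spikes that outlast the hold window slip through and get accepted.
print(filter_aoa([2.1, 50.625, 50.625, 50.625, 50.625, 2.0]))
```

The point of the sketch is only that any fixed filtering rule has blind spots: some timing of false values will pass through as if it were genuine.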
Airbus’s decision not to allow the pilots to take full control of the aircraft whenever they decide to do so has been hotly debated ever since the company presented its safety philosophy. An article in Science magazine quotes the following statement by a senior test pilot:
Nothing must have the authority to forbid the pilot to take the actions he needs to. The problem with giving away that authority to a computer is that “a computer is totally fearless—it doesn't know that it's about to hit something.”[5]
Our managers don’t like to see us making emotional decisions, so the lack of fear and other emotions may imply that computers can manage critical situations much better. However, as the quoted pilot observed, the lack of fear stems from a lack of knowledge, which is different from an abundance of data. Electronic sensors can collect more diverse data, and computers can process those data faster than humans can, but an abundance of data cannot replace meaningful knowledge. When the flight-control computers on Qantas’s Airbus initiated the rapid nose-down pitch, the pilots felt the upward pressure and saw the blue Indian Ocean in front of them. This was sufficient information to know that the airplane was falling. Aircraft sensors could collect even more detailed and precise data about the pressure and the view in front of the aircraft, in addition to many more data about the aircraft’s situation; however, the flight computers did not have the knowledge to link those data into awareness that the aircraft was heading for the Indian Ocean. The flight computers processed the data to send the over-speed and stall warnings, but they did not have the pilot’s knowledge to link these messages into the interpretation that something was wrong with the data rather than with the actual flight. The computers did not know that the rapid pitch-down injured many passengers. Computers do not know that it may be necessary to break through the safe flight envelope to save passengers, even if that action damages the aircraft.
In an apparent failure to learn from the smaller errors of others, Boeing introduced a similar automatic correction of the angle of attack on the fourth generation of the most popular aircraft of all time, which it called the 737 Max. According to Boeing, the aim was to “deliver improved fuel efficiency and lowest operating costs in single-aisle market.”[6] To achieve this aim, Boeing outfitted the 737 Max 8 with bigger and more fuel-efficient engines, but the positioning of those engines increased the potential for the nose to pitch up after take-off.[7] To counteract this risk, Boeing introduced an automated system to push the nose down when its software determines that the sensors are measuring a higher-than-acceptable angle of attack. With this system active, the pilots could no longer take manual control of the plane by simply pulling back on the control stick, as was possible on the older 737s.
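The general design question here can be sketched in a few lines of Python. This is not Boeing’s implementation; the threshold, names, and logic are invented only to contrast automation that ignores the pilot’s input with automation that leaves the final authority with the pilot.

```python
# Deliberately simplified sketch of the design question raised above.
# This is NOT Boeing's implementation; the threshold, names, and logic are
# invented to contrast automation that ignores the pilot with automation
# that leaves the final authority with the pilot.

MAX_AOA_DEG = 10.0  # hypothetical acceptable angle of attack


def nose_down_command(aoa_sensor_deg: float, pilot_pulling_back: bool) -> bool:
    """Automation that acts on a sensor reading and ignores the pilot's input."""
    return aoa_sensor_deg > MAX_AOA_DEG


def nose_down_with_pilot_authority(aoa_sensor_deg: float,
                                   pilot_pulling_back: bool) -> bool:
    """Same trigger, but the pilot's sustained input has the final say."""
    if pilot_pulling_back:
        return False
    return aoa_sensor_deg > MAX_AOA_DEG


# A faulty sensor keeps commanding nose-down in the first design no matter
# what the pilot does; the second design yields to the pilot.
print(nose_down_command(50.625, pilot_pulling_back=True))               # True
print(nose_down_with_pilot_authority(50.625, pilot_pulling_back=True))  # False
```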
Seventeen months after the Boeing 737 Max 8 entered service in 2017, it had its first fatal accident, killing all 189 people on board. On October 29, 2018, Lion Air flight JT 610 departed from Jakarta and crashed into the Java Sea 12 minutes later. Less than five months later, on March 10, 2019, Ethiopian Airlines flight ET 302 crashed six minutes after its departure from Addis Ababa, killing all 157 people on board.
The managers at Boeing had ignored the common-sense point that automatic systems must not be programmed to fight against pilots’ actions. This specific knowledge doesn’t require software expertise, but it is sufficient to conclude that computers and automation are most useful when they work together with people, and under the ultimate authority of the people they are serving. The role of data processed by computers is not to replace but to enhance human knowledge. For example, pilots need data from sensors and computers to know their spatial position because, most of the time, they cannot recognize the landscape and the oceans that they see through their windows. Think about the flight-information maps that make not only the pilots but also all passengers aware of the aircraft’s position.
Knowledge is different from data, and we need both knowledge and data for awareness of the surrounding environment.
Subscribe through Substack if you wish to receive the whole chapter as a PDF file (for free).
Here are the titles of the remaining sections in Chapter 6 of Trial, Error, and Success:
· Keep in mind that learning can change the brain “software” but not the “hardware.”
· Don’t forget that the fundamental learning method is by personal trials, errors, and successes.
· Be smart to learn from failures and successes of others, but make sure to adapt the knowledge to your situation.
· Enhance your knowledge to improve your consciousness.
· Communicate for enhanced consciousness.
· Sometimes you have to act on doubt.
[1] In-flight upset 154 km west of Learmonth, WA; 7 October 2008; VH-QPA; Airbus 330-303 (Australian Transport Safety Bureau, 2011), p. 40.
[2] Ibid., p. 80.
[3] Matt O’Sullivan, “The untold story of QF72: What happens when ‘psycho’ automation leaves pilots powerless?”, Sydney Morning Herald, 12 May 2017; https://www.smh.com.au/lifestyle/the-untold-story-of-qf72-what-happens-when-psycho-automation-leaves-pilots-powerless-20170511-gw26ae.html
[4] M. Mitchell Waldrop, “Flying the Electric Skies,” Science, vol. 244, pp. 1532–1534, 1989.
[5] Ibid.
[6] “Boeing Launches 737 New Engine Family with Commitments for 496 Airplanes from Five Airlines,” Boeing media release, 30 August 2011; https://boeing.mediaroom.com/2011-08-30-Boeing-Launches-737-New-Engine-Family-with-Commitments-for-496-Airplanes-from-Five-Airlines