Another Deadly Boeing Design Error
Sten Vesterli
I help business and IT leaders chart a safe course through the minefields of technology.
It turns out that the defective thinking behind the two deadly 737 MAX 8 crashes started killing people more than a decade ago. The Dutch Safety Board commissioned a report on the human factors involved in the crash of Turkish Airlines Flight 1951 back in 2009, but at the insistence of Boeing and the FAA, that report's conclusions never made it into the official crash report.
On the doomed flight, the crew knew that the left-hand radar altimeter was defective, so they had selected the right-hand Flight Control Computer (FCC) during landing. The instruction manual and pilot training say:
- One of the FCCs is specified as the master FCC
- Each FCC continues to calculate thrust, pitch and roll commands
- The autothrottle adjusts the thrust levers with commands from the FCC
- Two independent radio altimeters provide radio altitude to the respective FCCs
This sounds like a well-designed symmetrical system that will fly equally well on either FCC.
However, unbeknownst to the pilots, the autothrottle takes its input from the left FCC only, regardless of which FCC is selected as master. That input came from the defective left altimeter, whose reading suddenly jumped from 1,950 feet to minus 8 feet. So the autothrottle concluded the plane was landing and pulled the thrust levers back to idle. But the autopilot, running on the right FCC, had correct data and kept trying to fly the aircraft to the airport. By the time the pilots realized the problem, they didn't have time to correct it, and the plane crashed.
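To make the failure mode concrete, here is a minimal sketch in Python of the asymmetry described above. Everything in it is illustrative: the class names, the 27-foot retard threshold, and the simplified logic are assumptions for the sake of the example, not Boeing's actual implementation.

```python
# Hypothetical sketch of the hidden asymmetry: the crew selects the right
# FCC as master, but the autothrottle reads altitude from the left FCC only.

RETARD_ALTITUDE_FT = 27  # illustrative threshold for the retard-to-idle logic


class FlightControlComputer:
    def __init__(self, name, radio_altitude_ft):
        self.name = name
        self.radio_altitude_ft = radio_altitude_ft  # from its own radio altimeter


class Autothrottle:
    """Hard-wired to the left FCC, no matter which FCC is master."""

    def __init__(self, left_fcc):
        self.left_fcc = left_fcc  # the hidden, undocumented dependency

    def command(self):
        if self.left_fcc.radio_altitude_ft < RETARD_ALTITUDE_FT:
            return "RETARD: thrust levers to idle"  # 'thinks' the plane is flaring
        return "maintain approach thrust"


# The defective left altimeter suddenly reads -8 ft at 1,950 ft actual altitude.
left_fcc = FlightControlComputer("left", radio_altitude_ft=-8)
right_fcc = FlightControlComputer("right", radio_altitude_ft=1950)

master = right_fcc                     # the crew's workaround: select the right FCC
autothrottle = Autothrottle(left_fcc)  # but the input source ignores the master

print(autothrottle.command())  # -> RETARD: thrust levers to idle
```

The point of the sketch is the hard-wired dependency in the constructor: selecting the right FCC as master changes nothing about where the autothrottle gets its altitude, so the pilots' sensible workaround could not work.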
The parallels to the MAX 8 crashes are chilling: the computer gets defective data, the pilots don't know how the computer works, they can't intervene correctly, and people die.
A techno-arrogance similar to Boeing's is apparent in the Artificial Intelligence field. Technologists happily promote solutions that have taught themselves, even though no human has any clue how they work. Users are pushing back against opaque systems that simply hand down an inexplicable answer. Don't deploy systems whose decisions neither you nor your users can explain.
This post originally appeared in the Technology That Fits newsletter. Don't miss the next one; sign up.
Business Strategy Consultant | Independent ERP Expert | Supply Chain Specialist | Advisor | Author | Speaker | Business Commentator
4 years ago: Sten, I particularly like this paragraph: "A techno-arrogance similar to Boeing's is apparent in the Artificial Intelligence field. Technologists will happily promote solutions that have taught themselves, even though no human has any clue about how they work." This is becoming extraordinarily relevant in the #ERP space at the moment. It is now even more important that clients receive sufficient knowledge transfer to be self-sufficient in this area. They need to understand how this works.
Senior Oracle DBA
4 years ago: It wasn't the design that killed people. It was the decision not to train pilots properly. Save money, lose lives.
Staff Engineer Software at Peraton
4 years ago: Don't they teach Garbage In, Garbage Out anymore?
Research Associate at Naval Postgraduate School
4 years ago: It would be better if this were a publicly owned entity; there would be more accountability.
IT nerd at heart
4 years ago: Another lesson might be that software can't beat physics ...