Another Deadly Boeing Design Error
By Radio Nederland Wereldomroep / Fred Vloo - originally posted to Flickr as Crash Turkish Airlines TK 1951, CC BY 2.0


It turns out that the defective thinking behind the two deadly 737 MAX 8 crashes started killing people more than a decade ago. The Dutch Safety Board commissioned a report on the human factors involved in the crash of Turkish Airlines Flight 1951 back in 2009, but at the insistence of Boeing and the FAA, that report's conclusions never made it into the official crash report. 

On the doomed flight, the crew knew that the left-hand radar altimeter was defective, so they had selected the right-hand Flight Control Computer (FCC) during landing. The instruction manual and pilot training say:

  • One of the FCCs is specified as the master FCC
  • Each FCC continues to calculate thrust, pitch and roll commands
  • The autothrottle adjusts the thrust levers with commands from the FCC
  • Two independent radio altimeters provide radio altitude to the respective FCCs

This sounds like a well-designed symmetrical system that will fly equally well on either FCC.

However, unbeknownst to the pilots, the autothrottle takes its radio-altitude input only from the left FCC. That input came from the defective left altimeter and suddenly jumped from 1,950 feet to minus 8 feet. So the autothrottle concluded the plane was landing and pulled the thrust levers back to idle. But the autopilot, flying on the right FCC, had correct data and kept trying to fly the aircraft to the airport. By the time the pilots realized the problem, they didn't have time to correct it. And the plane crashed.
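To make the asymmetry concrete, here is a minimal sketch of that data flow in Python. The class and function names are hypothetical, and the roughly 27-foot "retard" threshold is an assumption for illustration only; the point is simply that the autothrottle reads the left channel no matter which FCC the crew selects.

# Minimal sketch (hypothetical names) of the asymmetric data flow described above.
from dataclasses import dataclass

@dataclass
class FlightControlComputer:
    radio_altitude_ft: float  # fed by this side's radio altimeter

def autothrottle_command(left_fcc: FlightControlComputer) -> str:
    # The autothrottle reads radio altitude only from the LEFT FCC,
    # regardless of which FCC the crew has selected.
    # Below roughly 27 ft (assumed threshold) it enters "retard" mode and idles the thrust.
    return "RETARD: thrust to idle" if left_fcc.radio_altitude_ft < 27 else "maintain approach thrust"

def autopilot_command(selected_fcc: FlightControlComputer) -> str:
    # The autopilot flies on whichever FCC the crew selected.
    return "pitch up to hold the glideslope" if selected_fcc.radio_altitude_ft > 27 else "flare"

# Flight 1951 scenario: the left altimeter suddenly reads minus 8 ft,
# the right one correctly reads about 1,950 ft, and the crew has selected the right FCC.
left = FlightControlComputer(radio_altitude_ft=-8.0)     # defective
right = FlightControlComputer(radio_altitude_ft=1950.0)  # correct, selected by the crew

print(autothrottle_command(left))   # RETARD: thrust to idle
print(autopilot_command(right))     # pitch up to hold the glideslope
# Thrust at idle while the nose keeps coming up: airspeed bleeds away until the aircraft can no longer fly.

Selecting the right FCC changed what the autopilot saw, but not what the autothrottle saw, so a single failed altimeter was still enough to idle the engines.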

There are some chilling parallels: The computer gets defective data, the pilots don't know how the computer works, they can't intervene correctly, and people die.

A techno-arrogance similar to Boeing's is apparent in the Artificial Intelligence field. Technologists will happily promote solutions that have taught themselves, even though no human has any clue about how they work. Users are pushing back against opaque systems that simply provide an inexplicable answer. Don't try to implement unexplainable systems that your users don't understand.


This post originally appeared in the Technology That Fits newsletter. Don't miss the next one; sign up.

David Ogilvie

Business Strategy Consultant | Independent ERP Expert | Supply Chain Specialist | Advisor | Author | Speaker | Business Commentator

4y

Sten, I particularly like this paragraph; “A techno-arrogance similar to Boeing's is apparent in the Artificial Intelligence field. Technologists will happily promote solutions that have taught themselves, even though no human has any clue about how they work.” This is becoming extraordinarily relevant in the #ERP space at the moment. It is even more important now that clients have sufficient knowledge transfer to be self-sufficient in this area. They need to understand how this works.

Janne Isomäki

Senior Oracle DBA

4y

It wasn't the design that killed people. It was the decision not to train pilots properly. Save money, lose lives.

John Flack

Staff Engineer Software at Peraton

4y

Don't they teach Garbage In Garbage Out anymore?

Arijit Das

Research Associate at Naval Postgraduate School

4y

It would be better if this were a publicly owned entity; there would be more accountability.


Another lesson might also be that software can't beat physics ...

