Your Engineering Team Needs to Learn This from the 737 MAX

Today, plenty of people are placing blame on Boeing for fatal flaws in the 737 MAX. However, there is something bigger to blame: the nascent field of engineering Cyber-Physical Systems (CPS). In CPS it's easy to overlook how critical it is that the physical part (people and machines) and the cyber part (computers and software) have the same understanding of what is going on. Technically speaking, both parts need a common operating picture and a common set of state variables. Boeing may have made a tremendously deadly mistake, but if it hadn't been them, it would soon have been somebody else. Autonomous cars already have cases of autopilots failing in deadly ways. While a car that fails doesn't kill hundreds of people in a single accident, there are other massively connected systems, such as smart electric grids, that are moving from the automated realm to true CPS, and there are known scenarios that can lead to mass casualties.[1]

What exactly is a CPS? The average person hears about robots, artificial intelligence, and self-driving cars quite often. All of these are related to CPS, a term that techies have been using for years to describe things that tightly couple computers with the physical world. A robot that you can talk to and that helps you move a couch is a great example of a CPS. This robot communicates with you and physically interacts with you just like a human friend. It can also drop a couch on your foot if you miscommunicate: "Rotate it left!"—"My left or your left?"

CPS are very different from an automated machine like a copier or a computer. A copier may have many complicated settings and advanced features, but you don't put your hands inside it to help it move paper once you push go. A computer may have a GUI with many layers that controls an airline's scheduling app, but most of the computer code was previously written based on highly constrained parameters. If there is an airspace bottleneck that causes delays, the scheduler app conveniently emails stand-by crews without the need for further GUI commands. It doesn't use its artificial intelligence engine to game the NextGen air traffic control system into giving it more resources, yet.[2]

Because of the high level of cooperative tasks and decisions in CPS, shared cognitive awareness between the cyber and physical parts is critical. Maybe you'd put your hands into a running copier if the copier said, "Hey, I can pause real quick after 10 sheets. If you reach in and push my pick-roller back 10 degrees every time I stop, we can keep printing while waiting on maintenance." Similarly, if the airline scheduling app asked the GUI operator whether it was OK to open a two-way machine-language chat with the FAA's emergency-only computer, maybe other airlines wouldn't sue you for tricking the FAA into grounding all flights but yours.

To drive home how important it is for CPS design to have a common awareness of both cyber and physical states, let's imagine a different outcome for some recent 737 MAX flights.

(Smarter) Maneuvering Characteristics Augmentation System says to the pilots: “Guys, I see you’ve deployed your manual trim wheels. That is a last-resort control technique. Looks to me like we need some routine pitch down, but maybe I have some bad situational awareness, so I’m going to disconnect for a bit until we can sort this out.”  
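The yield-to-the-human behavior imagined above can be sketched in a few lines of code. This is purely illustrative, not how any real flight control system is written: the state names, the `FlightState` record, and the disengage rule are all hypothetical stand-ins for the idea that the cyber side should see the same state variables the pilots do and back off when the physical side signals a last-resort action.

```python
from dataclasses import dataclass

@dataclass
class FlightState:
    """Hypothetical shared state, visible to both pilots and automation."""
    manual_trim_deployed: bool   # physical side has taken a last-resort action
    sensor_disagree: bool        # cyber side's own picture may be bad
    augmentation_active: bool    # cyber side currently has pitch authority

def augmentation_should_yield(state: FlightState) -> bool:
    """A 'smarter' augmentation system disengages when the humans signal
    a last-resort technique, or when its own inputs are suspect."""
    return state.manual_trim_deployed or state.sensor_disagree

# The pilots deploy the manual trim wheels; the cyber side sees that in the
# shared state and disconnects instead of fighting them for control.
state = FlightState(manual_trim_deployed=True,
                    sensor_disagree=False,
                    augmentation_active=True)
if augmentation_should_yield(state):
    state.augmentation_active = False  # "I'm going to disconnect for a bit"
```

The point is not the logic itself, which is trivial, but that the disengage decision reads from a state record both parties can see, rather than from the automation's private view of the world.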

Human factors experts have warned for years about the increasing workload put on people as the machines they interact with become more complex, especially in aircraft. However, warnings are useless without a course of action. If you are designing a CPS, what do you do? Consider what state the whole system (cyber and physical) is in, what it can do next, and how that awareness is distributed. We’ve all moved a couch at some point and had our friend say, “Let’s put it down for a second.” Maybe someone is tired or frustrated. Maybe you need to measure a doorway. Regardless, whether it’s you or your robot friend who says stop, the cyber part and the physical part need to be on the same page so the couch isn’t dropped on someone’s foot.  
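The couch-moving advice above can also be made concrete. The sketch below, with entirely hypothetical names, shows one way to keep distributed awareness consistent: every participant, human or robot, holds the same state variable, and any state change is propagated to all of them before anyone acts on it.

```python
from enum import Enum, auto

class CarryState(Enum):
    """Hypothetical shared states for the couch-moving task."""
    DOWN = auto()
    LIFTING = auto()
    MOVING = auto()
    PAUSED = auto()   # either party may request a pause

class Carrier:
    """One participant in the task: the person or the robot friend."""
    def __init__(self, name: str):
        self.name = name
        self.state = CarryState.DOWN

def set_shared_state(carriers: list, new_state: CarryState) -> None:
    """Propagate a single state change to every participant, so the cyber
    and physical parts never act on different pictures of the world."""
    for c in carriers:
        c.state = new_state

human, robot = Carrier("human"), Carrier("robot")
team = [human, robot]

set_shared_state(team, CarryState.MOVING)
# Someone says "Let's put it down for a second" -- everyone pauses together.
set_shared_state(team, CarryState.PAUSED)

assert human.state is robot.state  # same page, so no couch on anyone's foot
```

A real CPS would need timeouts, acknowledgments, and a plan for lost messages, but the design choice is the same: state changes go through a shared channel, never through one party's private assumption.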


[1] Imagine a smart grid in a skyscraper that fails in a way that both causes and accelerates a fire.

[2] A very interesting example of a computer solving an ill-posed problem by doing what people would consider cheating. See the article from Wired and the paper presented at the Neural Information Processing Systems conference in 2017.



Peter Chiou

Adopted son of the American revolution and retired USAF Colonel. Lover of classical liberal society and transformative technology leader. But in all, a sojourner on earth. Looking forward to my heavenly home.

5y

Good article on evolving our thinking on CPS.

Mike McLendon

Associate Director, Interagency Acquisition at Software Engineering Institute

5y

Great points! In this age of silver-bullet solutions like Agile and DevOps, we need to remember it is about the engineering of complex CPSs.

Max Hinkley

Better Firmware, Faster | Safety-Critical Systems | Secure Coding | Lateral-thinking Integrator | Consultant

5y

Spot on, Michael. I think you've done an excellent job of conveying some of the important issues of CPS. Your final paragraph makes a really important point: how critical a CHANGE in load is. When systems are taking most of the work, the natural human response is complacency and lack of attention. When there is a sudden shift, we may not be able to make the mental shift to assume control as quickly as we need to. I think this is the riskiest aspect of so-called "driver assist" systems. I think human override must always be available and absolute, but I also think that the system must be able to safely handle ANY changing environmental condition without requiring human intervention. Technology is improving at amazing rates, but I think we've still got some major hurdles to overcome before we get there. Great article!