The paradox of aviation safety
John Downer’s book could have been titled “Why don’t planes fall from the sky every day?”, because in it he attempts to resolve a paradox that he sets out in the early chapters. Downer argues, with precise reasoning and well-researched case studies, that it should be impossible, from an epistemological viewpoint, to achieve the near-perfect reliability that commercial aviation achieves. In the subsequent chapters, he explains how this seemingly impossible reliability has nevertheless been achieved. Downer argues for epistemic humility or, put simply, for caution about what we think we can predict, understand and control. This is a refreshing counterpoint and ‘reality check’ on the limitations of metrics, Big Data and algorithmically driven technology. It is essential that we recognise what we do not know, and probably cannot know or understand.
Theoretically Impossible
If, like me, you are sceptical of claims about our ability to foresee every risk, understand each scenario and objectively manage risk within complexity, then you will enjoy the first seven chapters of the book. Downer dissects many of the arguments made about our ability to objectively assess, understand and control risk within complex systems. Using examples such as Fukushima and well-known airline crashes, he demonstrates that the high-risk technologies we rely upon are much messier, more complex and more inherently vulnerable than we often assume. There are simply too many scenarios and permutations of failure to definitively assess, foresee or, dare I say, control these risks. There are epistemological barriers that prevent us from understanding how complex systems will perform, and the ways in which they may fail.
We must respect our ignorance, recognise what we don’t know, and accept what we cannot foresee. In a complex system, we simply cannot understand how the system will interact and respond by decomposing and analysing its individual parts. The book is a wonderful critique of the illusions of objectivity and logic. Bit by bit, Downer dismantles the empirical foundations on which advanced risk calculations are made, demonstrating that they are social constructions built on claimed safety factors that are often no more than "fatuous nonsense." But within this critique lies a paradox: how has commercial aviation achieved levels of reliability and safety that demonstrate this is, evidently, possible?
Explaining the (Im)Possible
In chapters 8 to 10, Downer addresses the paradox: if our ability to model and understand risk and reliability in complex systems is so flawed, why don’t planes fall from the skies every day? This is where Downer’s understanding, analysis and insight come into their own. Comparing nuclear power with commercial aviation, he highlights three differences that explain the advantage jetliners have over commercial nuclear power.
1. Scale and Standardisation
There are circa 400 civil nuclear reactors currently operating. Few of these plants are identical, so each is an individual complex system with its own design, operation and risks. Comparing the service history or incidents of a water-cooled reactor with a graphite-moderated reactor is difficult because the technologies are fundamentally different. The lack of scale, alongside the lack of standardisation, makes learning and transferring technological knowledge between nuclear power plants highly problematic.
Now compare the 400 active nuclear reactors with the circa 30,000 commercial aircraft in operation today, and we can see that scale enables learning. Airframes and engines are also heavily standardised, so transferring learning from in-service observations and accidents becomes easier. Here, then, is the argument for standardisation as a route to better safety outcomes.
2. Recursive Practice
Recursive practice refers to the engineering practice of leveraging insights from service experience to incrementally refine designs and the understanding of components and their interactions within the system. This is “learning by doing,” “learning by trying” and “learning by using.” Or, put simply, the learning curve. It involves gaining feedback from everyday operations, acting on pilot reports and conducting in-depth incident investigations, so that the 'system' is continuously refined and improved. This is Deming’s PDSA writ large (note: not the corrupted PDCA which ISO adopted, where Deming's 'Study' was replaced with 'Check'). Recursive practice is summarised in this quote from Wilbur Wright:
If you really wish to learn, you must mount a machine and become acquainted with its tricks by actual trial.
3. Restraining Innovation
For me, this third element was the most interesting and provides a useful comparison with other high-hazard sectors that operate catastrophic technologies. Downer explains that the aviation sector could be more innovative and advance new technologies much faster, but innovation, as the 737 MAX demonstrated, entails risk. Hence the sector is highly conservative, only adopting technology once it has been proven in military aviation. Design stability, using existing proven technology and cautiously adopting the new, underpins the safety of commercial aviation.
This raises a question about mindsets, risk culture and accountability. The sector, Downer argues, has seen how accidents kill not just passengers and crew, but the companies themselves. Look to the annals of aviation history: the two Comet crashes that killed de Havilland, or the loss of the DC-10 that ended McDonnell Douglas. And this gets to the interesting punchline: it’s not regulation that ultimately ensures plane makers act prudently, but self-regulation driven by self-interest. It’s survival of the safest.
Now, if at this point you’re screaming “Boeing!”, worry not: the author unpacks what has gone wrong within that organisation in chapter 11, a fascinating exploration that opens with this great quote from James Fraze:
When it comes to profit over safety, profit usually wins.
A Rational Accident?
And finally, to the as-yet unanswered question: what is a rational accident? A rational accident is one that challenges accepted engineering understandings and theories about the world. Downer defines them as "accidents that occur because a technological understanding proves to be unsound, even though there were rational reasons to hold that understanding before (although not after) the event." Unlike Perrow's Normal Accidents, which arise at the system level from the indeterminacies of interactions between elements, rational accidents can arise from uncertainties embedded in a single element. They are inherently unpreventable and unpredictable.
An investigation into the cause of a rational accident would find unproven assumptions underpinning the flawed system’s design.
Summary
Ultimately, this book is a rallying cry for caution and for recognising what we don't know and don't understand. It calls for epistemic humility: we must respect our ignorance.
This was an enjoyable read and I learnt a lot. There is much in this book that I have not covered here, including some thought-provoking discussion of the role of regulation in complex systems and of regulatory capture. I'd recommend it to anyone interested in disaster management, risk studies or safety science. Even if you are not a 'safety nerd' like me, it’s an interesting, enjoyable and informative book.
I hope you found my summary useful. As always, I learn a lot from other people's views on the topic or the book, so please share your thoughts and reflections.
Finally, I apologise to the author if I have misunderstood or misinterpreted his thinking.
Links and further information
Purchase ‘Rational Accidents: Reckoning with Catastrophic Technologies’ here: https://www.penguin.com.au/books/rational-accidents-9780262546997 . You can also access a free downloadable PDF edition of the book here: https://direct.mit.edu/books/oa-monograph/5714/Rational-AccidentsReckoning-with-Catastrophic . I'm old school and prefer a print copy, which I routinely deface and all but destroy while reading.
And here is an article by the author discussing the themes covered in his book: https://thereader.mitpress.mit.edu/what-boeings-door-plug-debacle-says-about-the-future-of-aviation-safety/
Watch Dr John Downer compare safety in civil nuclear power and civil aviation, and discuss what can be learnt from both, in this National Academies of Sciences, Engineering, and Medicine workshop: https://www.youtube.com/watch?v=I1C-pzSk8GM