Black Box Thinking - Summary Notes
Historically, healthcare has had a relatively poor record of learning from failure compared with aviation. Failure carries a stigma, in part out of fear of legal repercussions and damage to the reputations of hospitals and individuals. As a result, the reasons for failure are often left unexamined and hidden away, at great cost.
Organisational hierarchies within medicine have historically led to unnecessary failure, especially in high-stakes situations, as less senior medical staff tend not to challenge authority. Experience has also shown that in such situations we can lose track of time, believing it is moving more slowly than it really is; when a life hangs in the balance, a few minutes can make the difference between life and death.
The relative safety of air travel is due in part to technology and operational systems that constantly evolve in response to lessons learned through failure. Independent bodies investigate every accident and share the lessons across the industry, and because the evidence gathered is inadmissible in court proceedings, cooperation and disclosure are far greater. Anyone who reports a near miss within ten days of an incident is granted immunity, creating a further opportunity to learn from what might otherwise become future accidents.
Success is built upon learning from failure. It requires both the creation and continual improvement of systems based on feedback, and a culture that embraces and shares the lessons failure teaches. Delays in feedback hamper learning: without the ability to close the loop and compare the desired outcome with the actual outcome, learning will be slow or may not happen at all. Organisations need to stop stigmatising failure and instead treat it as a necessary source of learning.
Cognitive Dissonance - The state of having inconsistent thoughts, beliefs or attitudes, especially as relating to behavioural decisions and attitude change.
Even when faced with overwhelming new evidence, people and organisations frequently do not want to admit they were wrong. Doing so would affirm their fallibility.
Much cognitive dissonance can be explained by an individual's desire to protect their self-esteem rather than admit they were wrong. It is especially common among senior business leaders and public figures: those who have the most to lose are the least likely to admit mistakes, even when faced with an overwhelming body of evidence and at great personal cost.
Confirmation bias is another intellectual contortion that impedes learning. The smart approach is to develop a hypothesis based on prior evidence and then test it by actively seeking examples that would disprove it, not just by accumulating more evidence in its favour.
Untested hypotheses or processes that are held to be infallible because they are protected by religious, ideological or long-held beliefs can be highly dangerous. Examples include science conducted under the influence of Marxist ideology, which damaged crop yields, reduced scientific diversity and produced flawed teaching practices.
Within the criminal justice system, latent issues were only discovered by investigating procedures that were believed to be without fault. Examples include the malleable nature of memory, which allowed false memories to be imprinted during investigations, and the susceptibility of judges to grant or deny parole depending on when they last ate.
Evolutionary design, with rapid iteration and the propagation of variations (mutations) of the most successful candidates, can often beat purely intelligent design.
Fusing top-down intelligent design with bottom-up evolutionary design helps initialise the evolutionary process and can often result in faster convergence, as sketched below.
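As a loose illustration of this idea (mine, not the book's), the Python sketch below optimises a toy objective by seeding an otherwise random population with one hand-designed candidate and then repeatedly mutating the fittest survivors. All names and numbers here are hypothetical.

    import random

    def fitness(x):
        # Toy objective with a single peak at x = 3; higher is better.
        return -(x - 3) ** 2

    def evolve(designed_seed, generations=50, pop_size=20, sigma=0.5):
        # Top-down step: seed the population with the hand-designed candidate.
        population = [designed_seed] + [random.uniform(-10, 10) for _ in range(pop_size - 1)]
        for _ in range(generations):
            # Bottom-up step: keep the fittest half, propagate mutated copies.
            population.sort(key=fitness, reverse=True)
            survivors = population[:pop_size // 2]
            mutants = [s + random.gauss(0, sigma) for s in survivors]
            population = survivors + mutants
        return max(population, key=fitness)

    print(round(evolve(designed_seed=0.0), 2))  # typically prints a value close to 3

The designed seed gives the search a head start, while random mutation supplies variations no designer anticipated; together they tend to converge faster than either approach alone.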
Rigorous testing requires a Randomised Controlled Trial (RCT), with a control group providing the counterfactual. Results can be misleading in the absence of a control group, especially when considering the long-term implications of an intervention. In some cases an intervention can actually make a situation worse, as with the juvenile correction programme discussed in the chapter "Scared Straight".
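To make the counterfactual concrete, here is a minimal simulation (my illustration, not the book's; the participant count and effect size are invented):

    import random

    random.seed(1)

    # Hypothetical trial: 1,000 participants randomly assigned to two arms.
    treatment, control = [], []
    for _ in range(1000):
        arm = treatment if random.random() < 0.5 else control
        baseline = random.gauss(50, 10)         # outcome with no intervention
        effect = 5 if arm is treatment else 0   # invented true treatment effect
        arm.append(baseline + effect)

    def mean(xs):
        return sum(xs) / len(xs)

    # The control arm supplies the counterfactual: without it, the treatment
    # mean alone cannot tell us whether the intervention helped or harmed.
    print(f"treatment mean: {mean(treatment):.1f}")
    print(f"control mean:   {mean(control):.1f}")
    print(f"estimated effect: {mean(treatment) - mean(control):.1f}")

Randomisation is what makes the comparison fair: it spreads every other difference between participants evenly across the two arms, so the gap between the means isolates the intervention's effect.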
Marginal gains is about breaking a large problem down into smaller components in order to establish what works and what doesn't. The cumulative effect of many small gains across the parts of a system can lead to a significant overall improvement.
Breaking a large problem down into smaller components also allows us to create the next best thing to a strict RCT when one is not possible. However, incremental evolution leads to plateaus that can only be overcome by large leaps or a revolution in the way something is done. The compounding arithmetic behind marginal gains is illustrated below.
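A rough arithmetic illustration of that compounding (my numbers, not Syed's): if each of 100 components of a system is improved by 1% and the gains multiply, the overall effect is far larger than any single gain suggests.

    # If each of 100 small components is improved by 1% and the gains
    # compound multiplicatively, the overall improvement is:
    print(f"{1.01 ** 100:.2f}x")  # ~2.70x overall from many tiny 1% gains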
Failure is a necessary part of the creative process. It cannot be avoided, so be wrong early and often, and be disciplined in your execution. Creative insight comes from observation, frustration, getting to the root cause, synthesising ideas and disciplined experimentation. The creative process involves challenging assumptions and seeing new associations.
Oversimplifying complex failures often leads to the misattribution of blame. This fosters a culture of avoiding failure, which in turn leads to a failure to learn and improve. Failing to develop a just culture with accountability only drives problems underground or breeds apathy.
Unjust blame can cause significant harm. It can worsen an environment and increase the likelihood of future incidents, cause disengagement, drive people out of a profession whose risks they now perceive as too great, or simply leave fewer resources to provide good service or care. In some cases it leads to post-traumatic stress disorder for those associated with an event, and sometimes to suicide.
Someone with a growth mindset sees failure as a necessary step towards progress and separates the event of failing from a failure of self.
True failure is failing to learn from failure. Research into resilience, staying power and grit shows that grittier individuals are more successful than their naturally more gifted (intellectually or physically) counterparts who avoid failure. We progress most rapidly when we face up to failure and learn from it.
Organisations and cultures need to redefine how they see failure if they are to learn, progress and become more creative. A stigma is still attached to failure, and it varies, especially across cultures. Ego, or self-esteem, is possibly the biggest obstacle when it comes to failure: some people go to the lengths of self-handicapping, sabotaging themselves and developing excuses in advance to avoid having to admit to real failure.
For centuries, those following the scientific method had to be careful that their findings did not stand in opposition to religion. This delayed progress and often fuelled the propagation of untruths. In more recent times science and religion have been decoupled, allowing science to test ideas and theories and to discover truth from the bottom up without persecution. There is still significant progress to be made within the social sciences, where insight, opinion and feelings are not always suitably challenged. Progress will only be made by changing how we see failure, improving institutional access to feedback and reassessing our systems of learning. Only then can the social sciences benefit in ways similar to the natural sciences.