Managing Risk in Complexity
Challenging perceptions
This article explores Gary Klein’s challenge that risk management does not work for complex risks. Most people accept that risks can be predicted, analysed and managed, regardless of the context. In his book "Streetlights and Shadows", Klein contests this belief, challenging the idea that we can systematically foresee and manage risks within a complex system. He wrote his critique over a decade ago, well before the pandemic and other recent systemic and complex risk events. Although many of the arguments he makes are not new (they are the foundations of resilience engineering and a systems approach), with the increasing claims that technology can predict and forecast safety risks, Klein’s critique is worth revisiting.
Simple versus complex
Firstly, it’s worth noting that Klein’s critique is focused on complex systems. The case studies provided cover a variety of complex environments, including a fire within a high-rise building, the failure of an automotive supply chain, the failure of a casino, failures in military planning and the loss of the space shuttle Columbia. Although Klein does not provide a definition of a complex system, he is referring to a socio-technical system with high interaction and interdependency between its different parts. This could also include novel situations that are unpredictable and in which we have no prior experience.
Dangerous and delusional
Klein does not wholeheartedly reject the identification of risk or planning. He challenges the belief that all the significant risks within a complex system can be foreseen, analysed and mitigated. This, he argues, is dangerous and delusional, lulling organizations into a false sense of security that they have considered all eventualities and that risks are controlled. He points to the paradox of managing risks in complexity: “Organizations that try to eliminate risks are playing so as not to lose, which increases the chance that they will lose.” Rather than placing so much focus on accurately identifying, assessing and managing risks at the outset, Klein argues for agility, capacity and resilience to respond to risks as they arise. We should reject “mechanical strategies” that seek to identify, quantify and prioritise risks, and instead enable managers to understand and make sense of situations, so they are alert and able to respond and adapt to emergent problems.
Predicting risks in complexity
Distinguishing between situations that are simple and familiar versus complex and novel, Klein reminds us that the prophecy of risk works in well-ordered environments in which we are experienced, but is problematic in complexity. We have a misplaced faith in our ability to identify risks. The most devastating risks in complexity are those that no one expected or considered realistic. This aligns with Taleb's Black Swan concept.
Similar to Anthony Hidden’s cautionary words on the misleading light of hindsight, Klein’s critique of our ability to predict risk in complexity is a useful reminder of the pitfalls of retrospectively arguing that a risk should have been foreseen and mitigated. In complexity, the “obvious” clues and signs are only ever clear to us after the event.
I think it's quite useful to distinguish between identifying a risk and predicting how the scenario will unfold. For example, the risk of a pandemic featured on organizational risk registers for decades before COVID-19, and many firms developed plans. But I have yet to encounter anyone who foresaw how the pandemic would unfold, such as the use and effect of lockdowns, remote working, international travel restrictions, etc., and the secondary effects that followed. Many of us can predict that a risk is significant, but modelling how the scenario plays out in a complex system requires Prometheus levels of foresight.
Assessing risks
Central to Klein’s critique of risk assessment is his rejection of the suggestion that risks within a socio-technical system can be decomposed into their component parts. Where there is high interaction and interdependency between the different parts of the system, risks cannot be examined in isolation. The situation is compounded when a system is not just inherently complex but the situation is also novel and we have little experience. This is where the questioning must reverse and focus on what we don’t know, what we are uncertain of and where we have doubt.
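To make the decomposition problem concrete, here is a minimal sketch using entirely hypothetical numbers (not from Klein's book): assessing two components in isolation, and assuming their failures are independent, can badly understate the joint risk when a shared dependency couples them.

```python
# Hypothetical illustration: decomposed, "independent" risk estimates
# versus a common-cause coupling between two parts of the same system.

p_a = 0.01  # assumed probability that component A fails in a given year
p_b = 0.01  # assumed probability that component B fails in a given year

# Assessing each part in isolation and multiplying, as if independent:
p_both_independent = p_a * p_b            # 0.0001

# If a shared dependency (power, supplier, software) means that when A
# fails, B usually fails too, the joint risk is set by the coupling:
p_b_given_a = 0.9                         # assumed common-cause coupling
p_both_coupled = p_a * p_b_given_a        # 0.009

print(f"independent estimate: {p_both_independent:.4%}")  # 0.0100%
print(f"coupled estimate:     {p_both_coupled:.4%}")      # 0.9000%
# The decomposed view understates the joint risk here by a factor of 90.
```

The numbers are invented, but the structure of the error is the point: the more interdependent the system, the less the product of isolated assessments tells you.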
Klein’s second argument against risk assessment is one I’ve seen often: the delusion of numbers. When a complex risk is simplified to a number and a colour, sensemaking can quickly be lost. We do not adequately account for the delusion of numeracy and how numerical estimates contribute to overconfidence. I fully subscribe to this.
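As a hypothetical illustration of how a number and a colour can flatten sensemaking, consider a standard 5x5 risk matrix (the scales and colour bands below are my assumptions, not Klein's): two very different risks can land in the same cell with the same score.

```python
# Hypothetical 5x5 risk matrix: score = likelihood x severity (1-5 scales),
# with assumed colour bands. The point is what the single number hides.

def matrix_score(likelihood: int, severity: int) -> tuple[int, str]:
    """Return the matrix score and its colour band."""
    score = likelihood * severity
    if score >= 15:
        colour = "red"
    elif score >= 8:
        colour = "amber"
    else:
        colour = "green"
    return score, colour

# A frequent, well-understood operational risk and a rare, novel,
# poorly understood one collapse into the same cell and colour,
# erasing the uncertainty, coupling and novelty that matter most:
print(matrix_score(likelihood=4, severity=3))  # (12, 'amber')
print(matrix_score(likelihood=3, severity=4))  # (12, 'amber')
```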
A third critique picks up on Taleb’s Black Swan metaphor but goes further. It’s not just that we cannot predict catastrophic events beyond what is normally expected; even when a Black Swan risk appears in front of us, we cannot comprehend the threat, so we rationalise it, deny it and discount it until it is too late. Anyone who has studied a disaster will surely recognise the “risk blindness” described here.
Risk plans
In questioning planning, Klein draws out one of the paradoxes of managing risk in complexity: in unpredictable environments, the risk defences can become part of the problem. Once a plan is developed, it can contribute to a false sense of security that complexity has been mastered. In addition to leading to optimism bias, plans frequently fail to adequately consider human behaviour. These arguments draw on the work of Henry Mintzberg, Karl Weick and Kathleen Sutcliffe, who have pointed to the downsides of planning; I have summarised some of their arguments in the image below. Weick and Sutcliffe capture the paradox of planning: “plans create mindlessness instead of mindful anticipation of the unexpected”. This is one of my favourite lines on the delusion of having a plan, reminding me of the infamous line by boxer Mike Tyson: "everyone has a plan until..."
To be clear, this is not an argument against planning, but a reminder of the unintended consequences and limitations of plans, which we often do not recognize. The desire to define and document a response for every eventuality is so prevalent within safety and risk management that the downsides are rarely recognized. Moreover, it can appear counterintuitive that creating risk mitigation plans may actually increase risk by blinding and distracting us from novel, emerging and unrecognized threats. Klein also highlights that the imposition of procedures and constraints for complex risks can reduce flexibility and agility, which in turn makes organizations less able to adapt and respond. I have included some links below for those interested in learning more about this paradox.
Monitoring risks
Klein’s critique of risk monitoring is two-fold. Monitoring creates a self-serving bureaucratic process that takes people away from understanding the risk; the monitoring process becomes the goal, which many will probably recognize. In complexity, this creates distraction, where the artifacts of monitoring, such as analyses and metrics, can further reinforce the illusion of control.
Concluding points
To summarize, the key arguments against applying conventional risk management in complexity are: the limits of prediction, the problem of decomposing interdependent risks, the delusion of numbers, the paradox of planning, and the distractions of monitoring.
Dr Klein concludes by offering an alternative statement: We should cope with risk in complex situations by relying on resilience engineering rather than attempting to identify and prevent risks.
I find his critique very interesting and am persuaded by many of the points he makes. What do you think?
Some further reading:
Gary Klein’s "Streetlights and Shadows"
Henry Mintzberg’s "Fallacy of Predetermination"
Karl Weick and Kathleen Sutcliffe’s "Managing the Unexpected: Assuring High Performance in an Age of Complexity"
Senior Manager - Governance, Risk, Safety & Compliance
11 months ago · Appreciate the analysis James. I’ll have a deeper dive into Klein’s work to get a closer perspective, but my initial take from your article is that it appears to treat risk in complex systems/orgs as a static analysis that attempts to eliminate or mitigate the hazard and move on with operations. I may well be wrong, and no doubt the deeper read will add nuance. There are fundamental tenets to each stage of risk assessment, not the least of which is monitoring controls for effectiveness and confirming that all hazards were identified. It’s an iterative process that relies on continually studying its own effectiveness. Where we allocate a control, the control owner has to be genuinely appropriate to the risk, and they have to have sufficient training, knowledge, experience etc. to be able to monitor its effectiveness, rather than passing the buck to safety personnel to include control reviews in a generic safety inspection. Again, thanks for another considered and stimulating post!
Safety & Risk Executive | Human & Organizational Performance
11 months ago · Really insightful - agree that we (industry in general) have a tendency to oversimplify through the application of risk assessment tools and matrices because it provides comfort in having 'addressed the issue'. As Ruud mentions below, the major challenge is recognizing when we are dealing with simple, complicated, complex or chaotic systems and applying an appropriate level of analysis/resilience.
LATAM Regional Safety & Security Risks
12 months ago · I agree. Also worth considering are the unavoidably chaotic scenarios that make risk handling a "wicked problem", and the challenge coming from a constructivist perspective as opposed to traditional positivism. Thanks for sharing.
Author of 'Enterprise Architecture Fundamentals', Founder & Owner of Caminao
12 months ago · Risk management is best achieved when framed by the kind of cognitive capacity involved (observation, reasoning, judgment, experience) and the reliability of corresponding resources (data, information, knowledge). https://caminao.blog/overview/knowledge-kaleidoscope/llms-the-matter-of-regulations/
Professor of Process Safety
12 months ago · Part 1/3 Francis Fukuyama and Mark Twain. Interesting and thought provoking! If we accept it is not a zero-sum game (as David Slater helpfully points out) then perhaps the message is not so much that 'risk management does not work for complex risks' (a piece of click-bait, like announcing the end of history) but that it is just one of a suite of tools. My experience is that everything is simple to those who don’t have to do it. All technology used by humans is socio-technological – people designed the hardware, programmed the software, and when people use it they will eventually do so in ways never envisaged by the designers. What looks like a well-ordered environment is often held together by a few key individuals, and is just a resignation or retirement away from risky chaos unless there is a systematic risk management process underpinning the organisation. Rather than rejecting out of hand "mechanical strategies" that seek to identify, quantify and prioritise risks, we should use analysis as the foundation for risk control. Trained and capable people can then build and maintain a truly adaptable and resilient risk management structure on that foundation. Part 2/3 Donald Rumsfeld and Erik Hollnagel