Experts under the microscope: the Wivenhoe Dam case
A fascinating read from Sarah Maslen and Jan Hayes, examining expert blame in the aftermath of the 2011 Queensland floods.
The source of data is the decisions made by the expert engineers running the Wivenhoe Dam, as analysed by the Commission of Inquiry (the Commission).
There’s way too much in this paper to do it justice. But I’ll take a stab.
Providing background:
· The Commission evaluated the operation of the Wivenhoe Dam (outside Brisbane) and the lead-up to its release of large volumes of water, which swelled the Brisbane River and caused severe downstream flooding
· High-stakes decisions, such as engineers deciding to release large quantities of water, are typically guided by procedures. For Wivenhoe, a dam operating manual set out the objectives and strategies for flood conditions
· Experts testified during the Commission that “dam engineers had achieved the best possible outcome in the difficult prevailing conditions, but this testimony was not enough to clear the engineers of responsibility for the result”
· The Commission instead focused on whether the engineers operated the dam in alignment with the manual during the flood event
· The authors argue that this case raises questions such as: who can, or should, have the power to judge an expert’s decisions after a disaster? And what does this line of reasoning accomplish for learning and prevention?
· Prior decision-making research has indicated that “only other experts can unassailably assess experts … [and] Rules and procedures are important and yet expert practice goes beyond simple compliance”
· The authors discuss ‘Professional practice in an age of risk intolerance’. One facet of an increasingly risk-intolerant society is a heightened deference to rules and credentials, e.g. accreditations, formalised systems, continuing education etc.
· A distinction is necessary between the formal transmission of expertise, e.g. via training courses and seminars, and the tacit knowledge acquired through experience, story sharing, feedback etc. Both are essential for accurate and responsive expert decision-making
· Discussing the links between performance, rules and expertise: “good expert decisions are not always the product of following procedures to the letter” (emphasis added). Rather, a skilled practitioner “consults the world, rather than … rules, propositions, [or] beliefs … for guidance on what to do next”
· Others have emphasised the danger of an over-reliance on procedures across industries like aviation, offshore oil & gas, and process plants
· This view of expertise “sits uncomfortably in the context of risk intolerance”
· Hindsight bias is another challenge in this area. As Dekker notes, we “overrate the role of rule- or procedure ‘violations’”. And while there is almost always a gap between work as imagined (procedures) and work as done, “that gap takes on causal significance once we have a bad outcome to look at and reason back from”
· Quoting Hidden from the Clapham Junction accident inquiry: “There is almost no human action or decision that cannot be made to look more flawed and less sensible in the misleading light of hindsight”
· This climate of risk intolerance is said to drive undesirable outcomes for professional practice, like 1) a reduction in the efficiency and quality of expert decisions, e.g. by inhibiting expertise development. An example is defensive medicine, where doctors are more likely to order unnecessary and excessive tests and procedures to minimise legal liability; and 2) inhibited learning from incidents, since blame interferes with reporting and undermines longer-term learning
· In the case of Wivenhoe, it’s noted that “blame was assigned to people who were judged by their professional peers to have done a good job”
Results
Engineers at Wivenhoe work in an environment with clear objectives and priorities. During a flood, their key objectives are to ensure the structural safety of the dam and to provide optimum protection of urbanised areas from inundation, among other goals.
Managing these objectives is “no doubt, a complex and hazardous business”. For the commission, obeying the manual was the key focus.
The “apparent insistence [by the commission] that the best decisions are made by slavishly following procedures is at odds with the literature on expert decision-making and excellence in safety”. The authors note how high reliability organisations defer to expertise, allowing responsibility for decisions to rest with the experts.
Engineers reported the view that “compliance alone is not a sufficient strategy for ensuring excellent safety outcomes in complex industries” and that organisations engage expert engineers for professional judgement, in the form of risk management “over and above a simple compliance approach”.
Further, while the manual allowed the senior flood operations engineer on duty to apply reasonable discretion in managing the flood event, the commission interpreted this to mean that professional judgement was always mediated by the written instructions and that there should be “no direct link between expertise and action”.
They logically point out that major decisions that affect thousands of residents shouldn’t be made with no predetermined constraints. Yet, “if the best options for dam operation in an emergency could be completely specified in advance, then making such decisions is not a specialist task and anyone could do it. Of course this is not the case”.
As argued by the Qld President of Engineers Australia, “professional engineers do not just blindly follow manuals” but they rely on years of education and practical experience, culminating in professional judgement.
Moreover, “Complex decisions, especially those made under time pressure, are multifaceted relying on experience, judgement and procedures”. Hence, a myopic focus on compliance provides a poor basis for quality decision making.
The Wivenhoe engineers were required to balance several competing objectives. First, they succeeded in their most critical role: protecting the structural integrity of the dam. Second, their objective of minimising downstream flooding was apparently not met.
Despite accepting the view that the engineers’ decision making resulted in a near-perfect outcome, and that downstream flooding may not have been entirely avoidable, the commission “also found that compliance with the Manual was a key factor apparently independent of the quality of the choices made and the outcome that the engineers had achieved”.
The commissioner heading the inquiry was a non-expert, so her assessment of the engineers’ competence was an ‘upward’ judgement. Upward judgements are made against some type of external reference, like qualifications, standards or social consensus.
This is problematic, however, because “in the case of assessing risk management ‘it is easier to audit and assess deviations from procedures or processes than understand … safety on a case-by-case basis’” (emphasis added).
In contrast, peers from related fields offered their own assessments of the engineers’ decisions, being downward or horizontal judgements. These are seen as more robust, as they stem from the peers’ own expertise and understanding of the dam engineers’ roles.
For all of the above criticism, the commission itself found that the manual was “confused and so compliance on the part of the engineers was difficult, if not impossible”.
The authors then discuss the problem of blame. They note “When things go wrong, blame can take on a life of its own outside of expert opinion or even common sense” and “It is easier to assess judgements in the context of their procedural adherence” (emphasis added).
A fallacy of counterfactual logics is believing that if rules were simply followed, then nothing bad would have happened. Hence, “it is reasoned that failures must issue from lack of diligence or from wilful (and possibly spiteful) neglect”.
Why would the commission focus on the operators and blame logics? As another author quipped, somebody always has to be at fault. Moreover, research suggests that non-experts judge decision quality mostly by outcomes (outcome bias) rather than by the quality of the decision process itself.
The authors argue that “Any review of expert judgements in the wake of a disaster is therefore bound to be fundamentally problematic” and resulting investigations and reviews may, implicitly or explicitly, focus on finding a scapegoat.
The problem of assigning blame is that it “incubates a fear which, in turn, inhibits a readiness to learn from incidents … This can fundamentally shape professional practice”. As discussed earlier, defensive medicine is a result of blame and fear of liability. Some data showed that compensation for medical error is based more on the level of the patient’s injury than on the quality of care they received from the doctor.
Doctors, for instance, “spend time that could otherwise be spent on patient care, on building a defence against blame including the tendency to practice defensive medicine (e.g. ordering unnecessary tests) and spend excessive time on bureaucracy”.
This effect has also been documented among engineers – called defensive engineering. Some engineers noted that a lot of their work “is about covering your arse”.
While more experienced engineers may think beyond compliance, ultimately when things go wrong “expert engineers are required to defend their decisions … [and] then legal processes fall back on compliance as the required standard”.
Further, “Linking accident causality so strongly to the actions of individuals also effectively ignores research that shows accidents are fundamentally suffered by organisations, not individuals”.
In concluding, they argue that “In the aftermath of disaster, an emphasis on blame has significant potential to interrupt the goal of getting the best long-term outcomes, in the sense that it encourages professional practices that are not optimum for safety, and is a barrier to learning from incidents”.
It is pertinent that we learn as much as possible following major accidents, particularly given the role of climate change in exacerbating such disasters.
Yet, this “Commission did not create an environment in which learning was supported”.
Link in comments.
Authors: Maslen, S., & Hayes, J. (2014). Experts under the microscope: the Wivenhoe Dam case. Environment Systems and Decisions, 34, 183–193.
Principal at Tetra Tech Canada| High Complexity Mine Waste, Tailings & Water Embankment Dams| Underground Geomechanics| Flood Resiliency| Hydroelectricity| USACE and USBR certified RA & FMEA Specialist in Dams & Levees|
10 months ago: It is a masterpiece of a summary - well done; many facets quoted in the article have touched my professional engineering practice. Ben Hutchinson
Retired Partner and Board Member now Director and Special Adviser at PwC
11 months ago: Wonder what AI/integrative AI might have come up with from the combination of: human-centred procedures (the manual and all its errors, omissions and confusions, plus the good stuff); real-time information that sits outside the usual “working envelope” and the probabilistic reactions of the “system”; and the opportunity (or not) to have human override of the system? Decision sciences point to the human limits of both predicting and reacting in complex, real-time settings - would robotics with the same parameters have made a better decision(s), and with better choice architecture? The assignment of blame tends to be a function of legal process; using formal policy/procedure as the sole inquiry rationale continues to be a challenge for us all
Director on QHSE at Profconsult ISM
11 months ago: Perfect job, Ben - both the authors’ and yours, for bringing this story to the surface. The sad thing is that such investigations themselves become sources of hazard. They distort the picture of what happened and make it impossible to draw the right conclusions and take the right actions. Is there any responsibility for the results of the work of such commissions? Are there any mechanisms for verifying them? Or nothing but conscience and professionalism?
Senior Dam Safety Program Engineer, DamCrest Consulting
11 months ago: Thanks Ben for your insights on the Wivenhoe Dam operational flooding incident and its aftermath. We engineers need to study disasters and incidents to better understand the human and organizational factors that precipitate them, so we can prevent them in the future.