Replacing Hindsight With Insight: Toward Better Understanding of Diagnostic Failures

Another absolute banger of a paper co-authored by one of my favourite authors – the late, great Bob Wears.

As is often the case with papers by Bob and colleagues, these brief 3-4 pagers are the most densely packed and the most difficult to summarise. I’m using a lot of direct quotes because the language is poetic and, I think, has utility in the way concepts are expressed and framed.

This paper talks about the corrupting, but understandable, factors that lead us to focus on constructing the past via hindsight and outcome bias in diagnostic failures, and then proposes some ways forward.

Out of the gates, they observe that:

“Reviews of malpractice claims have a morbid attraction that is similar to gazing at crash scenes. Both provide the observer with a vicarious, cathartic experience”. [And it’s not lost on me that I’m one of these people, obsessed with studying unsafety as much as everyday work.]

These stories of failure are popular because they “support a perception of control that has important psychological, social, and political benefits by making a complex, chaotic, and irreducibly uncertain world appear to be simpler and more linear”.

In medicine, reviews of diagnostic failures “typically find fault with the thinking or behavior of individual physicians”, e.g. physicians failed to do or observe something.

These post-hoc rationalisations “conveniently skirt the identification of other causes that have higher stakes”, like finding flaws in equipment design or processes, which would require substantial investments to resolve or “embarrassing shutdowns or retooling”.

Finding factors related to management could threaten those in charge, whereas finding that a worker had a “cognitive breakdown” may preserve the status quo, and “provides a convenient, default conclusion when no other explanation is immediately apparent (or desired)”.

The study of diagnostic failures is said to have received comparatively little attention in patient safety as of this study’s publication, which they find strange but understandable, because “understanding diagnosis-related failures is difficult” and slow. It has been slow because existing means have relied on “impoverished problem worlds and unrealistic models of human performance, and ignorance of the effects of hindsight bias”.

Models of Human Performance

Here they argue that many inquiries into diagnostic failures have “viewed the physician as an information-processing device that is usually flawed”, and where decisions and actions are viewed as discrete events, rather than as continuous flows of activity.

Moreover, informational cues present in the activity are said to be viewed as clearly available nuggets of objective knowledge, rather than “constructions that workers build from their own expertise and expectancies”. Physicians are also seen as individual agents working in isolation rather than as heterogeneous groups of clinicians working together.

On perception and mental construction: physicians are presumed to react to some state of the world instead of “anticipating some possible future state and acting to facilitate or forestall it”, and to operate via “hypotheticodeductive or Bayesian reasoning, rather than perception”. This model of diagnostic reasoning is not well calibrated to what people in the real world actually do.

They argue that viewing diagnosis as a problem involving perception and sensemaking is more promising as a lens of understanding human performance, as it recognises that “real-world problems do not present themselves as givens but must instead be constructed from circumstances that are puzzling, troubling, uncertain, and possibly irrelevant”.

That is, physicians must convert these circumstances into ‘problems’ by making sense of an uncertain and disorganised set of conditions that “initially makes no sense”.

They argue that rather than being misdiagnosed, problems may instead be misperceived, which explains why those who have been labelled as having “made the wrong decision” instead “saw it at the time as the only reasonable one to make”. Hence, assessments and actions that in hindsight look to others like ‘errors’ were, to the people at the time, “unremarkable, routine, normal, even unnoticeable”.

Sense-making is a “perceptive act that turns circumstances into situations, combinations of circumstances at a given moment in a particular context that are imbued with meaning and can serve as a foundation for actions and expectations”.

Next they discuss the role of hindsight and outcome biases; i.e. the power of “delusional clarity”. These effects are “powerful and insidious and make it hard for historical analyses (such as root cause analysis or closed claim review) to yield useful understandings of accidents or adverse events”.

With the benefit of hindsight, those who know what happened after the fact “consistently overestimate what others who lacked that knowledge could have known”. Here they say that those “who know the outcome of a complex prior history of tangled, indeterminate events, [view] that history as being much more determinant, leading ‘inevitably’ to the outcome they already know”.

Hindsight they say “converts the disjointed and disorganized array of disparate events that the participants faced into a coherent causal framework that the reviewer uses to “explain” what happened”.

Outcome bias is similar: those who know the outcome use it to judge the quality of the process and decisions of people prior to the event, rather than evaluating those decisions against what was actually knowable at the time.

Hindsight and outcome biases are “preconscious and cannot be overcome by simply willing ourselves to ignore them” and cautioning people “not to let outcome knowledge influence their judgment is equally ineffective”.

Interestingly, they argue that hindsight bias is “so powerful and so pervasive” that some others contend that “it must be fundamentally adaptive, arguing that our minds evolved not to understand and explain the past but to quickly and efficiently adapt to the future” (emphasis added).

Via these adaptive processes, making history simple “primes us for complex futures by allowing us to project simple models onto those futures”, allaying anxiety and concern over our inability to control the uncertain future.

“Errors” they say are “like optical illusions: simultaneously convincing and false”. Errors, which can only be seen in hindsight, are “powerful ways for external observers to organize and impose structure on past events, to reconstruct a past reality”. They convert complex and confusing histories into linear and more certain/less ambiguous narratives.

Or as one physician quipped in a study cited in this paper, “. . . the errors are errors now, but they weren’t errors then.”

The authors suggest that questions like “How could they not have noticed?” or “How could they not have known?” arise not because “people were behaving bizarrely but rather because we (the reviewers) have chosen the wrong frame of reference to understand their behavior”.

That is, we don’t learn much by asking why people chose a framing of the problem that turned out to be wrong (e.g. a misdiagnosis, or some other type of error); we learn more by discovering why that framing seemed so reasonable to them at the time.

Hence, making progress involves several steps; one is giving physicians better tools for understanding clinical practice.

First, they say we need more sophisticated and nuanced models of diagnostic reasoning that “accord to what people actually do, rather than what we imagine should be done [and this] … requires abandoning sterile laboratory exercises in favor of studying practitioners in the real world—as it were, ‘in the wild’”.

These approaches to studying work require being sensitive to context and to the preconscious processing “that experts in a field of practice use unawares”.

Second, they say we must minimise the effects of hindsight and outcome biases by drawing on certain techniques. Some post hoc analysis techniques seek to reconstruct the world as it appeared to people at the time of the event. Such reconstruction “makes it possible to understand why and how people were led into behavior that seemed so right but turned out to be wrong”. These approaches require attending to the “messy details” of clinical work.

Third, they say we should avoid the temptation to draw on data that were collected for different purposes. Some of these data were created to allocate responsibility or blame, and may be inherently limited because they were “carefully crafted to advance distinct social purposes and are inextricably entangled with the context of their production”.

Patient safety is said to have relied for too long on retrospective error elimination strategies, leaning heavily on hindsight to identify the “causes” of “errors”. But, importantly, physicians, like others, are “condemned to live in the future, not the past”.

Other approaches better suited to these goals are said to draw on work ecology.

Link in comments.

Reference: Wears, R. L., & Nemeth, C. P. (2007). Replacing hindsight with insight: Toward better understanding of diagnostic failures. Annals of Emergency Medicine, 49(2), 206-209.

Comments

Thanks for all of your work, Ben Hutchinson. I am an enthusiastic novice of new view safety and your posts are slowly forming a giant aspirational to-do list! I just had to comment today because I loved that you described this paper as a banger. Your PhD supervisors must be doing something right that you are still so enthusiastic! Your next post might need to be your desert island/top 5 mixtape of new view literature?

Patrik Lund


Ben Hutchinson, regarding the following: “Reviews of malpractice claims have a morbid attraction that is similar to gazing at crash scenes. Both provide the observer with a vicarious, cathartic experience” – this does not seem consistent with ignoring equipment and management failures. The first seems to be based on confidently knowing exactly what happened, whilst the second is based on purposely ignoring any evidence that indicates unwanted types of failure.

David F.


Great read, thanks Ben. A few big words that I’ll have to look up, like hypotheticodeductive! I agree it’s well written; does he write on the construction industry?
