Investigators are human too: outcome bias and perceptions of individual culpability in patient safety incident investigations
This study explored whether outcome bias might explain why healthcare investigations focus on individual culpability rather than on addressing latent conditions in the system.
212 participants were each allocated to one of three scenarios, followed by the findings of an investigation (see scenario overviews below).
For background:
· Prior work has identified that the “overwhelming majority of recommendations developed following serious incident investigations would be categorised as ‘weak’”
· Recommendations often focus on the behaviours of individuals, such as reminders, training or rewriting procedures, rather than on addressing latent conditions
· It’s said that the “patient safety movement has struggled to shift the focus from people to systems; and this may be a reason why we are still not ‘learning’ from patient safety investigations”
· Investigations can themselves “compound or add harm to those involved or affected by the incident, investigation or subsequent recommendations”
· There is a body of evidence that judgements and attributions of individual responsibility or culpability are influenced by the outcome of an incident
Results
Key findings were that:
· “Worsening patient outcome was associated with increased judgements of staff responsibility for causing the incident as well as greater motivation to investigate”
· “More participants selected punitive recommendations when patient outcome was worse”
· “While avoidability did not appear to be associated with patient outcome, ratings were high suggesting participants always considered incidents to be highly avoidable”
· People with patient safety expertise were less likely to demonstrate these attributions, but still did to some extent
· “Outcome bias has a significant impact on judgements following incidents and investigations and may contribute to the continued focus on individual culpability and individual focused recommendations observed following investigations”
Discussing the results, it’s said that outcome knowledge is linked with changes in how individuals judge incidents and this is, to an extent, irrespective of their background or previous experiences with incidents.
They note that some of the effects were small in absolute terms, but nevertheless “we propose even small effects on responses could have a significant impact on patient safety investigations”.
Increasing outcome severity is associated with increased judgements of responsibility but not avoidability:
The more severe the outcome, the greater the responsibility assigned to the staff involved. This outcome-knowledge bias on responsibility has been demonstrated in other research too. Avoidability wasn’t found to be an influencing factor, but participants believed that all of the scenarios were avoidable anyway.
They said that hindsight bias and outcome bias both involve “the projection of new information into the evaluation of past events or actions”, but hindsight bias involves “the denial that outcome information has influenced judgements”.
Increasing outcome severity is associated with increased judgements of importance to investigate:
People believed that the more severe the incident, the more it needed to be investigated. Hence, the level of harm was a key determinant for what gets investigated.
Increasing outcome severity is not associated with more recommendations but is associated with more punitive recommendations:
The number of recommendations did not vary between the levels of harm or participant groups. However, there was a tendency to make recommendations rather than not. They cite a study suggesting that people prefer additive change over subtractive change, “for example, adding a checklist rather than removing one”.
It’s implied that perhaps investigations “contribute to the creation of safety clutter or low-value safety”.
Also, knowledge of a severe outcome biases recommendations towards punitive recommendations.
Expertise in patient safety can reduce some biases:
While expertise appears to mitigate perceptions of responsibility and reduce the selection of punitive recommendations, it doesn’t fully eliminate the effect.
They observed that “When the patient outcome was death, the proportion of experts and staff selecting punitive recommendations doubled”.
Ref: Lea, W., Budworth, L., O'Hara, J., Vincent, C., & Lawton, R. (2025). Investigators are human too: outcome bias and perceptions of individual culpability in patient safety incident investigations. BMJ Quality & Safety.
Sociological Safety | The Sociological Workplace | Trivalent Safety Ecosystem
Nearly a decade ago, implicit bias expert Dr. Anthony Greenwald publicly said: “…unfortunately, we don’t yet know how to undo implicit biases” despite decades of research and immense amounts of data. “…scientists and researchers don’t know how to undo implicit bias,” “Only [the] individual can change themselves and ONLY if they think they need to change…” “these are things that individual people cannot by themselves change, and more important, they’re changes that require institutional changes that are just not about to happen, almost no one is motivated to do.” It’s all still true. Attempting to debias people, especially whole professions, is doomed in the absence of institutional change. #SociologicalSafety
Specialist at Island Health
Important article. I might suggest a focus on the cognitive biases of executive leadership, since they are the group most interested in maintaining the fallacy of fundamentally safe systems, leaving of course the bad apples as the ‘explanation’ for the harm. Entire patient safety departments are run exclusively through the outcome bias lens, with little to no reflection or courageous counter arguments to the people they report up to.
Expert in Geriatric Emergency Medicine/Quality Safety Expert/ Professional Coach-Mentor/Medicolegal expert
A great driver is the way in which legal deliberations are conducted in clinical negligence cases. The adjective “gross” is the most subjective and unreasonable term used to elevate a negligence to the dizzying height of being a criminal offence… Investigators are keen to ensure their reports satisfy legal teams and not safety specialists.
Thanks for sharing, Ben!
CAAM, CPEng, FS Eng (TüV Rheinland), MIEAust, NER, and RPEQ
The following may be of interest, though you may already be aware:
1. He also pointed out the tendency for management to shift responsibility for accidents to the less powerful, typically the front-line workers, by invoking “human error” as the cause while ignoring the effects of working conditions, productivity pressures, and choice of technologies, for example.
2. MIT Professor of Humanities, Sociology and Anthropology Susan Silbey has criticized the simplistic notions of safety culture popular in healthcare (and other industries), which assert that complex, hard-to-control technologies can be managed by means of a healthy safety culture. These beliefs subtly shift the responsibility for accidents to low-level workers, who are criticized for showing a deficient safety culture.
3. A bit off this topic, though may be of interest regarding quality, per our historical discussion: the quality improvement movement in healthcare preceded the safety movement by about a decade. Ex: https://www.amazon.com.au/Still-Not-Safe-Middle-managing-American/dp/0190271264
Regards, Lyle.