We judge based on outcomes, not process...
Gareth Lock
Transforming Teams and Operations through Human-Centered Solutions | Keynote Speaker | Author | Pracademic
One of the most powerful psychological effects is the pull of social conformance. As humans we are wired to be part of a group; we want to be associated with others who are similar in outlook, activity or training, or who simply enjoy the same sport. Diving, in the main, is a social activity in which we create friendships and want to help others learn, develop and have fun.
But...when we have an incident, the community doesn’t always help. You only have to look at social media to see the external judgements that take place, especially if a serious incident has happened or someone has been killed. In fact, it is normally only when someone dies that we hear about the event, because it is not possible to hide the fact that someone has died. In addition, because there is a lack of detail, the commentary starts by judging that what the divers did was ’stupid’ and that it should have been obvious it would end this way. This lack of context and detail is a real challenge when it comes to improving diving safety. [See Learning Teams Blog]
However, when we look a little deeper, we can see there are a few biases at play which lead us to make these negative judgements of others.
There are two key biases at play: the first is hindsight bias, which is covered in another blog; the other is outcome bias. In essence, outcome bias is where we judge a decision more critically when the outcome is severe than when it isn’t, even if the process leading to the outcome was the same. The following two examples illustrate this.
A diver who dives independent twins and regularly forgets to swap his regulators exits a 30m (100ft) dive with 30 bar (~450psi) in one cylinder and 210 bar (~3200psi) in the other. He jokes that he had a lapse during the dive and forgot to check/swap his regulators. Another diver in a similar situation on another 30m (100ft) dive has his buddy come to him out of air (OOA) near the end of the planned dive; he donates the working regulator in his mouth and the buddy quickly empties that cylinder. Now they have to conduct an air share on the ascent, the OOA buddy panics, and a rapid ascent follows, leading to severe DCS. Honestly, consider how you judged each of the dives in question.
Which one is judged more severely in your own mind? Now consider a slight variation: what if the second diver with independent twins was the same as the first, and they hadn’t learned from the first dive? Would you judge them more severely? What if the OOA diver had died on the second dive? Would that have made the judgement even more severe? If you did, you are normal. If you didn’t, you have thought quite a bit about this.
Two full-cave divers dived in a system and, approximately two-thirds of the way to their turn-pressure, they got distracted and missed a point where the line changed direction; they swam over a gap and picked up a new line by accident. They did not see the gap. They continued to swim for another 4-5 minutes until the second diver recognised that they were not in the part of the cave they should be, having been in the system 3 years previously. The lead diver had never been there before and so didn’t know better. The following diver signalled that something wasn’t right, they turned around, found the jump they had swum over and missed, and then rejoined their original plan. Another pair of divers were in the same cave, and they too swam over the jump, but didn’t notice they were in the wrong cave and swam to the full extent of their gas plan. During the exit, a collapse elsewhere in the cave system caused a silt-out, necessitating a blind exit. Shortly after this, a failure of the manifold meant that a gas share was required. They reached the end of the line where the jump was, and they drowned because they couldn’t find the exit line and ran out of gas.
How would you judge the jumping of the line in the first case compared to the second?
In both cases, what if the diver was well known, a ‘star’ and a well respected member of the diving community? What about others involved? The skipper? The buddies? The instructor who trained them? We often look for someone to blame, rather than examining the system in which we as divers undertake our sport.
Research in other domains has shown that decision making is where the majority of accidents have their genesis. To improve diving safety, we need to understand the decision-making process of those involved and why it made sense for them to make the decisions they did. Peer pressure, social conformance, time pressures, goal fixation…the list goes on. The diagram below, from Amalberti et al., shows how we often migrate towards more risky behaviours because of the pressures driving in from the right-hand side of the diagram. Whilst this is based on healthcare, the concepts are applicable in many other domains, including diving.
Violations and migrations in health care: a framework for understanding and management. Amalberti, 2006
The problem we have with diving fatalities is that the decision maker is normally dead, which is why, in my opinion, fatalities are a really poor way to learn compared to non-fatal incidents. This opinion is based on the fact that we do not have a robust and independent accident investigation system with clear definitions of contributory factors associated with human factors. Furthermore, many fatalities are the subject of litigation, where the aim is to find out who is at fault, not how we can learn. As a couple of examples: one training agency has at the top of its incident reporting form ‘This form is being prepared in the event of litigation’, which is unlikely to elicit descriptions of failures of the ‘system’; and at DEMA last year, one of the insurance underwriters stated ‘As long as your paperwork is complete, then you will be safe’, implying safety from litigation rather than operational safety. Neither of these attitudes helps learning.
Therefore, in my opinion, non-fatal incidents are much more effective to learn from, but that requires a psychologically safe environment to be effective. A psychologically safe environment is one in which we will not be humiliated or made to feel less good about ourselves if we speak up about concerns we have, if we make a mistake, or if we point out a mistake. Very rarely do people make decisions that will intentionally harm others or themselves; therefore it is essential to understand what shaped their decision-making processes. A "Just Culture" is another way of describing a psychologically safe environment. When a ‘safe’ environment is in place, people are more likely to report their errors or mistakes so that others can learn. Within small communities this can be made to work because there is a level of trust, but it still needs to be actively managed in case divers fall back into old habits. What would be great is if the community could grow up and stop throwing rocks at others for being ‘human’ and making mistakes.
One of the key outputs from detailed narratives of non-fatal incidents is the stories which can be retold during training or coaching - in effect, giving examples of ‘real world’ diving incidents/accidents and why drills or protocols are in place. By teaching divers the ‘why’ behind the ‘rules’, we allow them to problem-solve more effectively when they encounter a situation which doesn’t match their training or ‘rules’. They can then refer to the mental models which have been created, which leads to better decision making. This is why experts make better decisions than beginners or inexperienced divers - they have more models to relate to. This simple model from Gary Klein of Recognition Primed Decision Making explains this process and shows why training should not just be about skills taught in a rote manner, but about the ‘why’ behind what is done.
Recognition Primed Decision-making Model. Streetlights & Shadows. Klein, 2009.
So how do we address this?
First, we must recognise that each event will have multiple perspectives, many of them different from 'ground truth'. Indeed, it is often difficult to determine what the ground truth is in diving accidents, given the lack of 'black box' data. This means we have to rely on eye-witness testimonies, which have been shown to be flawed in so many cases. Furthermore, each time a story is told, new memories are created, with some information lost and other bits added, which means the story evolves over time - this isn't lying, this is human memory encoding and recall at work! By understanding that each person has a different perspective and experience, we recognise that their decision-making process is theirs and not ours. If we truly want to understand why certain decisions were made, we must put ourselves in the shoes of others and take into account their experience, training, goals and pressures, as these all shape and influence decision making. The decisions they made were locally rational.
Secondly, we need to examine the point of the investigation. If it is to blame someone (as in the case of litigation), then we have to accept that learning is going to be limited and that the facts about why it made sense won't be available; rather, the narrative which develops will be about who is to blame. However, if we are looking to learn from the incident and improve future dives, we must stop looking at whose fault it is, and instead ask how it made sense, what the systemic issues at hand were and how they influenced the decision-making process, and correct (if possible) the system rather than fix the individual.
Finally, we need to create an environment in which divers can talk about their mistakes, even the really 'stupid' ones. I often get sent stories about mistakes that have been made, but the authors/contributors do not want them made public for fear of being recognised and then socially castigated. These stories involve all of the drivers above - money, time, experience, goal-fixation... - and highlight their influence on the sub-optimal decisions that were made. We need a psychologically safe environment. We need a Just Culture. We need to recognise we all make mistakes, irrespective of skills or experience. We need to stop judging people on outcomes, and look at how the process failed. Good outcomes don't always mean good process. Fixing the failed or flawed processes will improve diving safety. Focussing on outcomes won't.
If you want to learn more about Just Culture and judgement in the established environment of Air Traffic Management, visit Steve Shorrock's excellent article here.
Footnote:
The Human Factors Academy provides two classes to improve human performance and reduce the likelihood of human error occurring. The online class provides a comprehensive grounding in Human Factors, giving you the basic skills needed to improve human performance and reduce error, whereas the classroom-based class is very comprehensive and intense, with plenty of opportunity to learn from failure and error, and provides an opportunity to reflect on behaviours and performance.
Online micro-class (9 modules of approximately 15 mins each) details are here. https://www.humanfactors.academy/p/microclass
Upcoming classroom-based course dates are here https://www.humanfactors.academy/p/dates
More information on Human Factors Skills in Diving classes can be found at https://www.humanfactors.academy