7 Cognitive Biases That Lead to Unsafe Work

The following is adapted from Rethinking Hand Safety.

Have you ever thought, “Oh, that won’t happen to me,” even if statistics say it’s likely?

Do you always take the same route home from work, even though there might be other, better options?

Have you skipped wearing sunscreen because “just a couple of hours of sun won’t do much harm”?

If so, you’ve operated from an unconscious cognitive bias (specifically, and in that order: ignoring the baseline, the default bias, and underestimating cumulative risk).

When we humans make decisions, we follow all kinds of assumptions based on what has happened to us before, what others have told us, and what we see right in front of our eyes. Without our natural biases, we wouldn’t really be able to function, as we wouldn’t be able to make any decisions at all.

Sometimes, though, our biases pose a danger to us, by leading us to act in unsafe ways. To be safer at work, we have to be consciously aware of our biases and act against them when needed. And yes, this applies to both management and workers.

Let’s take a closer look at seven cognitive biases and how they can lead to unsafe work.

#1: Overconfidence Bias

Hardly anyone considers themselves overconfident, but nearly everyone is. Put simply: Most people believe they are more agile, smarter, and better at most tasks than they actually are.

As an example, do you believe that you are an above-average or below-average driver? No less than 93 percent of people who respond to that question say they are above average, and since only about half of us can be, a great many of them must be wrong.

In the workplace, overconfidence translates to doing things like skipping safety processes, assuming we can work as safely at 4 p.m. as we did at 10 a.m., ignoring the fact that the floor is understaffed that day, not getting help when we need it, assuming we know how a machine functions under all conditions—the list goes on and on.

In short, overconfidence leads us to skip best practices we know we should follow.

The worst thing about overconfidence is that you get positive reinforcement all the way up until disaster strikes. If you’re doing something in an unsafe way, and you do it 500 times without injury, you start forgetting that it’s unsafe. Then you get to 501.
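
To put some hypothetical numbers on that 501st repetition: even when a risky shortcut almost always “works,” the odds of at least one injury climb relentlessly with repetition. Here is a minimal sketch in Python, assuming a purely illustrative 1-in-1,000 chance of injury per repetition:

```python
# A purely illustrative calculation: the per-repetition injury odds (p)
# are hypothetical, not real incident data.
p = 0.001  # assumed 1-in-1,000 chance of injury on any single repetition

for n in (1, 100, 500, 1000):
    # Chance of at least one injury across n repetitions of the unsafe act
    cumulative = 1 - (1 - p) ** n
    print(f"{n:>5} repetitions: {cumulative:.0%} chance of at least one injury")
```

With those assumed odds, 500 repetitions carry roughly a 39 percent chance of at least one injury, and 1,000 repetitions about 63 percent. Each individual repetition still feels safe, which is exactly why the positive reinforcement is so misleading.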

#2: Ignoring Blind Spots

A worker uses a knife to trim the excess off an extruded plastic part. Her attention is on her knife, on the part she is holding, on the speed of the line. Without looking, she puts her hand down to grab a clamp—unaware that another worker has left an open knife on the table. 

When she cuts herself, she is the victim of a cognitive bias: the assumption that nothing in the environment has changed to endanger her. Never before has a worker left a knife there, so her habituation to the environment has created a blind spot.

Blind spots make us vulnerable and can be caused by any number of factors. For instance, an obvious danger can sometimes create a blind spot for a less obvious one.

Suppose, for example, a factory has a metal-stamping machine which has crushed workers’ hands several times in the past, causing horrible injuries. When working with this machine, everyone watches the stamper like a hawk. Meanwhile, there’s also an automatic arm that moves each piece of metal out of the way for the next to be stamped. It has exposed gears that can pinch and destroy a finger, but no one has ever experienced that injury. This automatic arm may be a blind spot danger—unseen, hidden by the obvious danger.

#3: Confirmation Bias

The confirmation bias, as defined by Scott Plous in The Psychology of Judgment and Decision Making, refers to the natural human tendency to “search for, interpret, favor, and recall information in a way that confirms our pre-existing beliefs or hypotheses.”

In other words, we often see only what we expect to see.

In 7 Insights Into Safety Leadership, safety expert Thomas Krause relates an incident in which a miner with thirty years’ experience died when the roof of a tunnel collapsed. The collapse came right after he and an equally experienced foreman had inspected the tunnel for structural issues and found none. After the incident, an investigation found no fewer than 137 missing bolts, along with obviously compromised roof planks.

How had the foreman and the miner not seen these problems? The answer was simple: they’d done previous inspections of other tunnels in this mine without seeing any problems, so they didn’t expect to see any in this tunnel. With the confirmation bias, not only do we see what we expect to see, but our pre-programmed brains actively ignore anything that contradicts our first assumptions. 

#4: Ignoring the Baseline

All of us tend to think that our own circumstances are somehow unique, and we tend to ignore the typical statistics governing our activities. That includes accident rates.

This cognitive bias has a name: “ignoring the baseline.” In fact, we often go beyond ignoring the baseline and fall into active denial about it.

Even though workers have heard about coworkers developing serious conditions from, say, handling fiberglass insulation without gloves, they continue to do it, assuming, somehow, that it won’t happen to them.

Accidents and injuries can happen to anyone, including you and your workers.

#5: Default Bias

Whenever a choice is presented to us, we tend to choose the default: not just because it’s easier and quicker, but because we assume the default is somehow the safest bet.

That means that when a worker approaches a task or a machine, they tend to look for the default way of doing it, whether or not anyone has explained the other options.

Because of this bias, we must be very thoughtful about what is established as the default. For instance, if you have five different kinds of safety gloves available in the workplace, you need to make it absolutely clear which are the default gloves for a particular kind of work. That might mean a big picture of the default gloves next to a particular machine, or better yet, a rack of those specific gloves placed beside the machine.

#6: Underestimating Cumulative Risk

Humans tend to vastly underestimate cumulative risk—the things that are harming them slowly, over time.

Everyone knows the fable about frogs and boiling water. If you drop a frog into a pot of boiling water, it will jump out safely. But if you put the foolish amphibian in cold water and heat it slowly, it will not recognize the danger in time, and it will die. Unfortunately, this analogy applies again and again to workplace safety.

People handle “just a little bit” of a dangerous chemical every day until they develop skin conditions or neurological issues or cancer.

People use vibrating tools that cause neurological damage over several years, not noticing until it’s too late.

Gloves wear thin and develop holes, but people keep on using them, even though gloves in that condition would never have been accepted if issued on day one.

#7: Recency and Availability Bias

The recency bias, closely related to what psychologists call the “availability bias,” means that we tend to focus on top-of-mind, recent events with lots of readily available information, giving them more importance than they deserve.

The recency and availability bias means that we are always looking at the immediate past for answers, instead of looking forward.

If somebody loses a finger working with a harvester on a farm, everyone is going to be very careful around harvesters for a while. Attention will be paid to new gloves, guardrails, and protocols around harvesters. Meanwhile, workers are deploying pesticides bare-handed, fiddling with open tractor engines while the tractors are running, and operating power take-off flywheels without any training.

The lost finger is a tragedy, but it may be an outlier injury, not a danger faced by many workers on every shift. Indeed, other dangers may stay hidden behind the focus on this one recent event, because no overall hazard assessment is taking place.

Train Yourself to Counteract Your Biases

These cognitive biases are largely unconscious. They happen instantly, without thinking. But that doesn’t mean that we are helpless against them.

We can overcome our biases with conscious thought. Simply by being aware of these common biases, we can begin to counteract them. 

Where safety is involved, we must pause, ask ourselves what biases might be at play, and question whether those biases could be leading to unsafe behavior. We can then take different actions, ensuring a safer work environment.

For more advice on creating a safer work environment, especially as related to hand protection, you can find Rethinking Hand Safety on Amazon.

JOE GENG grew up among the tanneries of Canada helping his father make gloves, and he has spent his entire life studying industrial hand safety, overseeing glove R&D, and consulting with leading companies like Toyota, Honda, SpaceX, General Motors, Bombardier Aircraft, and Shell Oil. He presently acts as vice president at Superior Glove, the Geng family business that is considered one of the world’s most innovative and disruptive glove manufacturers. Superior is a major global supplier to aerospace, automotive, oil & gas, and construction companies, and has been named one of Canada’s best-managed companies seven years in a row by Deloitte. Joe holds degrees from Trinity Western University and attended Reutlingen leather school in Germany.


