Safety – It’s all in the mind…
David Whiting
HSE Culture Specialist: Helping Businesses Identify, Connect & Engage with Safety Leadership and Culture
Sometimes silly questions challenge conventional ideas and prompt lateral thinking.
Silly questions use the power of imagination to great effect and so can help managers, and particularly risk managers, think the unthinkable.
There is a need to focus more on behaviour
A behavioural approach to safety typically involves:
- Pinpointing desired behaviour, through means such as management leading by example, and pinpointing undesired behaviour, by means of mandatory procedures and regulations.
- Measuring progress using predetermined criteria that record behaviour, detect gains and identify room for improvement.
- Feedback, with everyone acting responsibly towards co-workers, intervening to stop unsafe actions and discussing the potential risks.
- Heightening awareness of consequences, to avoid negative ones and encourage positive ones.
No matter how airtight safety management is, there is always one unpredictable component: human behaviour. In real life, around 90% of accidents are triggered by unsafe behaviour.
Whilst our brains are capable of such leaps of imagination, they can also work against us at times.
Think of children’s playgrounds for a moment: we all know they need to be dangerous enough to be challenging, yet in the pursuit of total safety they are often made ‘too safe’, and end up unused and unloved.
In such circumstances, there’s a real danger that children grow up without any feel for risk and so expose themselves to even greater threats in adulthood. People, including children (who are people too, after all, despite appearances at times), don’t seek to minimise risk.
They seek to optimise it. They drive and swim and fight and love and play so that they can achieve what they desire, while pushing themselves a little at the same time, so that they continue to develop.
Thus, if things are made too safe, people (including children) start to figure out ways to make them dangerous again. We all prefer to live ‘on the edge’. There, we can be confident in our experience and also confront whatever unknowns we might face in the future.
We’re hardwired to enjoy risk: overprotected, we will fail when something dangerous, unexpected and full of opportunity appears, as it inevitably will.
So what about adults? We are all wary of sharks – threatening-looking creatures with big teeth – and disaster movies tell us they’re scary. But did you know that coconuts kill more people every year than sharks – 150 versus 5 – thanks to their habit of falling from tall trees onto unsuspecting passers-by? That misplaced fear is an example of a heuristic – a mental shortcut that helps us make decisions and judgments quickly. In this case, it happens because our brains are hard-wired to help us avoid dangerous situations.
The trouble is, this hardwiring is deeply inherited and dates back to our ancestors in the jungles. It leads to the so-called ‘fight or flight’ response.
Daniel Kahneman, drawing on his work with Amos Tversky, describes this in his book ‘Thinking, Fast and Slow’ (Penguin Books, 2012) as System 1 thinking – it’s our sub-conscious or so-called ‘crocodile’ brain at work.
It’s not, however, the full picture. He goes on to describe System 2 thinking as well – this is what happens when we stop and apply logic to a situation.
The trouble is, System 1 is about 200 times more powerful than System 2, so all of us easily lapse into System 1 thinking, even when we know it’s wrong.
Ever driven somewhere and arrived without being conscious of the journey? That’s System 1 at work, and this aspect of how our brains work can fool us all into making incorrect decisions at work – and that affects risk.
This effect is worse when we’re tired, stressed or under pressure. As risk managers, we can help our organisations make better decisions by deliberately taking a step back when making key decisions. Never make a key decision right at the end of the working week, for instance – come next week, you’ll regret it.
Working in teams raises other heuristic issues that can impact on our response to risk.
Whilst not an exhaustive list, these include:
- Affinity (‘like me’) bias – we tend to trust people like ourselves more than we trust others. Think of your closest colleagues, then discount all of those of the same gender, then the same race and finally those with the same work or social background. Anyone left? If not, seriously consider taking a more diverse approach to future hires.
- Stereotype bias – we often see this at play when we recruit someone because we’d had a positive experience with someone quite similar previously – or, of course, the opposite. We’re then surprised when they end up performing totally differently.
- Anchoring – this is where we rely too much on the first piece of information obtained when making a decision. An expectation of a high level of injuries at work, for instance, can make us feel that a reduction is good, whereas what’s really needed is the elimination of all injuries.
- Cognitive dissonance – where our actions don’t follow what we believe: smoking even though we know it’s bad for us, for instance. At work, the Challenger space shuttle disaster is a classic example – engineers had warned that the craft’s seals could fail in the cold conditions on launch day, yet the launch still went ahead.
- Groupthink – where we change our view or decision to ‘fit in’ with the crowd. This is often at the root of many an industrial accident or company failure. For instance, the failure of the western world’s banking system ten years ago could be explained by groupthink in the banking sector, coupled with affinity bias.
- Availability bias – where we overestimate the importance of the data we have, over and above the data we still need to source.
- Confirmation bias – where we decide an answer then go on to prove it’s the right one, discounting all other options.
- Gambler’s fallacy – believing that future probabilities are altered by past events, when in fact they are unchanged (the short sketch after this list makes the point concrete).
- Ostrich effect – avoiding negative information by pretending it doesn’t exist.
- Risk compensation – taking bigger risks when perceived safety increases, and being more careful when perceived risk increases. For instance, studies suggest that we cycle faster when wearing a helmet.
- Authority bias – giving too much credence to the wearing of a uniform for instance. This is why airline pilots wear uniforms – so that passengers follow their instructions in an emergency.
- Status quo bias – ‘if it ain’t broke, don’t fix it’ – preferring the current state of affairs, or business model, to change. Many a company – think Kodak film or Blockbuster Video – has failed as a result.
- Courtesy bias – giving an opinion or reaching a conclusion that is viewed as more socially acceptable, so as to avoid causing offence or creating controversy.
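To make the gambler’s fallacy concrete, here is a minimal, purely illustrative sketch (in Python, chosen simply for the demonstration) that simulates a fair coin and compares the chance of heads overall with the chance of heads immediately after a run of three heads. If past flips really altered future probabilities, the two figures would differ.

```python
# Illustrative sketch of the gambler's fallacy with a simulated fair coin.
# Assumption: a fair, independent coin (probability of heads = 0.5 per flip).
import random

random.seed(42)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True means heads

# Collect the outcome of every flip that immediately follows three heads in a row.
after_three_heads = [
    flips[i]
    for i in range(3, len(flips))
    if flips[i - 3] and flips[i - 2] and flips[i - 1]
]

print(f"P(heads) overall:           {sum(flips) / len(flips):.3f}")
print(f"P(heads) after three heads: {sum(after_three_heads) / len(after_three_heads):.3f}")
```

Both figures come out at roughly 0.500: the streak tells us nothing about the next flip, which is exactly why decisions based on a run of ‘good luck’ are misplaced.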
Some simple techniques to counter these biases include:
- Framing decisions by deliberately setting out their context.
- Not rushing to solve a problem.
- Allowing the most junior person to give their view or solution first.
- Seeking opposing and contradictory evidence.
- Gaining expert opinion.
- Nudging, such as how supermarkets get us to choose healthy snacks by placing them by the checkout.
- Considering alternatives, using ‘what if’ analysis.
- And building true diversity into your teams – not just of defined characteristics such as gender and race, but of social and business background too.
If you implement one or more of these ideas, I would love to hear how it worked – please get in touch in the comments below. Thank you.