Don’t let your biases guide your decision-making

WHAT BEHAVIOURAL ECONOMICS CAN TEACH US ABOUT PLANNING FOR THE FUTURE

An icon within futures studies, Pierre Wack, once said that the best and most difficult task of a futurist is to make people think of the world in a new way. He used the term ‘reperception’ to describe how people awaken to the possibility of the future being different from the past, or from what they expect it to be. Wack, who headed scenario planning at Shell, believed that facilitating this transformation is the greatest accomplishment, but also the most difficult, for people working with foresight and scenarios. But why is it so difficult for us to think of alternatives to the way things are? Why are we predisposed to think in certain ways? This question occupied the psychologist Daniel Kahneman for years. His writings contain clues not just to why we think the way we do in general, but more specifically to how we think and make decisions about the future.

What makes Kahneman’s work interesting from a futurist’s perspective is his focus on human errors in decision-making that arise from heuristics and biases. He thereby challenges the assumption held by economists for decades: that of the rational human acting on objective self-interest. In so doing, he has provided strong arguments for looking at economics from a psychological angle, an area known as behavioural economics. Let’s take a closer look at some of the biases drawn from Kahneman’s work that specifically relate to the challenge of imagining alternative futures – as well as some of the red flags to look for when these biases affect decision-making.


‘CHANGE IS BAD FOR BUSINESS’ – The status quo bias

Pierre Wack saw the act of reperceiving as crucial to opening the minds of executives and making them understand either the risks of disruption to their business or the possibilities that exist for them in alternative futures. Fundamentally, Wack was talking about how to overcome the so-called ‘status quo bias’. The core problem with this bias is that it does not permit change to be seen as positive. Change will, for a number of reasons, be interpreted as a threat, especially by incumbent businesses that have substantially lowered costs on core processes to increase competitiveness. Businesses in this situation have invested a lot of money in organising their offerings to be efficient, and they are the kings of low cost in what is typically a red ocean market. For many such businesses, no change is preferable to constant change, simply because the status quo (where the incumbent business is on top) is preferred to the available alternatives. When doing strategic foresight, however, you sometimes find yourself in a situation where an executive, from a logical point of view, agrees with all the driving forces behind a specific scenario, yet chooses to ignore the scenario presented to them, close their eyes, and hope for the best. This can be especially perplexing to an external foresight consultant, because it is no longer a question of having the right arguments or the right data. Rather, it becomes a question of feelings. Some people simply choose to ignore the facts because they hope things will turn out differently than the most likely scenario suggests.

This problem is compounded by the fact that big corporations especially often need a sizeable revenue stream to replace their cash cow, and new business very often fails to deliver enough revenue right away to be of interest. Forecasts of future revenue are rarely very reliable, primarily because new products or new technologies create new markets, the size of which is naturally hard to predict. No wonder hoping for the best, even in the face of radical change, sometimes seems like the best approach. At its worst, the status quo bias can lead to what is known as ‘persistence of discredited beliefs’. In a now-famous study undertaken in the 1950s and described in the book When Prophecy Fails, psychologists studied a UFO cult that was convinced the world would end on December 21, 1954. When it did not, many members of the cult still clung to their beliefs, settling on alternative explanations for why the world had not ended. One might not be surprised that this happens in a cult, but the fact of the matter is that something similar often happens in large corporations, behind boardroom walls, and in governments as well. This is why one of the most important tasks of futurists is to look for points where opinions diverge between people within an organisation and experts outside it. When external experts hold radically different views about the state of the world than those inside the organisation, it is often a case of status quo bias, and that should raise a red flag.

 

‘WHEN IN DOUBT, GO WITH WHAT YOU KNOW’ – The confirmation bias

In many cases, the tendency to search for, interpret, and recall information that supports one’s own beliefs actively stands in the way of choosing a better path forward. The ‘confirmation bias’ has been known for years, and rules to mitigate it are integrated into the scientific method and the teaching of good scientific practice. It is, however, very much a part of everyday media and politics, and it affects decision-making in many areas of society and business. As Kahneman points out in his book Thinking, Fast and Slow, confirmation bias tends to be strongest with emotionally charged issues and entrenched beliefs. The current media reality, increasingly defined by online echo chambers, tends to feed our confirmation biases by creating spaces where we can easily have our existing beliefs confirmed by like-minded individuals. The largest study ever done on the spread of falsehoods on Twitter was published in Science in 2018, and the results confirmed that confirmation bias thrives in our fast-paced social media reality. The study, conducted by MIT researchers, tracked how news circulates and found that hoaxes, rumours, and falsehoods consistently dominated the conversation on Twitter. In fact, stories containing false information tended to reach people six times quicker than stories containing factually correct information.

For executives, the confirmation bias most often manifests itself when they choose to listen only to people who share their own opinions. This impulse can be so strong that it ends up being a defining trait of an organisational culture, which can lead to information that contradicts the established truth not being circulated or taken seriously. In other words, a self-imposed censorship can take hold, which means that disruptive business models or technologies just around the corner may be ignored, to the detriment of the organisation. At other times, decision-makers will have put so much effort into committing to a specific strategy that there is a sunk cost connected to switching lanes, and so an executive may do their best to continuously seek out arguments confirming that the chosen strategy is the right one. This can blind one to the possibility that other directions may be more beneficial in the long term.

 

‘THIS IDEA IS SO GOOD IT COULDN’T POSSIBLY FAIL’ – The optimism bias

One of the most commonly observed biases is the ‘optimism bias’. In the CIFS 2017 report “Evaluating the Hype”, we explored how this kind of bias often affects assessments of what the impact of new technologies will be and how fast they will reach maturity. Almost without exception, experts and media commentators alike tend to believe that things move faster than they actually do. For this reason, when assessing the prospects of a technology’s future breakthrough, it may be necessary to add two, five, ten, or even twenty years to that assessment (depending, of course, on the technology).

There are several reasons for this delay that may not immediately come to mind. For example, new technologies are often hemmed in by standardisation issues, regulations impeding uptake, or high prices creating a tough transition from innovators to early adopters. Optimism bias often makes an appearance whenever people try to envision how things may look in the future, both in their personal outlook and when assessing more general developments. Kahneman argues that there are several reasons for this, chief among them that our judgment is affected by the goals or end-states we aim for or desire. That is a fancy way of saying wishful thinking.

Optimism bias often goes hand in hand with confirmation bias. The sense that one’s own business is superior to the competition is what blindsided Mattel, the producer of Barbie dolls: despite having fended off all prior attacks on its core product, Mattel’s management was taken by surprise when Bratz managed to capture a big share of the market.

Optimism bias is also often present when new technology sees the light of day. Some may remember the hydrogen bubble of the early 2000s, during which President George W. Bush said fuel cell cars would be competitive with internal combustion engines by 2010 and would eliminate over 11 million barrels of daily oil demand in the US by 2040. Today, there are fewer than 20,000 heavily subsidised hydrogen fuel cell vehicles on the roads globally, nowhere close to the target. This kind of bias is often found in politics, and it is usually easy to spot: just look for the words “ambitious plan”.

Interestingly enough, research has shown that optimism bias in people in general is closely tied to mental well-being, with individuals suffering from depression showing fewer signs of optimism bias. The same study also made clear that even experts aren’t free from optimism bias: ‘Divorce lawyers underestimate the negative consequences of divorce, financial analysts expect improbably high profits, and medical doctors over-estimate the effectiveness of their treatment’, the researchers write.

 

AWARENESS IS THE FIRST STEP

Establishing which biases are at play when we envision the future is of vital importance for how we plan for it. There are many more biases than the ones discussed here, and the work of identifying those specific to the field of futures studies and foresight is ongoing. The fundamental problem is that if we do not know what guides our decisions, we are not well equipped to make the right choices. This is especially true because, more than ever, the problems we face in the future, be it climate change, loss of biodiversity, or pandemics, are shaped by the decisions we make today. For some of these problems, we don’t have the luxury of making the wrong decision. When it comes to climate change, time is running out. A big part of the explanation of how we even got to this point is that we lack the imagination to see the future clearly, because we have little or no past reference points to draw on.

To get this point across, let us recall 9/11, a wild card event that permanently changed the global geopolitical landscape. Despite the shock of 9/11, it is not that it could not have been seen coming. Al-Qaeda’s plans were known in advance by US intelligence, having been disclosed during the interrogation of captured members of the terror network, yet the information was never acted on. Why? One explanation, the one put forward in the 9/11 Commission report, has to do with the ‘availability heuristic’: we judge how likely an event is by how easily similar instances can be retrieved from memory. In the case of the terrorists’ plans, there weren’t many historical instances of giant skyscrapers being hit by airplanes to draw on. The fact that such examples did not exist in the minds of the individuals in possession of the relevant intelligence was taken as evidence that it would not happen. As the report concluded, it was fundamentally ‘a failure of imagination’.

The availability heuristic, along with our active biases, is hugely important whenever we try to assess the likelihood of wild cards or black swan events. The Fukushima nuclear reactor disaster and the depth of the 2008 US housing market crash that led to the financial crisis are other examples of how wrong things can go if we are not mindful of this.

Needless to say, failures to foresee disasters or radical change have, in retrospect, often proven to be cases of bias rather than of a genuine inability to prepare for alternative futures. For governments and businesses to make better decisions, we need to understand what drives their decision-making in the first place. Equipped with this knowledge, one of the main tasks of futurists – facilitating reperception, as Wack pointed out – should become easier.
