So much can go wrong. Here’s how to make it go right.

If you read this post, you will see 37 of the hundreds of mental traps, errors, shortcuts, and biases that I catalog in my upcoming book, BIG DECISIONS: Why we make them poorly. How we can make them better.

And the eight short cases selected from my book manuscript may leave your head spinning: they show that even the smartest people and best-known organizations can fall prey to these traps and biases and suffer great damage from them.

Before we get into the traps, biases, and related cases, I need to address the question: why this list? That's simple - these are the serious traps and biases that you can most likely avoid, mitigate, or overcome by getting into a coached group of business leaders, owners, entrepreneurs, and professionals.

Being part of a coached group offers you more minds to tap, more perspectives to consider, more ideas about what to do and what not to do, exposure to new information and best practices, and more encouragement, all curated by your experienced and qualified business coach.

Simply put, in a coached group, the coach and the members can help you navigate, avoid, or mitigate the effects of these eight categories of mental traps and errors that lead to bad decision making, inaction, or inappropriate action.

1. Effects based on our lack of information and how we are led to misinterpret information

  • Availability heuristic, in which we base judgments on available information, even if it is not representative. We are programmed to immediately "fit" the information we have at hand to our experience - magnified by what is most current and what seems similar or relevant to the situation - whether or not it is applicable and sufficient. We often assess the probability of an event by the ease with which related instances come to mind, rather than by carefully examining the situation and the alternatives.
  • All humans are prone to the fundamental cognitive error, in which we underestimate the contribution of our beliefs and theories to observation and judgment. We often don't recognize that we have made an interpretation, or that the information could have been interpreted in other ways. The problem is that our past experiences color how we interpret evidence and new experiences.
  • Base rate neglect, which is ignoring the base rate in analysis and decision making. When we assess the probability of an event or a cause, we often make the error of ignoring predictive background information in favor of only the information at hand in the case under consideration. What's at hand may not be representative. (A short worked example follows this list.)
  • Expectation bias, in which our expectations influence our perceptions, so that we are prone to see what we expect to see.
  • Illusion of validity, which is when we “see” a pattern or a story in data that leads us to overestimate our ability to interpret the data and accurately predict outcomes as a result.
  • Confirmation bias, in which we tend to subconsciously discount, dismiss, or ignore evidence that threatens our favored beliefs while overweighting evidence that supports them, instead of seeking and coolly evaluating impartial evidence.
  • The narrative fallacy, which arises from our vulnerability to overinterpretation and our preference for simple stories over raw facts. We think in terms of stories and relationships between facts – never mind whether the story is accurate and whether the facts are linked as the story suggests. Often the result is a distorted mental representation of the world; we think we understand when we don’t really understand.
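
To see how much damage base rate neglect can do, here is a minimal worked sketch (the numbers are hypothetical, not from the book): even a "90% accurate" test for a rare condition produces mostly false alarms, because the 1% base rate dominates.

```python
# Base rate neglect, made concrete with Bayes' rule.
# Hypothetical numbers: a screening test for a rare condition.

base_rate = 0.01        # P(condition): 1% of the population
sensitivity = 0.90      # P(positive | condition)
false_positive = 0.10   # P(positive | no condition)

# Law of total probability: P(positive)
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)

# Bayes' rule: P(condition | positive)
p_condition = sensitivity * base_rate / p_positive

print(f"P(condition | positive test) = {p_condition:.1%}")  # ~8.3%
# Most positives are false alarms. Judging only from the "90% accurate"
# figure at hand, while ignoring the 1% base rate, wildly overstates
# what a positive result means.
```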


“A decade ago, the Fukushima Daiichi Nuclear Power Plant in Japan was damaged by a massive earthquake and overrun by a huge tsunami, leading to a meltdown, radiation releases, an evacuation, and lasting damage. It turned out that engineers and regulators had relied on insufficient and biased data about past earthquakes and tsunamis to predict the nature and likelihood of future ones at the Fukushima site. Why? One explanation is that they made the Fundamental Cognitive Error and did not see how limited the information they had in hand was.”

BIG DECISIONS: 40 disastrous decisions and thousands of research studies tell us how to make a great decision when it really matters


2. Effects based on how we address risk

  • We often display an inability to properly assess and respond to risk; probability neglect or risk blindness can be at work. This can lead us to overstate the risks of less harmful activities, and therefore make us less likely to pursue them. It also leads us to underrate the risks of more dangerous activities, and therefore to get into trouble.
  • Furthermore, according to prospect theory, we can be both risk averse and risk seeking. Fear of disappointment can lead people to prefer a sure thing for a lesser gain over the high probability of a greater gain. Likewise, fear of a large loss can lead people to accept a sure smaller loss rather than a low probability of a greater loss. The hope of a large gain can lead people to prefer a small chance of a large gain over a sure smaller gain. And the hope of escaping a loss entirely can lead people to prefer a high probability of a greater loss over the sure thing of a smaller loss. (A worked sketch of this fourfold pattern follows this list.)
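
Here is a minimal sketch of that fourfold pattern in expected-value terms (the stakes are hypothetical, and the "typical choice" labels restate the pattern described above rather than being computed):

```python
# Prospect theory's fourfold pattern of risk attitudes.
# Hypothetical stakes; the expected values show that the typical choices
# are not explained by expected value alone.

gambles = [
    # (situation, sure_thing, gamble_outcome, probability, typical_choice)
    ("High-probability gain", +900, +1000, 0.95, "take the sure +900 (risk averse)"),
    ("High-probability loss", -900, -1000, 0.95, "take the gamble (risk seeking)"),
    ("Low-probability gain",   +50, +1000, 0.05, "take the gamble (risk seeking)"),
    ("Low-probability loss",   -50, -1000, 0.05, "take the sure -50 (risk averse)"),
]

for situation, sure, outcome, p, choice in gambles:
    ev = outcome * p  # expected value of the risky option
    print(f"{situation}: sure {sure:+d} vs gamble EV {ev:+.0f} -> {choice}")
```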


“Self-driving cars have the promise of eliminating an estimated 94% of accidents caused by human error. A 2017 RAND study assessed 500 different what-if scenarios for autonomous driving technology and found that in most of the scenarios, waiting for near-perfect self-driving cars would cost tens of thousands of lives.

Yet surveys show that, in the minds of most people, self-driving cars are not yet safe enough to ride in fully autonomous mode. In a J.D. Power/National Association of Mutual Insurance Companies study, more than 40% of Americans said they would never ride in an autonomous vehicle. In an AAA study, 71% of respondents said they would be afraid to ride in autonomous vehicles, and only 19% would be comfortable with their children and family members riding in such vehicles. Also, in a 2019 Reuters/Ipsos survey, more than half of respondents thought that autonomous vehicles were more dangerous than human-driven vehicles, and two thirds said self-driving cars would have to demonstrate a higher standard of safety than human drivers before they would ride in them.

This is evidence of our inability to properly assess and respond to risk. In this case, it is likely that Probability Neglect or Risk Blindness is at work. We overstate the risks of the less harmful activity - riding in a self-driving car that is not totally safe but is safer than a human-driven car - and therefore are less likely to pursue the safer activity. And we underrate the risk of the more dangerous activity - driving ourselves around - and therefore are more likely to put ourselves in danger.”

BIG DECISIONS: 40 disastrous decisions and thousands of research studies tell us how to make a great decision when it really matters

3. Effects based on how we choose

  • Hyperbolic discounting (also called current moment bias or present bias), which leads us to have a stronger preference for immediate rather than later payoffs. We make choices today that our future selves would prefer we had not made. We find it difficult to see ourselves in the future and to alter our current behaviors and expectations to secure a better future. We often opt for current pleasure and leave the pain for later. Because of this, our organizations tend to opt for short-term gain rather than long-term sustainability. (A minimal sketch follows this list.)
  • Anchoring, our tendency to compare and contrast only a limited set of items, to fixate on a value or number that in turn gets compared to everything else. Anchoring produces a skewed perspective that can lead to bad choices.
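
A minimal sketch of how hyperbolic discounting works, assuming the standard one-parameter form V = A / (1 + kD) with a hypothetical discount rate k. The hallmark is preference reversal: viewed from a distance we prefer the larger, later payoff, but as the smaller reward becomes immediate we flip and grab it.

```python
# Hyperbolic discounting: V = A / (1 + k * D)
# A = reward amount, D = delay in days, k = discount rate (hypothetical).

def present_value(amount: float, delay: float, k: float = 0.5) -> float:
    """Perceived value today of `amount` received after `delay` days."""
    return amount / (1 + k * delay)

# Choosing today between $50 now and $100 in 3 days:
print(present_value(50, 0), present_value(100, 3))    # 50.0 vs 40.0 -> grab the $50

# The same pair viewed 10 days in advance ($50 on day 10, $100 on day 13):
print(present_value(50, 10), present_value(100, 13))  # ~8.3 vs ~13.3 -> wait for $100
# The preference reverses as the small reward gets close: that is the trap.
```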


“Yahoo! CEO and former Google executive Marissa Mayer believed that she could turn around Yahoo! when many others before her could not. Yet, over her five-year tenure starting in 2012, Yahoo! had large losses in advertising revenue and a 50% reduction in staff. In a fire sale in 2016, Verizon agreed to acquire Yahoo! for $4.8 billion and then, after two massive breaches of the company’s customer data, cut the price it paid to $4.48 billion.

The Los Angeles Times reported that in 2015, when Yahoo! had a stock market capitalization of $34.7 billion, it valued its investment stake in China’s e-commerce giant Alibaba at $29.4 billion and its 35.5% stake in Yahoo! Japan at $8.7 billion. The newspaper story concluded, “In other words, the stock market valued everything other than those two holdings - that is, everything subject to Mayer’s management - at negative $3.4 billion.”

TheStreet website wrote at the time of the sale, “For all of the goals for remaking Yahoo! that Mayer and CFO Ken Goldman brought to the early internet company, they ultimately became simple stewards of capital, focused on returning cash to shareholders, monetizing the Asian equity stakes and ultimately setting Yahoo! up for the sale to Verizon.” Forbes said when the sale was announced, “Yahoo squandered its massive head start and let each wave of new technology in search, social, and mobile pass it by.”

When Mayer announced the company’s sale to employees, she wrote that she “put lipstick on a pig,” explaining that the agreement to be acquired by Verizon for $4.8 billion was evidence of “the immense amount of value we’ve created” and stating, “We set out to transform the company - and we’ve made incredible progress.” Then she walked away with a $23 million golden parachute.

A way to view Mayer’s obviously skewed perspective is that she was subject to Anchoring, thinking that she had saved the company from absolute failure and seeing the sale amount as a victory. True, the sale did extract some value, but it was hardly the product of a successful turnaround, and the value was a far cry from the company’s one-time market cap of over $100 billion - a valuation achieved without the Alibaba investment.”

BIG DECISIONS: 40 disastrous decisions and thousands of research studies tell us how to make a great decision when it really matters


4. Effects based on our desire not to deal with bad news and loss

  • Bad news avoidance, in which we appear to treat bad news as something to be avoided. We don’t want to know.
  • The Ostrich effect, in which we ignore an obvious (negative) situation. We don’t want to have to act.
  • Loss aversion or loss avoidance, which can appear when we directly compare or weigh options against each other and losses loom larger than gains. Evolution appears to have led us to place more urgency on avoiding threats than on maximizing opportunities; the prospect of losses has become a more powerful motivator of our behavior than the promise of gains.
  • Sunk cost fallacy, also called the investment trap or escalation of commitment, our tendency to persist in pursuing a goal because of already committed expenditure and investment, including effort and attention, even when the prognosis for success is poor. It is a case of “self-justification”: we persist in a failing course of action because we are justifying our previous decisions. As a result, we misallocate and waste resources that would be better invested in more promising opportunities. The greater the size of the sunk investment, the more people tend to invest further, even when the return on the added investment appears not to be worthwhile. This can be seen as “throwing good money after bad,” because the resources and effort already spent are lost no matter what you do now. (A minimal sketch follows this list.)
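
A minimal sketch of the correct, forward-looking comparison (all figures hypothetical): the sunk amount appears identically in every branch, so it cannot change which option is better, yet it is exactly what the fallacy smuggles into the ledger.

```python
# Sunk cost fallacy: only future costs and payoffs should drive the choice.
# All figures are hypothetical.

sunk = 80_000_000           # already spent -- the same no matter what we do
finish_cost = 30_000_000    # additional cost to finish the project
finish_payoff = 20_000_000  # expected payoff if we finish
alternative = 5_000_000     # expected payoff of redeploying the budget

finish_net = finish_payoff - finish_cost  # -10,000,000
walk_away_net = alternative               # +5,000,000

# The rational comparison ignores `sunk`: walking away is $15M better.
print("finish:", finish_net, " walk away:", walk_away_net)
# The fallacy is reasoning "we can't waste the $80M already spent" --
# but that $80M is lost in both branches and cancels out of the comparison.
```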


“Why would two countries waste huge sums of money over four decades to develop an airplane that no one would buy except their own national airlines, which were heavily subsidized to do so?

Consider the case of the Concorde, the first supersonic (SST) commercial airliner. The Concorde was built in a 42-year program by a consortium of British and French companies backed by their governments.

Concorde development began in 1962, based on a treaty between the two countries. The plane began commercial service in 1976 and flew for 27 years. The cost to develop the Concorde was £1.134 billion, funded by the UK and French governments. The cost to build the 16 Concordes produced for commercial service was £654 million, of which only £278 million was recovered through sales. This shortfall was also funded by the two governments.

Consultant Peter Saxton, a former RAF pilot and British Airways captain, chief pilot, and senior manager, says the Concorde was "a project which cost the British and French tax-payers a staggering amount for development and construction, was not well managed if massive cost overruns are anything to go by, never made anything close to a financial return for its investors (us), and led the British aircraft industry into a cul-de-sac." He calls it "a stupendous example of a project that was kept alive for a whole raft of reasons, none of which seems to have included the serious intention of making a commercial return for investors. Those reasons...included maintaining technological expertise, providing employment, securing Britain’s entry into the European Common Market, and patriotism or prestige." Saxton surmises that the governments kept "throwing more good money after bad" because they "seemed prepared to pay the prestige premium no matter how high it rose."

It was this "escalation of commitment" that gave the Sunk Cost Fallacy a new name: the Concorde effect.”

BIG DECISIONS: 40 disastrous decisions and thousands of research studies tell us how to make a great decision when it really matters


5. Effects based on our difficulty in waiting for payoffs

  • Impulsivity, which is choosing an immediately gratifying option at the cost of long-term happiness.
  • Action-oriented bias, in which we are driven to act without considering all the potential ramifications of our actions. This leads us to overestimate the odds of positive outcomes and underestimate the odds of negative ones. It arises because we may put too much faith in our ability to produce desired outcomes, a faith rooted in our tendency to take too much credit for our past successes.


“In this case, we see that even a leader of a staid professional firm can be trapped by very human responses in exceptional moments.

PricewaterhouseCoopers' partner and accountant Brian Cullinan believed that he and his associate were well prepared to properly dole out the envelopes with winners' names to presenters at the 2017 Oscars ceremony.

Yet, Cullinan made the biggest Oscar award mistake ever by handing presenter Warren Beatty the wrong envelope. Beatty's on-stage partner, Faye Dunaway, announced “La La Land” was the winner when “Moonlight” was actually the best picture winner.

The accounting firm partner was likely entrapped by many mental errors and biases. One of the obvious factors was his Impulsivity. Right before the winning picture announcement, rather than paying attention to his duties, he tweeted a backstage photo of Emma Stone holding her Best Actress Oscar, thereby choosing the immediately gratifying option at the cost of long-term happiness. The self-destructive nature of this action is amplified by the allegation that Cullinan had been told by the Motion Picture Academy not to use social media during the ceremony.

Working for a big-name firm and having a big title does not protect one from biases and traps. Indeed, it can lead to rash action.”

BIG DECISIONS: 40 disastrous decisions and thousands of research studies tell us how to make a great decision when it really matters


6. Effects based on our egos

  • The Dunning-Kruger effect, in which incompetent people fail to realize they are incompetent because they lack the skill to distinguish between competence and incompetence.
  • Egocentric bias, our tendency to rely too heavily on our own perspective and to have a higher opinion of ourselves than is merited.
  • Illusory superiority or superiority bias, when we think too much of ourselves, without good reason, relative to others.
  • Inability to self-assess, in that people generally believe that their performance in domains where they are not experts sits at about the 70th percentile - above average. This is in spite of the fact that, by definition, half of any population will be below the median. A psychologist has asked about the inability to self-assess, "How can I know what I don't know when I don't know what I don't know?"
  • Self-serving bias, in which we tend to view ambiguous information in the way that most benefits us. A payoff can lead us to think we are right and compromise our values and our group’s values.
  • Choice blindness, in which people do not notice big differences between what they intended to do and what they really did, and then give after-the-fact reasons as a defense for their misaligned choice.
  • The planning fallacy, in which we tend to underestimate task-completion times because of our natural optimism, belief in our capabilities, and inexperience. It is easier for us to envision the success of a plan or project than to forecast all the things that can go wrong and the time and effort required to deal with them. (A minimal sketch of one countermeasure follows this list.)
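
One common countermeasure is to take an "outside view" in the spirit of reference-class forecasting: scale the optimistic inside estimate by the overrun ratios of comparable past projects. A minimal sketch with hypothetical numbers:

```python
# Planning fallacy countermeasure: adjust the "inside view" estimate by the
# historical overrun ratio of similar past projects (the "outside view").
# All numbers are hypothetical.

inside_estimate_weeks = 12.0

# Ratios of actual to estimated duration from comparable past projects:
past_overruns = [1.4, 1.8, 1.2, 2.1, 1.5]
typical_overrun = sorted(past_overruns)[len(past_overruns) // 2]  # median: 1.5

outside_view_weeks = inside_estimate_weeks * typical_overrun
print(f"Plan for about {outside_view_weeks:.0f} weeks, not {inside_estimate_weeks:.0f}.")
# -> Plan for about 18 weeks, not 12.
```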


"Lehman Brothers CEO Richard Fuld believed that the investment bank was adequately capitalized when it increased its leverage from 12-1 to 40-to-1, became a major player in securitizing subprime mortgages, relied on risky credit default swaps for protection and engaged in accounting maneuvers that disguised how much debt the firm had taken on. Also, he believed that the U.S. government would bail out the firm when the policies and actions he had enabled put the firm on the brink of failure in 2008.

Contrary to Fuld's belief, Lehman Brothers was woefully undercapitalized as the financial crisis arose. The federal government walked away from an implicit "too big to fail" tag that Fuld and others thought would be applied to the firm. Lehman Brothers failed and Fuld was disgraced.

Fuld's assuredness that his way was the road to great success for Lehman Brothers and that the U.S. government would backstop the firm suggests his incompetence, that he was captured by the Dunning–Kruger Effect (incompetent people often overestimate their abilities, competencies and characteristics and consider themselves more competent than others).

The lure of huge rewards can warp our sense of risk. We tend to overstretch to attain them. We don't see that our judgment is altered and therefore our decisions are more likely to go bad because they entail taking excessive risk."


BIG DECISIONS: 40 disastrous decisions and thousands of research studies tell us how to make a great decision when it really matters
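
As an arithmetic aside on the Lehman case (a hypothetical balance sheet, not figures from the book), the jump from 12-to-1 to 40-to-1 leverage dramatically shrinks the loss a firm can survive:

```python
# Why 40-to-1 leverage is so fragile. Leverage = assets / equity,
# and equity absorbs any fall in asset values. Hypothetical figures.

for leverage in (12, 40):
    assets = 100.0
    equity = assets / leverage
    wipeout = equity / assets  # asset decline that erases all equity
    print(f"{leverage}-to-1: a {wipeout:.1%} fall in asset values wipes out equity")
# 12-to-1 survives any drop under ~8.3%; at 40-to-1, a 2.5% dip is fatal.
```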


7. Effects based on not addressing choice properly

  • The false dilemma fallacy, in which a choice is presented as being between two options when in fact one or more additional options exist. This establishes a false construct.
  • The exclusive alternatives trap, in which too few options are presented. We use traditional logic to operate on a limited set of options. Yet many situations should lead us to juggle multiple alternatives.
  • The paradox of choice, also called choice overload or decision paralysis. Having choices can be powerful and positive, but having too many options can overload and paralyze us. People become worried that they will regret the choice they make. Limiting options actually helps us avoid choice overload.


"Mountain climbing guide Rob Hall believed that his expedition plan would get the Mount Everest climbers in his charge onto and down from the summit. He believed his hard rule that the team would turn around at 2 p.m. if they were not yet on top of the mountain would protect the climbers from disaster. He believed that allowing his climbers to express any dissenting views while the expedition made the final push would hurt their chances of success.

Nearly all the climbers on the summit push that day, including Hall and his team, kept climbing and arrived at the top after two o’clock. As a result, many climbers found themselves descending in darkness, well past midnight, as a ferocious blizzard enveloped the mountain. Five people died in this highly publicized 1996 disaster and many others barely escaped with their lives.

What might have led renowned guide Hall astray, beyond the judgment-skewing effects of high stress, and oxygen and sleep deprivation?

Evidence suggests he saw the climb as a "now or never" opportunity for his charges. However, other climbers, including filmmaker David Breashears and his team, who were on the mountain at the same time, got to the top and down in the following days. This "all or nothing" thinking suggests that Hall was trapped by a False Dilemma (a choice presented as being between two options when in fact one or more additional options exist)."

BIG DECISIONS: 40 disastrous decisions and thousands of research studies tell us how to make a great decision when it really matters


8. Effects based on our tendency to avoid change

  • The default option, the option that will result if one does nothing, is often what people wind up opting for, whether or not it is good for them.
  • Negativity bias or status quo bias (also called system justification), in which we tend to normalize our current situation as our reference point and to defend and reinforce the status quo. We tend to view deviations - even changes from the status quo that would be in our own and our group’s self-interest - as riskier, less desirable, or simply too much effort.
  • Disconfirmation bias, also called motivated skepticism, which occurs when a person is more likely to accept information that supports previously held beliefs and more likely to dismiss information that refutes previously held beliefs. This bias can emerge when people subject disagreeable evidence to more scrutiny than agreeable evidence.


"The engineer who invented the digital camera worked for the giant of photography, Kodak. Kodak owned the patent for what the engineer invented. Yet, Kodak proceeded to bury the technology rather than commercialize it. Had it instead adopted the technology on a timely basis, today it could be the Apple of digital imaging.

The attitude at Kodak was that no one needed digital photos. Film and photos printed on paper, using silver halide technology, had ruled for 100 years and Kodak ruled film and paper photography.

How wrong that conclusion was! Evidence was ignored as digital photography caught hold. Biases reigned at Kodak, likely including the Default Option. Simply put, moving away from film and paper would have been the difficult course of action. Staying in its decades-old lane was easier, and that is what happened: Kodak ignored digital photography for far too long."


BIG DECISIONS: 40 disastrous decisions and thousands of research studies tell us how to make a great decision when it really matters


These seven group effects help show why a coach/facilitator is essential for an effective group. The coach/facilitator can help the group avoid:

  • The availability cascade, which is the self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in discourse.
  • The bandwagon effect, in which group members go with the flow and pick a winner or a favorite. It causes behaviors, social norms, and memes to propagate regardless of the evidence or motives supporting them. This bias has to do with our innate desire to fit in and conform.
  • The false consensus effect, which is seeing more consensus in those around us for our beliefs than is actually the case, leading us to not consider judgments different from ours.
  • Groupthink, in which we substitute the pride of membership in a group for valid reasons to support the group's policy. We may think, “If that's what our group thinks, then that's good enough for me. It's what I think, too.”
  • The group polarization effect, in which the initial biases of group members lead the group to shift either toward risk or toward caution. The group polarization effect is driven by who states the argument and the sequencing and number of arguments pro or con offered in discussion.
  • Illusion of explanatory depth, which is when group members generally think they understand one another even when no one really has a clear understanding of what is being considered. That arises because in areas in which we do not have expertise we rely on social consensus.
  • Shared information bias, which is the tendency for group members to spend more time and energy discussing information about which all members are already familiar and less time and energy discussing information of which only some members are aware.

The message: being part of a support and accountability group led by a qualified and experienced business coach/facilitator can help you navigate away from the mental traps and biases that can otherwise lead you to bad decision making, inaction, or inappropriate action.
