Most Human Decisions Are Based on Cognitive Illusions
The human brain is a paradox. While humans can produce highly developed analytical and creative intelligence, they are also prone to making senseless errors. Why is this so? This question sparked the groundbreaking work of the psychologists Daniel Kahneman and Amos Tversky, who studied the psychological dynamics of human decision-making. According to the two psychologists, the answer is that people are nowhere near as rational as they think and are highly susceptible to unconscious biases that influence our decisions to a far greater extent than we realize.
Kahneman and Tversky discovered that people engage in two different thinking modes in their day-to-day lives. They refer to these ways of thinking by the nondescript names System 1 and System 2. System 1 is fast thinking, which operates automatically with little or no effort by using heuristics and templates to navigate the world. System 1 thinking is highly proficient at identifying causal connections between events, sometimes even when there is no empirical basis for the connection. System 2, on the other hand, is slow thinking and involves deliberate attention to understanding details and the complex web of relationships among various components. Whereas System 1 is inherently intuitive, deterministic, and undoubting, System 2 is rational, probabilistic, and highly aware of uncertainty and doubt. Needless to say, these two ways of thinking suit very different contexts.
System 1 and System 2 thinking are distinctly human capabilities that have given humanity an immense evolutionary advantage. We can develop complex intellectual structures such as mathematics, physics, and music via applications of System 2, and, thanks to System 1, humans have the unique capability to make judgments and decisions quickly from limited available information. In employing these two capabilities, Kahneman and Tversky found that, while we may perceive ourselves as predominantly rational System 2 thinkers, the reality is that most human judgments and decisions are based upon the more intuitive System 1, for the simple reason that we don't have the time to do System 2 thinking.
However, while fast thinking is more useful in making immediate choices, it is also more likely to result in judgment errors, even though we tend to feel more confident when engaged in System 1 than when we employ System 2. That’s because the mental narratives that are a natural byproduct of System 1 are likely to result in biases that often cause us to make confident decisions that are completely wrong.
Kahneman and Tversky’s research provided clear evidence that most human judgments and decisions—even those by experts—are based on System 1, which means elite authorities are not immune to the hazards of unconscious biases. The two psychologists discovered that how we formulate a situation heavily influences how we decide between alternative courses of action. And if this formulation is based on conscious or unconscious biases, experts are prone to make senseless errors, as we experienced during the recent Covid pandemic.
Kahneman and Tversky applied the label of “framing effects” to what they described as the unjustified influences of formulation on beliefs and preferences. In a series of experiments, they noticed that people did not choose between things; instead, they chose between descriptions of things. Thus, simply changing the framing—the description of a situation—could cause people to completely flip their attitude on how to respond to the situation.
One of those experiments, which was conducted before the recent pandemic, involved public health experts who were divided into two groups and were presented with the same problem:
Imagine that the United States is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people.
The first group was asked to choose one of two alternative programs that were framed in terms of how many lives would be saved:
If program A is adopted, 200 people will be saved.
If program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved.
The second group was asked to choose between two alternative programs that were framed in terms of how many people would die:
If program A is adopted, 400 people will die.
If program B is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die.
Notice that the two framings were logically equivalent, and the expectations for each program presented were exactly the same. Nevertheless, Kahneman and Tversky found that the overwhelming majority of respondents in the first group chose program A, while the substantial majority in the second group chose program B. If the preferences were completely rational, the public health experts would make the same choice regardless of how the descriptions were framed. However, System 1 thinking is not rational and can be swayed by emotions. Thus, while saving 200 people sounds promising, 400 people dying is shocking.
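The equivalence is easy to verify with a few lines of arithmetic. Here is a minimal sketch (the variable names are my own, not Kahneman and Tversky's) that computes the expected number of survivors out of 600 under each program in each framing:

```python
TOTAL = 600  # people expected to be affected by the disease

# "Lives saved" framing: expected survivors under each program
save_a = 200                        # 200 saved for certain
save_b = (1/3) * 600 + (2/3) * 0    # 1/3 chance all 600 are saved

# "Lives lost" framing, converted to expected survivors
die_a = TOTAL - 400                               # 400 die for certain
die_b = (1/3) * (TOTAL - 0) + (2/3) * (TOTAL - 600)  # 1/3 chance nobody dies

# Every option has the same expected outcome: 200 survivors
assert save_a == save_b == die_a == die_b == 200
```

A purely rational decision-maker would therefore be indifferent between the framings; the experimental subjects were not.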
This framing effect troubled Kahneman, who noted, “It is somewhat worrying that the officials who make decisions that affect everyone’s health can be swayed by such a superficial manipulation—but we must get used to the idea that even important decisions are influenced, if not governed, by System 1.”
The prime contribution of Kahneman and Tversky’s lifelong work is to convincingly demonstrate that, when it comes to human decision-making, System 1 is the default mode. Although we may perceive ourselves as thoughtful, rational decision-makers, the evidence says otherwise. All of us—even experts who profess to be data-driven—are susceptible to forming cognitive illusions if our conclusions are based on limited evidence.
To learn more about improving decision making, see my new book Nobody Is Smarter Than Everybody: Why Self-Managed Teams Make Better Decisions and Deliver Extraordinary Results.