Thinking, Fast and Slow

"Your mind is amazing. In a fraction of a second, it can identify a dangerous situation and how to get out of it. Or it can create a false narrative that puts you in even more danger. Psychologist and Nobel Prize winner Daniel Kahneman has dedicated his life to studying how the human brain works, and his discoveries are incredible."

How we make decisions, how we judge people and situations, and how our biases shape every decision and judgment has always interested me. Reading this book had a profound impact on my worldview and gave me a new perspective on these behaviors and judgments. It is rigorously scientific, backed by experiments and data, yet simple to understand. I highly recommend it to anyone who wants to dig deep into how we make decisions and how vulnerable we are when making them.

Here is a summary of the most important concepts explained in the book:

TWO SYSTEMS

Your behavior is determined by two systems in your mind – one conscious and the other automatic.

  • System 1 (Automatic system) operates automatically and quickly, with little or no effort and no sense of voluntary control.
  • System 2 (Effortful system) allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine—usually. When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment.

The division of labor between System 1 and System 2 is highly efficient: it minimizes effort and optimizes performance.

Ego Depletion

Controlling thoughts and behaviors is one of the tasks that System 2 performs and you have a budget for it. Findings demonstrate that an effort of will or self-control is tiring; if you have had to force yourself to do something, you will be less willing or less able to exert self-control when the next challenge comes around. This is the reason why you will be more likely to select a chocolate cake over a fruit salad after a cognitively loaded task.

Priming Effect

Think of the following two words: "bananas"; "vomit"

Our minds are wonderful associative machines, allowing us to link words like these two and construct a causal story automatically and unconsciously. Our actions and emotions can be primed by events of which we are not even aware. Because of this, we are susceptible to priming, in which a common association is invoked to move us toward a particular direction or action. This is the basis for “nudges” and for advertising that uses positive imagery.

E.g., reminding people of old age makes them walk more slowly; reminding them of money makes them more independent and selfish.

“The world makes much less sense than you think. The coherence comes mostly from the way your mind works.”


Cognitive Ease

When you are in a state of cognitive ease, you are probably in a good mood: you like what you see, believe what you hear, trust your intuitions, and feel that the current situation is comfortably familiar. You are also likely to be relatively casual and superficial in your thinking. It turns out that mere repetition of a falsehood can lead people to accept it, because the statement becomes familiar and cognitively easy to process. Authoritarian institutions and marketers have always known this fact.

“We must be inclined to believe it because it has been repeated so often, but let’s think it through again.”

When you feel strained, System 2 is mobilized: you are more likely to be vigilant and suspicious, invest more effort in what you are doing, feel less comfortable, and make fewer errors, but you are also less intuitive and less creative than usual. Findings show that students perform better on problems printed in a small or less legible font: the cognitive strain it induces makes them less prone to logical errors.

“Let’s not dismiss their business plan just because the font makes it hard to read.”


The Halo Effect

We tend to like (or dislike) everything about a person, overweighting first impressions. The sequence in which we observe a person's characteristics is often determined by chance, and the subsequent information is mostly wasted.

“She knows nothing about this person’s management skills. All she is going by is the halo effect from a good presentation.”


“What You See Is All There Is” (WYSIATI)

System 1 is radically insensitive to both the quality and the quantity of the information that gives rise to impressions and intuitions. It seeks the consistency of information, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.

"They made that big decision on the basis of a good report from one consultant. WYSIATI—what you see is all there is. They did not seem to realize how little information they had.”


Answering an Easier Question

When faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution. The easier question, and the shortcut it provides, is called a heuristic.

  • Hard question asked to a famous CIO: "Why did you invest in Ford stocks?"
  • CIO's answer: "Boy, do they know how to make a car"
  • The easier question that determined his choice: "Do I like cars?"
“The question we face is whether this candidate can succeed. The question we seem to answer is whether she interviews well. Let’s not substitute.”


HEURISTICS AND BIASES

Law of Small Numbers

The strong bias toward believing that small samples closely resemble the population from which they are drawn is also part of a larger story: we are prone to exaggerate the consistency and coherence of what we see. We seek patterns, suppress ambiguity, and construct stories that are as coherent as possible. WYSIATI.

“The sample of observations is too small to make any inferences. Let’s not follow the law of small numbers.”
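The point can be seen in a short simulation (a minimal sketch; the 70% "extreme result" threshold and sample sizes are arbitrary choices for illustration): small samples of fair coin flips produce striking imbalances far more often than large samples do, which is exactly the pattern-seeking trap.

```python
import random

random.seed(42)

def extreme_rate(sample_size, trials=10_000):
    """Fraction of samples of fair coin flips that look 'extreme'
    (70% or more heads, or 70% or more tails)."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads >= 0.7 * sample_size or heads <= 0.3 * sample_size:
            extreme += 1
    return extreme / trials

# Small samples produce apparent "patterns" far more often than large ones.
print(extreme_rate(10))    # extreme-looking results are common (roughly 1 in 3)
print(extreme_rate(100))   # extreme-looking results are vanishingly rare
```

A sample of 10 flips shows a 70/30 split about a third of the time by pure chance; a sample of 100 almost never does. An observer who trusts the small sample will "see" a biased coin where there is none.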


The Anchoring Effect

It occurs when people consider a particular value for an unknown quantity before estimating that quantity. Example: “Is the height of the tallest redwood more or less than x feet? What is your best guess about the height of the tallest redwood?” When x was 1200, the answer to the second question was 844; when x was 180, the answer was 282.

People adjust less (stay closer to the anchor) when their mental resources are depleted.

“Our aim in the negotiation is to get them anchored on this number.”


Availability Heuristic

It refers to the ease with which instances come to mind. The importance of an idea is often judged by the fluency, ease, and emotional charge with which that idea comes to mind. Example: A plane crash that attracts media coverage will temporarily alter your feelings about the safety of flying. Accidents are on your mind, for a while, after you see a car burning at the side of the road, and the world is for a while a more dangerous place. Availability effects help explain the pattern of insurance purchase and protective action after disasters.

“Because of the coincidence of two planes crashing last month, she now prefers to take the train. That’s silly. The risk hasn’t really changed; it is an availability bias.”


Representativeness

Representativeness is where we use stereotypes to help us judge probabilities. For example: "You see a person reading The New York Times on the subway. Which of the following is a better bet about the reading stranger? 1) She has a Ph.D. 2) She does not have a college degree." The sin of representativeness would lead us to pick the first answer, even though there are far fewer Ph.D. holders on the subway than people without a college degree. Though a simple example, one way to resist the temptation of representativeness is to consider the base rate (in this case, the rate of Ph.D.s vs. non-Ph.D.s) and make the judgment from that.


Regression to the Mean

Our mind is strongly biased toward causal explanations and does not deal well with "mere statistics." Regression to the mean is the statistical fact that extreme observations tend to be followed by observations closer to the average. Example: "Depressed children treated with an energy drink improve significantly over a three-month period."

You will automatically infer that the energy drink caused an improvement, but this conclusion is completely unjustified. Depressed children are an extreme group, they are more depressed than most other children—and extreme groups regress to the mean over time. The improvement has nothing to do with the energy drink.
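The energy-drink example can be reproduced in a few lines of simulation (a sketch under assumed numbers: a stable trait with mean 50 and day-to-day noise of similar size, both arbitrary). We select the most extreme children by a noisy measurement and simply re-measure later, with no intervention at all:

```python
import random

random.seed(0)

# Each child's observed "depression score" = stable trait + transient noise.
N = 10_000
trait = [random.gauss(50, 10) for _ in range(N)]

def measure(t):
    return t + random.gauss(0, 10)   # trait plus day-to-day noise

score1 = [measure(t) for t in trait]  # first measurement
score2 = [measure(t) for t in trait]  # second measurement, no treatment

# "Depressed" group: worst 5% at time 1 (highest scores).
cutoff = sorted(score1)[int(0.95 * N)]
group = [i for i in range(N) if score1[i] >= cutoff]

mean1 = sum(score1[i] for i in group) / len(group)
mean2 = sum(score2[i] for i in group) / len(group)
print(mean1, mean2)   # mean2 is markedly lower: "improvement" with no drink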

”Our screening procedure is good but not perfect, so we should anticipate regression. We shouldn’t be surprised that the very best candidates often fail to meet our expectations.”


OVERCONFIDENCE

The Illusion of Understanding & Validity

Our mind is a sense-making organ that tries to build the best possible story from the information available to us, and if it is a good story, we believe it. If it finds inconsistencies, it will reduce the ease of our thoughts and we tend to disbelieve (again, WYSIATI). Paradoxically, it's easier to construct a coherent story when you know little.

When validating an idea/belief, the amount of evidence and its quality do not count for much, because poor evidence can make a very good story. For some of our most important beliefs, we have no evidence at all, except that people we love and trust hold these beliefs.

These two illusions foster the idea that we can understand the past which itself fosters overconfidence in our ability to predict the future. But the world and every event are highly unpredictable and high subjective confidence is not to be trusted as an indicator of accuracy.

“She has a coherent story that explains all she knows, and the coherence makes her feel good.”


Intuition Vs Formulas

We humans are incorrigibly inconsistent when making summary judgments of complex information. When evaluating the same information twice, we frequently give different answers, depending on our mood, our environment, and recent events. Formulas do not suffer from such problems: given the same input, they always return the same answer. Research suggests that to maximize predictive accuracy, final decisions should be left to formulas, especially in low-validity environments.

Algorithms do not yet have a good reputation, especially in the medical field. Experts argue strongly that it is unethical to rely on intuitive judgments for important decisions if an algorithm is available that will make fewer mistakes. Their rational argument is compelling, but it runs against a stubborn psychological reality: for most people, the cause of a mistake matters. The story of a child dying because an algorithm made a mistake is more poignant than the story of the same tragedy occurring as a result of human error, and the difference in emotional intensity is readily translated into a moral preference. A similar ethical debate surrounds the public acceptance of autonomous vehicles. Fortunately, the hostility to algorithms will probably soften as their role in everyday life continues to expand.

“Whenever we can replace human judgment with a formula, we should at least consider it.”


Optimism and Overconfidence

Both are manifestations of WYSIATI. Optimistic and overconfident people tend to be more cheerful, happy, and resilient, and even to have stronger immune systems and live longer. They play a disproportionate role in shaping environments, and their self-confidence is reinforced by the admiration of others. When action is needed, all this can be a good thing. But the evidence suggests that it can be costly, as it fosters the illusion that outcomes depend mainly on one's own actions, when they often depend just as much on the environment and its changes.

Confidence is valued over uncertainty. Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors who are better able to gain the trust of clients. An unbiased appreciation of uncertainty is a cornerstone of rationality, but it is not what people and organizations want (e.g., highly confident candidates for the presidency are preferred over more thoughtful ones).

A Premortem Session

A technique to overcome the bias of overconfident optimism when making important and risky decisions in a company.

It consists of a brief session with a group of individuals who are knowledgeable about the decision. The premise of the session is a short speech: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.”

The main virtue of this technique is that it legitimizes doubt and prevents it from being suppressed by overconfident supporters of an idea.

“We should conduct a premortem session. Someone may come up with a threat we have neglected.”


CHOICES

Loss Aversion

This principle states that losses loom larger than gains: the loss-aversion ratio is, on average, between 1.5 and 2.5. What is the smallest gain I need to balance an equal chance of losing $100? On average, between $150 and $250.
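The arithmetic above can be sketched with a simplified, prospect-theory-style value function in which losses are multiplied by a loss-aversion coefficient (the function shape and names here are illustrative, not Kahneman's exact model):

```python
def subjective_value(x, loss_aversion=2.0):
    """Losses are weighted loss_aversion times more heavily than gains."""
    return x if x >= 0 else loss_aversion * x

def accepts_gamble(gain, loss=100, loss_aversion=2.0):
    """Accept a 50/50 gamble iff its expected subjective value is non-negative."""
    ev = 0.5 * subjective_value(gain, loss_aversion) \
       + 0.5 * subjective_value(-loss, loss_aversion)
    return ev >= 0

# With coefficients of 1.5-2.5, the smallest acceptable gain against a
# possible $100 loss is $150-$250, matching the averages quoted above.
print(accepts_gamble(150, loss_aversion=1.5))  # True  (breakeven at $150)
print(accepts_gamble(149, loss_aversion=1.5))  # False
print(accepts_gamble(250, loss_aversion=2.5))  # True  (breakeven at $250)
```

The breakeven gain is simply `loss_aversion * loss`, which is why a coefficient of 1.5 to 2.5 maps to the $150-$250 range in the text.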

Regarding the impact on our own lives: “bad emotions, bad parents, and bad feedback have more impact than good ones, and bad information is processed more thoroughly than good. The self is more motivated to avoid bad self-definitions than to pursue good ones. Bad impressions and bad stereotypes are quicker to form and more resistant to disconfirmation than good ones.”

Endowment Effect

It refers to an emotional bias that causes individuals to value an object they own more highly, often irrationally, than its market value. You are much more likely to place a higher value on an object you already own than on that same object if you did not own it. This effect is powered by emotional or symbolic attachment to the object.


THE LESSON

Kahneman shows us that we humans, as products of evolution, are not well equipped to deal with situations that demand rational and logical thought. Our fears and biases outweigh our rational skills, so we are prone to repeat the same cognitive errors and are easily manipulated.

In a reality that's dominated by science and statistics, "maintaining one’s vigilance against biases is a chore—but the chance to avoid a costly mistake is sometimes worth the effort."



