An Unauthorized Guide to Understanding Your Brain

Empower yourself by uncovering the surprising truth behind your thoughts and decisions. This article is a journey through the mental shortcuts, cognitive traps, biases, and fallacies of the human brain. I will also advise on how to overcome these cognitive biases and make better decisions.

Nobel Prize winner Daniel Kahneman's bestselling book Thinking, Fast and Slow is one of the best books on this subject. In the book, he explains the two thinking systems of our brain: "System 1" is fast, intuitive, emotional, and unconscious, and is usually responsible for quick and automatic judgments and decisions. "System 2", on the other hand, is slow, deliberate, and conscious, and is usually responsible for careful thinking, problem solving, and decision making.

While we probably think that we are in complete control of our decisions and thoughts, here are the top ten (10) mind traps and fallacies that usually guide our decision-making process. Yes, you guessed it right: they live in System 1 of our brain.

Anchoring:

is a cognitive bias in which an individual relies too heavily on the first piece of information they receive (the "anchor") when making subsequent judgments and decisions. This can lead to inaccurate decisions even when the anchor is not relevant or reliable.

For example:

  • A survey asks respondents to estimate the percentage of African countries that are members of the United Nations. If the survey first presents a high anchor (e.g., 90%), respondents will tend to provide higher estimates than if the survey presents a low anchor (e.g., 50%).
  • Online shopping websites use anchoring by prominently displaying the original price of a product next to the discounted price. This anchor makes the discounted price appear to be a better deal, even if the original price was artificially inflated.

We can avoid the anchoring effect by actively seeking out multiple sources of information and by reframing the problem or question in a different way, so that the initial anchor is not given undue weight.

Confirmation Bias:

is a cognitive bias in which individuals seek out and give more weight to information that supports their existing beliefs, while disregarding or undervaluing information that contradicts their beliefs.

For Example:

  • A person who has strong political beliefs may only seek out news sources that align with their views, disregarding any information that contradicts their beliefs. This reinforces their political views and makes it difficult for them to consider alternative perspectives.
  • An investor may only pay attention to news articles that support their decision to invest in a particular stock, ignoring any negative news that suggests that their investment may not be wise.

To avoid confirmation bias, it is important to seek out information from multiple sources, consider alternative perspectives, and question your own beliefs.

Framing Effect:

refers to the phenomenon in which individuals' choices are influenced by the way information is presented to them, even if the information itself remains the same.

For Example:

  • If a product is described as being "75% fat-free" versus "25% fat," people tend to view it differently despite both phrases conveying the same information.

To avoid the framing effect, it is important to present information objectively, without any biases, and in a manner that is easy to understand. Additionally, it can be helpful to present the options side-by-side to allow individuals to make their own comparisons.

Availability Bias:

refers to a cognitive bias in which individuals form their judgments based on the most readily available and salient examples, rather than considering all relevant information. This happens because people often rely on the information that comes most easily to mind, rather than making a more deliberate and comprehensive analysis.

For Example:

  • A person fearing flying after hearing about several recent plane crashes in the news, even though statistically speaking, flying is still much safer than driving.

To avoid availability bias, one could seek out multiple sources of information, consider the base rate, reframe the problem and practice statistical thinking.

Overconfidence:

is a cognitive bias in which individuals have an exaggerated sense of their own abilities, knowledge, and the accuracy of their beliefs and predictions. This can lead them to make poor decisions, overlook potential risks, or ignore relevant information that contradicts their beliefs.

For Example:

  • A driver overestimating their ability to safely drive in hazardous weather conditions, leading to a higher risk of an accident.

To avoid overconfidence, review past predictions and decisions to identify any biases or errors, accept and learn from mistakes, practice humility, and acknowledge the limits of your knowledge and expertise.
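One practical way to review past predictions is to keep a simple log of each prediction together with how confident you felt, and compare the two later. Here is a minimal Python sketch of such a calibration check; the prediction log is made-up data, purely for illustration.

```python
# Minimal calibration check: compare stated confidence with actual accuracy.
# The log below is hypothetical data, purely for illustration.
past_predictions = [
    (0.9, True), (0.9, False), (0.9, False), (0.8, True),
    (0.8, False), (0.7, True), (0.7, True), (0.6, False),
]  # (stated confidence, was the prediction correct?)

def calibration(predictions):
    """Return (average stated confidence, actual hit rate)."""
    avg_confidence = sum(conf for conf, _ in predictions) / len(predictions)
    hit_rate = sum(1 for _, correct in predictions if correct) / len(predictions)
    return avg_confidence, hit_rate

avg_confidence, hit_rate = calibration(past_predictions)
print(f"Average stated confidence: {avg_confidence:.0%}")  # about 79%
print(f"Actual hit rate:           {hit_rate:.0%}")        # 50%
```

A persistent gap between stated confidence and the actual hit rate is a concrete, measurable sign of overconfidence.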

Hindsight Bias:

is a cognitive bias in which individuals believe, after an event has occurred, that they would have accurately predicted the outcome all along. This bias leads individuals to believe that the outcome was more predictable than it actually was, and can cause them to overlook other factors.

For Example:

  • An individual looking back on a sports match they watched, and feeling as though they could have accurately predicted the outcome, despite having had no prior knowledge of the teams or players involved.

Understanding and being aware of hindsight bias can help individuals avoid the temptation to oversimplify past events and make more informed decisions in the future.

Base Rate Neglect:

is a cognitive bias in which individuals ignore the prior probability of an event when making a judgment or prediction and instead rely solely on new, readily available information. This can lead to erroneous conclusions and decisions.

For example:

  • A doctor is presented with a patient who has symptoms of a rare disease. The doctor is told that there is a new test available that is 95% accurate for diagnosing the disease. Without considering the prior probability of the disease (i.e., its base rate), the doctor might conclude that the patient has the disease, based solely on the accuracy of the test.

To avoid base rate neglect, it is important to consider both the new information and the prior probability of an event when making predictions or decisions.
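To see how much the base rate matters in the doctor example above, here is a minimal Bayes' theorem sketch in Python. The 1-in-1,000 base rate is an assumption made purely for illustration (the example only says the disease is rare), and "95% accurate" is read here as both 95% sensitivity and 95% specificity.

```python
# Bayes' theorem sketch for the rare-disease example.
# Assumptions for illustration: base rate of 1 in 1,000; "95% accurate" means
# 95% sensitivity (true-positive rate) and 95% specificity (true-negative rate).
def posterior_given_positive(base_rate, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    true_positives = sensitivity * base_rate
    false_positives = (1 - specificity) * (1 - base_rate)
    return true_positives / (true_positives + false_positives)

p = posterior_given_positive(base_rate=0.001, sensitivity=0.95, specificity=0.95)
print(f"Probability the patient actually has the disease: {p:.1%}")  # about 1.9%
```

Even with a "95% accurate" test, a positive result for a 1-in-1,000 disease leaves only about a 2% chance that the patient actually has it, which is exactly the information the base rate supplies.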

Sunk Cost Fallacy:

is a cognitive bias in which individuals persist in an endeavor because of the investments they have made in it, even when the costs outweigh the benefits.

For example:

  • A person continues to invest money in a failing business venture because they have already invested a large amount of money and don't want to admit that it was a mistake.

To avoid the sunk cost fallacy, it is important to evaluate each decision on its own merits, rather than considering past investments.
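As a minimal sketch of that rule, the comparison below (with made-up figures) deliberately leaves money already spent out of the decision entirely.

```python
# Sunk-cost-free decision rule: compare only future costs and benefits.
# All figures are hypothetical and purely for illustration.
def should_continue(expected_future_revenue, expected_future_cost):
    """Decide based only on what happens from today onward."""
    return expected_future_revenue > expected_future_cost

already_invested = 500_000          # sunk: spent whether we continue or not
expected_future_revenue = 80_000    # what continuing is expected to bring in
expected_future_cost = 120_000      # what continuing will still cost

print(should_continue(expected_future_revenue, expected_future_cost))  # False
```

Note that `already_invested` never appears in the comparison: it is gone whether the venture continues or not, so a rational evaluation excludes it.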

Halo Effect:

is a cognitive bias in which an individual's overall positive impression of a person or thing influences their perception of that person or thing's specific traits.

For example:

  • A person who is considered attractive is also perceived as being more intelligent, kind, and competent, even though there is no evidence to support this connection.
  • A product that is marketed as being expensive is often perceived as being of higher quality, even if there is no correlation between the product's price and its quality.

To avoid the halo effect, it is important to actively seek out and consider multiple sources of information when making judgments and decisions.

Baader-Meinhof Phenomenon:

is a cognitive bias, also known as the frequency illusion, in which a person becomes aware of a new word, concept, or piece of information and suddenly notices it appearing frequently in their surroundings.

For example:

  • A person purchases a new car and starts to notice the same model everywhere.

I personally don't think we should avoid the Baader-Meinhof phenomenon; rather, we should use it in our favor. My life coach asked me to write down my positive experiences on a daily basis, which helped me gain greater gratitude for my circumstances.

Finally, thanks for reading this lengthy article, dear reader. Let me know in the comments:

  1. Which fallacies do you think are most common in the people around you?
  2. Which fallacies have you only just become aware of in yourself?
  3. What other mind traps would you like me to cover in my next article? (Availability Cascade, Negativity Bias, Choice-Supportive Bias, Confabulation, Egocentric Bias, Endowment Effect, Essentialism, False Consensus Effect, Fundamental Attribution Error, Illusory Superiority, Intuitive Predictions, In-group Bias, Loss Aversion, etc.)
