ANTIDOTES AND GUARDRAILS
Protecting Ourselves Against Our Built-In Biases
As I wrote last week, our minds trick us.
We have inherited a variety of cognitive biases: tendencies of the mind to lean toward certain patterns of thinking. These shortcuts helped our ancestors draw quick, generally accurate conclusions in dangerous but simple environments, but they can work against us in our more complex (if less dangerous) world.
While these biases often lead us quickly to sufficiently true assumptions, we tend to over-rely on them, to our own detriment.
When a decision is relatively minor, such as what to have for lunch, we do not need an extensive analysis of our cognitive biases; the consequences of ordering a sub-par tuna sandwich at the local deli are not great. When deciding whether to acquire that exciting new start-up, however, we need to examine our biases.
When the stakes are low, go forward boldly; when the stakes are high, we need tools—antidotes and guardrails—to overcome the biases.
Here are some of those tools:
Managing Cognitive Dissonance
Cognitive dissonance is easy to see in other people, but almost impossible to see in ourselves. We may feel the result of the dissonance (the mental stress that arises when we face information we don't like), but our brains work to make it go away as quickly as possible, often using methods we don't recognize: the cognitive biases identified in Section 1.
Here are some simple ways to reduce the negative effects of cognitive dissonance:
· Learn to look for cognitive dissonance in yourself—the internal stress when feedback or observations rub you the wrong way for reasons you can't quite identify.
· Learn to observe cognitive dissonance in others without judgment—seeing it in action can help us see it earlier in ourselves.
· Resist the temptation to flatly reject ideas that don't fit your worldview; instead, explore them with an open mind.
· Avoid either/or thinking—the giver of feedback may be biased AND you may still have the flaws he identified; the product may be ahead of its time AND it may still have flaws.
Skepticism
The ability to think clearly relies partly on attitude and partly on having the requisite skills and tools.
The attitude required for clear thinking is skepticism.
Skepticism is the attitude and practice of matching the evidence to the claim. It assumes both an open mind and a critical mind. It is a willingness to hear the other side of an argument and to change one’s mind if a compelling enough argument is made, while still being committed to rigorously challenging assumptions.
Skepticism is different from cynicism, which has come to mean the automatic dismissal of the new or strange. While cynicism starts with a closed mind, skepticism starts with an open mind and a clear-eyed willingness to question even one’s own most cherished beliefs and assumptions.
The patron saint of skepticism is the Scottish Enlightenment philosopher David Hume, who asserted that we really can't KNOW anything for sure but that a "wise man [or woman] proportions his belief to the evidence." The late scientist and educator Carl Sagan popularized a version of Hume's statement as "extraordinary claims require extraordinary evidence."
Skepticism begins with assessing the plausibility of a claim: Given what we already know, how likely is it that a particular claim is true? The more implausible a claim is, the more evidence we should require (and the more rigorous we should be in evaluating the validity of that evidence). Further, the consequences of a claim affect the amount of evidence we should require.
For example, if someone tells me she owns a golden retriever and shows me a picture of herself with a golden retriever, I don’t need a lot of evidence to provisionally believe her. If I know she has a tendency to lie, I may hold some doubt, but it is not implausible that someone owns a golden retriever. And since the consequences related to whether or not she owns a golden retriever are minimal, I would not feel the need to seek a lot of evidence to support her claim.
It would be different, however, if someone claimed to cure cancer by waving their healing hands over a patient. Such a claim is highly implausible given what we know about science and cancer, and the consequences of someone seeking "healing hands" treatment for cancer are dire. I would require a LOT of evidence before believing such a claim.
If we want to be good critical thinkers and claim to be seekers after truth, we should cultivate appropriate skepticism and remember Carl Sagan’s other famous quote: “It pays to keep an open mind, but not so open that your brains fall out.”
Once the right attitude is in place, some fundamental tools are helpful; the rest of this volume provides some of those tools.
The Magic Question: How do I know this to be true?
The best way to maintain appropriate skepticism is to constantly ask ourselves "How do I know this to be true?" and to be willing to prove ourselves wrong. Physicist Richard Feynman famously said, "The first principle is that you must not fool yourself—and you are the easiest person to fool." We must remember this and always challenge our assumptions and beliefs. The more strongly we believe something, or the more emotion we feel about a belief, the more important it is to ask ourselves "How do I know this to be true?"
(To be continued next week...)