Confirmation Bias: How Choosing Makes You Stupid
THE PERILS OF FREE CHOICE
Let’s pretend for a moment that you have agreed to be part of a study I’m conducting. I bring you into a room and present you with six works of art. I then ask you to rank the six paintings from 1 to 6, with 1 being your most preferred and 6 your least preferred. I further explain that you’ll be able to leave today with a painting of your choosing.
Now that you’ve completed the ranking assignment, I tell you that you can choose any one of the six paintings. Naturally, you choose Number 1, since it was your most preferred, and I retire to the back of the room to retrieve it. I return shortly with a worried look and apologetically tell you that the paintings you ranked 1, 2, 5 and 6 have all been picked over, leaving only the ones you ranked 3 and 4. You can still have your pick of either, and you decide on 3, given your slight preference for it.
Now imagine that I give you two weeks off and invite you back into my office to rank the same six paintings in order of your preference. What do you hypothesize will have happened? Would your preferences remain the same or would they have shifted? What might account for them changing or staying the same?
Well, if you are like most people who participate in this experiment (commonly referred to as the “Free Choice Paradigm”), your preferences will have changed upon your return. Typically, the painting that was chosen, previously ranked Number 3, will have climbed into the Number 2 spot. Conversely, the painting that was not chosen, previously ranked Number 4, will have fallen into the Number 5 spot. What accounts for such a dramatic change over such a short period of time? After all, both paintings represented a sort of middling preference at the initial ranking, neither greatly prized nor greatly disliked. So how have they now migrated closer to their respective poles? Once again, the answer lies in our need to be special and to think of ourselves as competent, capable decision makers who make choices based on rational criteria.
Dr. Dan Gilbert, Harvard professor and happiness researcher extraordinaire, describes the thought process of participants this way: “The one I got is really better than I thought. That other one I didn’t get sucks.” Once participants have adopted an opinion, they begin to construct a list of reasons why their choice was the right one. Perhaps they tell themselves that they prefer the shading or the texture, or the way the painting frames a previously blank space in the living room. Whatever the specific reasons, we are prone to build up our decisions immediately upon having made a commitment. What’s more, we play the other side of the fence too and begin to mount an offensive against the road not taken. We are at least as tenacious at tearing down the unchosen option as we are at building up our commitment; just ask anyone who has ever been broken up with by a partner they “didn’t like anyway.”
The phenomenon mentioned above, whereby we talk up choices we’ve made and denigrate those we’ve passed on, probably makes intuitive sense, but what if it goes deeper than that? Dan Gilbert and his team examined the impact of the Free Choice Paradigm on a group of subjects with anterograde amnesia; in other words, a group of hospitalized individuals unable to form new memories. Like their neurotypical (that is, without brain damage) peers, the amnesic patients were asked to rank the paintings from 1 to 6 and were given the option to keep either painting 3 or 4. Once a painting was chosen, the researchers promised to mail it in a few days and left the room.
Returning just 30 minutes later, the members of Dr. Gilbert’s team reintroduced themselves to the amnesic patients who, unable to form new memories, had no recollection of having met them before or having performed the exercise. To ensure that the patients were truly unable to form memories, the researchers then asked them to point to the painting they had chosen before, a task at which the patients performed worse than chance! The patients were then put through the whole ranking exercise again, with astonishing results. Just as with the neurotypical control group, the amnesic patients “talked up” the choice they had made and dismissed the painting not chosen, even though they had no memory of having made a choice at all! Clearly, our need to view ourselves as competent and intelligent lives somewhere so deep within us that not even cognitive impairment can touch it.
POLARIZATION
As we’ve seen above, the very act of making a decision can move us away from moderation. Pair this tendency with the confirmation bias tendency to surround ourselves with like-minded others and you have a recipe for conflict, polarization and even extremism. Dick Cheney famously personified this tendency when he demanded that only Fox News be playing when he entered the room. The Vice President, highly criticized by his Democratic counterparts and liberal news outlets like MSNBC, wanted to surround himself with less critical viewpoints. And while it’s easy to pick on Vice President Cheney (he shot someone in the face for crying out loud!) we are all guilty of surrounding ourselves with like-minded others.
In general, we flock to those with whom we share a cultural, religious, political or ideological identity. In so doing, we surround ourselves with a chorus of “Yes People” who reinforce the validity of our opinions. Given the emotional wrangling involved in confronting conflicting ideas, immersing ourselves in an ideologically homogeneous pool is infinitely easier than the alternative. If everyone with whom we associate looks, acts and thinks as we do, we are able to “successfully” skirt a number of tough internal struggles.
THE GROUP IS DUMBER THAN THE SUM OF ITS PARTS
Some within-group socializing is natural and even healthy. Church groups offer social and financial support to their congregants. Groups of LGBTQ youth gather to express their shared joys and struggles and learn that “It gets better.” In these and myriad other instances, groups of like-minded people find support and encouragement that propels them toward bigger and better things. Intragroup homogeneity becomes problematic, however, when the need to maintain group purity leads to a lack of “cross-pollination” between groups of different minds. Homogeneous groups lead to what is called “group polarization,” a potentially dangerous dynamic.
Until the early 1960s, the prevailing theory of group risk-taking behavior was “normalization theory,” the idea that group decisions would reflect an average of the norms of the people who comprised the group. However, a 1961 master’s thesis by Stoner began to question normalization theory and propose what we now call group polarization – the tendency of a group to engage in behaviors and hold opinions more extreme than those of the average group member. The reasons why group polarization occurs are complex, but some suggest that diffusion of responsibility is to blame. Members of the group feel more comfortable putting forth an extreme position because direct responsibility is less likely to accrue to them. Further, given that group members cannot read one another’s minds, they may assume that the rest of the group is comfortable with, or even in agreement with, a polarizing viewpoint. Given these and other group dynamics, members become emboldened and take increasingly strident positions, comforted by the size of the group and the potential for anonymity if things go poorly.
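The dynamic described above can be caricatured in a few lines of code. This is a toy model, not drawn from Stoner or any published study: the function names, parameters and numbers are all illustrative assumptions. Each member starts with a mildly held opinion on a -1 to 1 scale; in each discussion round, members drift toward the group mean and then slightly past it in the direction the group already leans, a crude stand-in for the conformity-plus-amplification dynamics described above.

```python
# Toy illustration of group polarization (illustrative model, not from
# the source). Opinions live on a -1..1 scale; positive means pro-risk.

def polarize(opinions, rounds=5, conformity=0.5, amplification=0.3):
    """Return opinions after simulated discussion rounds.

    Each round, every member moves partway toward the group mean
    (conformity) and then a bit further in the direction the group
    already leans (amplification).
    """
    ops = list(opinions)
    for _ in range(rounds):
        mean = sum(ops) / len(ops)
        lean = 1 if mean > 0 else -1 if mean < 0 else 0
        ops = [o + conformity * (mean - o) + amplification * lean * abs(mean)
               for o in ops]
    return ops

before = [0.2, 0.3, 0.1, 0.4]           # mildly pro-risk individuals
after = polarize(before)
print(sum(before) / 4, sum(after) / 4)  # the group mean drifts toward the extreme
```

Even with no member starting far from neutral, the group's average position ends up well beyond where any individual began, which is the signature of polarization rather than normalization.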
So, what does this have to do with much of what you believe being wrong? The confirmation bias means that you seek out and internalize information that is (a) consistent with what you already believe and (b) what you want to hear. Right from the outset, your lens is skewed toward maintaining the status quo and selective attention, not truth and growth. Next, we learned that as new information enters your view that might challenge your current opinions, you ward off those attacks by emotionally overriding inconvenient logic. As you make new choices and reinforce existing beliefs, you build up the beliefs you hold and tend to tear down those you don’t, to make your worldview seem that much more veracious. And finally, you surround yourself with a group of like-minded individuals who nod approvingly at everything you espouse. All the while, the group with which you affiliate is helping to push your opinions away from the center and further from the “others.” Is it any wonder, then, that there is some convenient untruth in the things you claim to know? The brain is a primitive creature – set up to maintain ego at the expense of enlightenment. For those concerned about living a meaningful life rather than just an easy one, it takes some commitment and some unlearning to move forward.
We have seen above that our cognitive processes are set up to be parsimonious, not enlightened. Our brains, left to their own devices, make life easy, not good. And while this doesn’t make life fulfilling, it does make some utilitarian sense. After all, we are confronted with myriad decisions daily – what color suit to wear, what to have for breakfast, how to proceed in a relationship – can you really blame us for wanting to put some of our thought processes on autopilot? The trick, I think, is to autopilot the things that are of little consequence but to withhold judgment and accept greater ambiguity on the things that matter. Odds are, you can drive to the store half asleep. You can buy the same milk, park in the same spot, and choose the same deodorant, all without any adverse consequences. However, when we judge people with the same automatic nonchalance with which we buy milk, we have a problem. So, how do we overcome this automatic thinking when it comes to making investment decisions? We seek to become “truth scientists.”
HOW DO I BECOME A TRUTH SCIENTIST?
We hold information that we want to believe and information that we do not want to believe to different standards. When a piece of information is presented to us that is consistent with our desired beliefs, we tend to ask, “Why CAN I believe this?” We look for confirmatory evidence, and in so doing, are likely to find bits and pieces of it, at which time we prematurely shut down our search for the truth. When a more difficult truth is presented to us, we tend to ask “Why CAN’T I believe this?” and immediately seek out disconfirmatory evidence. In a phrase, we look to support things we like and look to destroy things we don’t.
One of the hallmarks of science is that it searches for both confirmatory and disconfirmatory information in the search for truth. Sadly, humans tend to be much more one-sided in their own decision making, with the particular side depending on how hard the answer may be to hear. A person in love is likely to ask, “What are some of the reasons why I should marry this person?” but would seldom consider, “What might be some of the complications that would arise from marrying this person?” A person recovering from a period of long-term unemployment is likely to ask, “What are the benefits of taking this job?” but is unlikely to consider the ways in which taking the role might impede future professional growth. Socrates famously said that “the unexamined life is not worth living,” and your ability to follow his advice has profound implications for your ability to make wise financial decisions.
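The asymmetry between “Why CAN I believe this?” and “Why CAN’T I believe this?” can be made concrete with a small sketch. This is purely illustrative code, not drawn from the source: the function names and the evidence encoding are assumptions. Evidence is a list of +1 (supports the claim) and -1 (undermines it); the motivated reasoner stops at the first piece that lets them keep their preferred answer, while the “truth scientist” weighs everything.

```python
# Toy contrast between motivated reasoning and the "truth scientist"
# stance (illustrative model; names and numbers are assumptions).
# Evidence items: +1 supports the claim, -1 undermines it.

def motivated_verdict(evidence, want_to_believe):
    """Stop at the first item that permits the preferred answer:
    'Why CAN I believe this?' vs 'Why CAN'T I believe this?'"""
    target = 1 if want_to_believe else -1
    for e in evidence:
        if e == target:            # found the answer we were looking for
            return want_to_believe # search shuts down prematurely
    return not want_to_believe     # only conceded if nothing at all fits

def scientist_verdict(evidence):
    """Weigh every item, confirming and disconfirming alike."""
    return sum(evidence) > 0

evidence = [1, -1, -1, -1]         # one supporting item, three against
print(motivated_verdict(evidence, want_to_believe=True))  # True
print(scientist_verdict(evidence))                        # False
```

One stray supporting item is enough for the motivated reasoner to declare victory, while the full tally points the other way, which is exactly the premature shutdown of the search for truth described above.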