5 Cognitive Biases That Change Experts Must Understand

Human fallibility existed long before Alexander Pope wrote, “to err is human.” Over the last 50 years, researchers have begun to study human error systematically. The mistakes people make in decision making are not random noise, sometimes overshooting and sometimes undershooting; they are systematic, skewing decisions in predictable ways. These systematic errors are called cognitive biases.

Cognitive biases and their cousins, the logical fallacies, have been among the hottest areas in psychology, neuroscience, and economics over the last four decades. That humans are error-prone is not news, but the fact that the errors are systematic allows us to correct for them, or at least to be skeptical about how certain we ought to be.

The list of biases and fallacies is long: more than 100 have been classified and studied. Because these error-causing biases and fallacies are systematic and predictable, a deeper understanding of them can lead to better decisions.

The Ostrich Effect

In bad markets, investors look up the value of their holdings up to 80 percent less often than in good markets. This is an example of the ostrich effect, a perception bias, which occurs when we avert our gaze from painful, risky, or difficult situations.

The ostrich effect leads us to ignore some of the most valuable information at our fingertips because, realistically, the uncomfortable stuff is where the “juice” is. What a business team least wants to discuss is what it would get the most value from discussing.

Failure to learn from failure has team, cultural, structural, and emotional causes. In leadership teams, avoidance is made easier by the blame game, in which others’ faults can be magnified and our own discomfort lessened. In business, it is hard to make time to review each play because business leaders are juggling so many initiatives that still-live ones press for their limited attention.

Countering these forces requires great leadership, courage, and discipline. Great leaders first ask, “Where did I go wrong?” and only then, “Where did we go wrong?” and “Where did it go wrong?”

Learning from failure is one place where a small amount spent on external advisors can yield vast dividends. Uninvolved in the setback, they can draw unemotional and nonpartisan conclusions. Furthermore, they can see dimensions of the problem that those deeply involved are certain to miss (for if those dimensions had not been missed, the project might have been successful).

Availability and Confirmation Biases

Business people often skimp on assessing problems. The problems, they think, are obvious: “staff are unmotivated,” “market share is slipping,” “we lag the competition in innovation and time-to-market,” or “our costs are too high.”

What they miss when they skimp is that how they see the problem is the problem. The problems (as they see them) are symptoms with a myriad of interrelated causes. To intervene accurately in a business with a performance issue, the leader and her team need an accurate view of cause and effect.

It is trivial to say, “we can only see what we can see,” but the availability bias means we treat what we can see as all that matters, discounting factors we cannot see. The confirmation bias means we are more likely to notice data that proves us right than data that contradicts us.

When framing business problems, we can only evaluate what we see, and in a complex organization, that might not be much. The remedy for the availability and confirmation biases lies partly in a certain skepticism about our view of the world, “holding it lightly,” and partly in working in diverse teams. The leader’s challenge, in a complex business, is helping his team gain perspective so that the team does not succumb to the availability bias.

Escalation of Commitment and the Sunk Cost Bias

The sunk cost bias causes leaders to weight past investments in their deliberations when rational decision making would disregard them, and it leads to perhaps the most pernicious and costly change strategy error: the escalation of commitment.

In everyday language, escalation of commitment is “throwing good money after bad”; more formally, it is “redoubling commitment to a failing endeavor.” It happens because people do not like to feel that they have squandered resources or that past efforts have been in vain. Human beings self-justify: we like to feel that time and money have been well spent, and that past decisions were good (or not that bad).

Then the confirmation bias seals the trap. Having invested, leaders seek out confirming evidence that things are going well. In executive teams, leaders do not want to lose face, so projects they have sponsored continue to be endorsed long past the point rationality would suggest.

Escalation of commitment costs a great deal of money because when projects or investments turn sour, decision making typically gets worse. Barings Bank was founded in 1762, but one trader, Nick Leeson, escalating his commitments to a losing arbitrage strategy, brought it down in 1995 when losses that started small grew to $1.3 billion.

With its complex psychological, cultural, social, and political causes, the escalation problem defies trivial answers. The very tough challenge for the leader is to balance the desirable virtue of persistence in the face of difficulty with the undesirable vice of stubbornness, throwing good money after bad.

Implications for Change Experts

It is not too much of a generalization to say that biases are not typically part of change management knowledge today. That needs to change. Facilitation is one of the change expert’s essential skills, but the facilitation we generally practice comes from the tradition of humanistic psychology, loosely based on the counseling principles of Carl Rogers.

In that tradition, the client’s interpretations and beliefs are only cautiously challenged. This is an empowering way of facilitating, based on the assumption that learning and personal change happen best when the client draws their own conclusions.

The understanding that clients may be quite wrong in many of their interpretations opens the door for a more robust level of challenge—one that may be counter-cultural in some schools of coaching and facilitation.

Thus, there is room for experts who facilitate executive decision making to substantially increase their usefulness by understanding the world of biases and pointing out these blind spots.

Alan Landers, MHRD

The Landers Consulting Group | CEO, FirstStep Communications | 48 years of OD-IO global experience | 2021/2022/2024 Top Ten Change Management Consultancy | Thought Leader

2y

Many years ago, Stephen Haines, one of my mentors and the founder of the Centre for Strategic Management, told me to always look for contradictory evidence. He believed his clients' experiences and thoughts were real but may not have included all the facts. The concept of contradictory evidence, the rise of positivism (e.g., Appreciative Inquiry), and dialogic OD reinforce the practice of believing and going further before acting. I enjoyed your article.

Paul M. Mastrangelo, Ph.D

I'm changing how workplace change works.

2y

Thanks, Paul Gibbons, FRSA for this interesting take on cognitive biases. You are correct that consultants are brought into a "story" that has already been formed by the client. An easy example is when leaders read about Generation Effects and jump to the conclusion that their observations match that highly publicized myth (Availability Theory), failing to see any evidence that disproves that assumption (Confirmation Bias). This combination of biases is why the fields of OrgChange, OD, and HR need to cleanse themselves of the myths that may sell consulting services in the short term but damage professional credibility in the long term (e.g., only 33% of workers are engaged, supervisors are the most important driver of engagement, people resist all changes, two-thirds of all change projects fail...). When clients hear and remember these myths, it is more difficult for them to absorb contradictory evidence that reveals the truth.

Ignacio Etchebarne

Consultor en desarrollo de liderazgo | Columbia certified coach | Dr. en psicología

2y

Wonderful post Paul!! I agree entirely that consultants should respectfully challenge their clients’ assumptions, not to bully them but to better serve them!! Additional cognitive biases I’d add are:
- #LossAversion: What if we try to change and fail... Ok, and what are the risks of not trying to change?
- #FundamentalAttributionError: You think that others do it on purpose (lack of motivation or commitment), while your behavior is always explained by your circumstances?? I’d check your processes, systems and context first; or maybe you are bringing in a Taylorist perspective to a complex scenario...
- #FalseConsensus: “Feeling” or assuming that others agree with you doesn't constitute evidence of actual agreement!! As far as empathy can take us, we are usually lousy mind readers!! This extends to any other form of #EmotionalReasoning in which you assume to be right only because you “feel” right...
- #GroupThink: Or as I like to relabel it: “Leaders should always speak last!” Else, they create anchoring biases in their teams or fall prey to bootlickers... This also applies to leading or “loaded” questions (watch out for your non-verbals!!).
