The Case for Open-Minded Thinking

If you want to make clear decisions about the future, you must learn to move past your biases and open your mind to new possibilities. That's the central message of Superforecasting. In it, Philip Tetlock profiles a small group of people who were able to successfully predict geopolitical events.

Tetlock defines two different types of thinkers:

Hedgehogs: folks who rely on one or two big ideas to understand the world and where it is going

Foxes: people who scoff at the idea of using one model to understand the world, and who instead draw on whichever approach best fits the question at hand

Want to guess who does a better job of predicting the future?

I hope you said foxes, because they consistently win.

And now for the backstory. The Intelligence Advanced Research Projects Activity (IARPA) identifies and supports “high-risk, high-payoff” research. In 2010, the agency told Tetlock of its plans to sponsor a massive tournament to see who could develop the best methods for generating intelligence forecasts such as:

  • Will the president of Tunisia flee to a cushy exile in the next month?
  • Will the euro fall below $1.20 in the next twelve months?

3,200 people passed through the initial stage of psychometric tests and started forecasting. To be clear, a somewhat random group of people from numerous walks of life set out to beat the forecasts of intelligence professionals. Many did.

Doug Lorch typifies the successful approach. He is actively open-minded (AOM). People like Doug actively seek out evidence and opinions that go against theirs. As Tetlock writes, “Beliefs are hypotheses to be tested, not treasures to be guarded.”

To take this idea a step further, Superforecasting makes the case that teams can also practice AOM. In a group, Tetlock argues, actively open-minded thinking is “an emergent property of the group itself, a property of the communication patterns among group members.”

I particularly like this passage from Tetlock’s book, which talks about the humility required to be a good forecaster:

The humility required for good judgment is not self-doubt – the sense that you are untalented, unintelligent, or unworthy. It is intellectual humility. It is a recognition that reality is profoundly complex, that seeing things clearly is a constant struggle when it can be done at all, and that human judgment must, therefore, be riddled with mistakes.

Human judgment must be riddled with mistakes. Not just other people's judgment. Your judgment, and mine, too.

In the video I've embedded below, Tetlock makes the following observations:

"You can think of hedgehogs in debates over political and economic issues as people who have a big ideological vision. Tom Friedman might be animated by a vision of, say, globalization: the world is flat. Libertarians are animated by the vision that there are free market solutions for the vast majority of problems that beset us. There are people on the left who see the need for major state intervention to address various inequities. There are environmentalists who think we’re on the cusp of an apocalypse of some sort. So you have people who are animated by a vision, and their forecasts are informed largely by that vision.

"Whereas the foxes tend to be more eclectic. They kind of pick and choose their ideas from a variety of schools of thought. They might be a little bit environmentalist and a little bit libertarian, or they might be a little bit socialist and a little bit hawkish on certain national security issues. They blend things in unusual ways, and they are harder to classify politically.

"Now in the early work, we found that the foxes who were more eclectic in their style of thinking were better forecasters than the hedgehogs. In the later work, we found something similar. We found that people who scored high on psychological measures of active open-mindedness and need for cognition, those people who scored high on those personality variables tended to do quite a bit better as forecasters."

You may not care about your ability to predict geopolitical events, but I’d argue that many of Tetlock’s findings have applications to any professional’s career. If you rely too much on one or two mental models, you will make bad decisions.

We see this all the time with leaders of previously successful firms who can’t recognize how much their industry has changed and who keep trying to use the strategy that got them there, instead of a new strategy capable of advancing their company to the next level. Instead of success, they sink into stagnation.

By all means, test a strategy that has worked for you in the past, but be prepared to replace it with one that works better.

Our world keeps changing, and so should you.

Bruce Kasanoff is a ghostwriter for thought leaders. He is the author of NEVER TELL PEOPLE WHAT YOU DO. This article is an updated and expanded version of one Bruce released in 2015.


