Cognitive biases affecting foresight and anticipatory thinking
Miguel Jiménez
Futurist · Author · Speaker — I help global decision-makers gain clarity and prepare for the future(s).
We all come loaded with biases: prejudices and assumptions that, in most cases, we are not even aware of. Some help us get through the day; others limit our abilities. When thinking about the future, we become our own worst enemies by unconsciously constraining our anticipatory thinking capabilities.
Biases can be innate or learned. People may develop biases for or against an individual, a group, or a belief. Multiple elements affect our objectivity: cognitive biases, conflicts of interest, prejudices, and statistical and contextual biases. Biases shape our thinking and worldview, simplifying decision-making and daily chores by letting us unconsciously accept many social and cultural assumptions.
However, it is crucial to be aware of the biases at play when exploring alternative futures in foresight work. They can affect the initial framing, the outcomes, and how individuals interpret reality, the present, signals, change, and many other key elements of an active exploration of the future.
What are biases, and why do they affect people?
Let's start with the most common set of biases: cognitive biases.
A cognitive bias is a systematic pattern of deviation from rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate assessment, illogical interpretation, or what is broadly called irrationality.
Although it may seem like such misperceptions would be aberrations, biases can help people find commonalities and shortcuts to assist in the navigation of everyday situations in life. Biases sustain worldviews and allow people to maintain coherence over time.
A continually evolving list of cognitive biases has been identified over the last six decades of research on human judgment and decision-making in cognitive science, social psychology, and behavioural economics.
Some biases are specific to groups and others to individuals; some affect decision-making, and others affect judgement; some affect memory and others influence motivation.
Biases shape much of what we take to be our individuality, personality, and culture, running silently in the background of our brains and our societies.
Most common biases affecting foresight practitioners
In any foresight endeavour, it is crucial to be vigilant for the appearance of the five most common cognitive biases. Any of them can distort both our interpretation of and assumptions about the present (the starting point for exploring the future) and the alternative possibilities we generate (the outcomes of the exploration).
The first and most important is confirmation bias: the tendency to favour information that confirms our existing beliefs. It narrows our view and thinking, making our foresight activities shallow and superficial, prone to polarization and disagreement with third parties. It can thus minimize the opportunities to lead alternative futures.
Hindsight bias is the tendency to see events, even random ones, as more predictable than they were; it is commonly called the "I knew it all along" phenomenon. In foresight, it can make our efforts feel predictable and boring, giving the false impression that doing foresight is not worthwhile.
The anchoring bias is the tendency for people to be overly influenced by the first piece of information presented. In foresight, it makes it difficult to scan for new things once a particular finding is on the table and introduced to the team. It limits our ability to see beyond the probable spectrum in the futures cone, eliminating chances of finding plausible or possible alternatives.
The ambiguity effect bias is the tendency to favour a choice with a known outcome rather than "take a chance" on an option with unknown probabilities. In foresight, it makes people reluctant to try or envision new things. It limits the ability to recognize the long-term benefits of "riskier" decisions when measured against the marginal gains resulting from "safer" choices.
And lastly, the bandwagon effect bias is the tendency to place much greater value on decisions that are likely to conform to current trends or please individuals within their existing (or desired) peer group. In foresight, it reduces the exploration and opportunities to lead alternative futures by following trends, policies, or decisions made by competitors or influencers in the industry. It limits the willingness to disagree with the present time.
These five are not the only biases, however. In 2017, Buster Benson conceptualized and categorized a list of 188 cognitive biases, published at Visual Capitalist, an online publication specializing in visual information.
Minimising biases in teams working with foresight
So how can teams overcome biases in foresight projects? It might seem obvious, but the answer is simple: increase the diversity of the team. Not everyone carries the same set of biases, given differing professional, socio-economic, and cultural backgrounds.
Remember, the initial framing phase of any foresight endeavour is about reaching consensus and breaking down barriers, biases, assumptions, and judgements. It can be useful to collaboratively create a mindmap of concepts related to the research topic to expose initial gaps and assumptions.
Cognitive biases are not the whole story, however. Because people are rarely aware of their brain's inner mechanics, confronting different opinions and perspectives can be stressful.
In psychology, cognitive dissonance occurs when a person holds contradictory beliefs, ideas, or values and is typically experienced as psychological stress when they participate in an action that goes against one or more of them.
American psychologist Leon Festinger proposed in 1957 that people experiencing internal inconsistency tend to become psychologically uncomfortable and are motivated to reduce the cognitive dissonance.
In this situation, people adjust their beliefs either by rationalizing or by avoiding circumstances and contradictory information. In foresight, this feature of human behaviour might limit a team's ability to explore beyond the organisation's purpose.
I'm sure you have experienced cognitive dissonance before, as most of us have. But how will it affect your next foresight project? Do you have a strategy to overcome it and use it in the team's favour?