How Can Startups Overcome (Debias) Delusional Optimism?

Introduction

No matter how detailed, the business scenarios used in planning are generally inadequate (Lovallo & Kahneman, 2003).

Because optimism and overconfidence are part of human nature, professionals across many fields fall prey to optimistic overconfidence. There is nothing terrible about optimism per se. However, multiple lines of evidence (Sharot, 2011; Lovallo & Kahneman, 2003) suggest that we tend to overestimate benefits and underestimate costs, leading business founders, executives, and managers to chase initiatives that, most of the time, have a high likelihood of failure. For example, 81% of entrepreneurs rate their odds of success at 7 out of 10 or higher (Cooper et al., 1988), yet data from Finder (2023) indicate that 62% of newly incorporated businesses fail within five years. Other evidence suggests that around 80% of startups fail within 18 months (Wagner, 2013), a figure that itself contrasts sharply with data from the U.S. Bureau of Labor Statistics (2016) showing a 56% failure rate by the end of the fifth year. Whichever figure is closest to the truth, the gap between entrepreneurs' beliefs and reality suggests that overoptimism is widespread among business leaders. However, optimism bias is not the only bias that causes business founders, executives, and managers to fail. To that end, this essay targets business-oriented biases and proposes behavioural interventions that leverage the MINDSPACE framework (BIT, 2010), thus enhancing decision-making processes and improving the likelihood of companies' success.

The Impact of Delusional Optimism on Business Decision-Making and Outcomes

In the business context, multiple studies (Lovallo & Kahneman, 2003; Lovallo & Sibony, 2010; Roessler et al., 2019; Hodgkinson et al., 2023) have identified the cognitive biases with the most significant impact on business decisions. Optimism bias is just the tip of the iceberg, further magnified by related biases such as groupthink, the sunk-cost fallacy, and other business-oriented biases. This section focuses on these biases; many other cognitive biases relate to business decision-making but fall outside the scope of this essay.

1) Optimism bias. It manifests in many ways.

(Image source: The Decision Lab)

For example, multiple studies tap into individual cognition errors around overoptimism and overconfidence. In a famous study, researchers asked car drivers whether they considered themselves better than average: a whopping 93% believed they were more skilful than their fellow drivers (Svenson, 1981). A survey of 300 investment fund managers revealed that 74% considered themselves better than average at their jobs, and the remaining 26% believed they were at least average (Montier, 2006). Among university professors, over 90% believed they were better-than-average instructors, with two-thirds placing themselves in the top 25% (Russo & Schoemaker, 1992). A recent study of law students' academic predictions revealed that 95% believed they would finish in at least the top half of the class (Barder & Robbennolt, 2023). In a business context, optimism bias leads to missed deadlines and budget overruns (McCray et al., 2002; Pickrell, 1992). It is further magnified by the planning fallacy (Kahneman & Tversky, 1979): our tendency to underestimate the time a task will take, as well as its costs and risks. The leading cause of the planning fallacy is that we fail to consider our past experience with similar tasks or projects and instead take an inside view - the tendency to focus on our specific circumstances and search for answers in our own experiences and beliefs (Kahneman, 2011).
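One practical antidote to the inside view is to compare a fresh estimate against the distribution of outcomes from similar past work (Kahneman's "outside view"). The sketch below is purely illustrative: the project durations and the `outside_view_estimate` helper are hypothetical, not drawn from any cited study.

```python
from statistics import median, quantiles

def outside_view_estimate(past_durations, inside_estimate):
    """Contrast an 'inside view' estimate with the distribution of
    durations from comparable past projects (the 'outside view')."""
    p50 = median(past_durations)              # typical past duration
    p80 = quantiles(past_durations, n=10)[7]  # 80th percentile, a cautious bound
    return {
        "inside_view_days": inside_estimate,
        "historical_median_days": p50,
        "historical_p80_days": p80,
        # how far below history the team's own estimate sits
        "optimism_gap_days": p50 - inside_estimate,
    }

# Hypothetical durations (in days) of six comparable past projects
past = [30, 42, 55, 60, 75, 90]
report = outside_view_estimate(past, inside_estimate=35)
```

A positive `optimism_gap_days` signals that the team's estimate sits well below what history suggests, which is exactly the pattern the planning fallacy predicts.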

2) Groupthink bias. It is the tendency to be influenced by others' opinions when operating within a group. Humans are inherently "social animals".

(Image source: The Decision Lab)

Throughout human history, we have always relied on groups to make critical decisions. Yet groups tend to reach consensus without critical reasoning or consideration of possible consequences and alternatives, which can sometimes cause dramatic failures (Janis, 1972). On 28 January 1986, the space shuttle Challenger exploded 73 seconds after launch. The investigation concluded that a series of poor decisions led to the deaths of all the astronauts aboard. The day before the launch, an engineer warned NASA's flight managers that the O-ring seals on the booster rockets would most likely fail at the temperature forecast for that morning: the O-rings were not designed for temperatures below 53°F (11°C). NASA personnel ignored the scientific facts presented by the engineer, an expert in the field, and became victims of groupthink (Schwartz & Wald, 2003). In a business context, groupthink often causes employees and managers to overlook potential problems in pursuit of consensus. Someone holding a pessimistic opinion would often rather suppress it in a group discussion than challenge the status quo, especially when the majority of the group falls back on consensus thinking to avoid conflict with peers or managers.

When pessimistic opinions are suppressed, while optimistic ones are rewarded, an organisation’s ability to think critically is undermined. (Lovallo & Kahneman, 2003).

On the other hand, groupthink is not always a bad thing. It allows groups and businesses to decide fast, especially in competitive environments where it can be better to make a call quickly and then iterate on the feedback and signals collected from the market, fostering a data-informed decision-making process.

3) Sunk-cost fallacy. It is the tendency to weigh historical costs, which are not recoverable, when considering future courses of action - in other words, to keep investing in a failing plan (Dijkstra & Hong, 2019).

(Image source: The Decision Lab)

In a nutshell, the sunk-cost fallacy causes decision-makers to throw good money after bad, failing to recognise the point at which further investment in a project is no longer justifiable. The result is budget overruns and continued investment in projects whose costs cannot be recovered (Arkes & Blumer, 1985).

The following section addresses the biases above through the lens of the MINDSPACE framework to reduce their impact on the day-to-day decision-making of business owners, managers, and teams.

How Do We Outsmart Biases? Boosting Business and Project Success with Behavioral Interventions.

Multiple lines of evidence (Sellier et al., 2019; Jolls & Sunstein, 2006) suggest that debiasing interventions can improve people's decision-making across different fields, from business settings to private life. A systematic review of debiasing health-related judgments and decision-making (Ludolph & Schulz, 2018) found that almost 70% of the interventions studied were entirely or partially successful. The usefulness of debiasing is still debated, but the fact that most reviewed interventions were effective points to its utility; in particular, technological strategies offer a novel opportunity to pursue debiasing outside the lab. Combining debiasing strategies with the MINDSPACE framework could help reduce the impact of the cognitive biases that cause businesses and projects to fail. This section recommends behavioural interventions to combat optimism bias, groupthink, and the sunk-cost fallacy.

Overcome Optimism Bias

Building on Schwartz and Wilde's (1983, pp. 1437-38) observation about the role of availability, Jolls and Sunstein (2006) suggest that one response to optimistically biased individuals is debiasing through the availability heuristic and anchoring. The availability heuristic is a mental shortcut that relies on the immediate examples that come to mind when evaluating a topic or decision (Tversky & Kahneman, 1973); this makes availability a promising strategy for debiasing overly optimistic people. Anchoring is our tendency to rely heavily on the first piece of information (the anchor) when judging or making decisions.

In a business context, one way to apply this strategy is to keep relevant data salient - always in front of those who make decisions. This requires firms to collect the data and make it easily accessible.

Within technology product development organisations, engineering teams frequently estimate the effort needed to complete projects. By gathering and highlighting historical data from previously completed projects - especially at the outset of new initiatives - teams can calibrate the expectations of overly optimistic individuals towards greater accuracy. Presenting this historical data can also act as an anchor, aligning initial estimates more closely with past outcomes. This can be facilitated by deploying plugins that integrate with the planning and progress-tracking tools these teams already use (e.g. Jira, GitHub, GitLab, Slack). Alternatively, an always-on digital dashboard in front of the team could yield a similar outcome. The optimal approach, however, is to introduce such interventions at the right moment - when individuals are faced with choices and need to make decisions. Theoretically, integrating such a solution into the team's workflow should lead to improved estimates.
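As a toy illustration of how such a plugin might surface an anchor at estimation time, the snippet below computes the average overrun ratio across (hypothetical) past projects and rephrases a new estimate against it. The `PastProject` record, the project names, and the message format are assumptions for the sketch, not an existing Jira or Slack API.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class PastProject:
    name: str
    estimated_days: int  # original team estimate
    actual_days: int     # what it really took

def anchor_message(history, new_estimate_days):
    """Build the anchoring prompt a planning-tool plugin could surface
    when a team enters a fresh estimate."""
    # Average ratio of actual to estimated duration across past projects
    overrun = mean(p.actual_days / p.estimated_days for p in history)
    anchored = round(new_estimate_days * overrun)
    return (f"Past projects ran {overrun:.0%} of their original estimate "
            f"on average; {new_estimate_days} days may be closer to {anchored}.")

# Hypothetical delivery history for this team
history = [
    PastProject("checkout-v2", estimated_days=20, actual_days=30),
    PastProject("search-revamp", estimated_days=40, actual_days=50),
    PastProject("mobile-login", estimated_days=20, actual_days=35),
]
msg = anchor_message(history, new_estimate_days=30)
print(msg)
```

Shown at the moment an estimate is typed in, such a message acts as both availability (past outcomes brought to mind) and an anchor (a concrete adjusted number).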


Overcome Groupthink

There are not many studies of evidence-based interventions to combat groupthink (Hart, 1991; HM Government, 2016; Sunstein & Hastie, 2014; Sims, 1992). However, the theoretical background is solid enough to test in the field. The main objective in group decision-making is to ensure that group members aggregate the data they hold, filter out faulty signals and "noise" in that data, and face no reputational pressure. Sunstein and Hastie (2014) introduce seven strategies and argue for their effectiveness. I focus on two: appointing a Devil's Advocate and establishing a contrarian team ("red teaming"). The first could be more effective for a smaller subset of a larger group - a team - whereas the latter could be more effective for a large group - an organisation.

  1. “Devil’s Advocate” as default. The Devil's Advocate method can presumably be effective at a tactical level. It is employed within teams - where trust is an established precondition - particularly for "local" decisions that affect the team's success (e.g., at the kick-off of a new initiative). A team member is appointed to act as the Devil's Advocate, holding and expressing a well-reasoned position that challenges the group's beliefs. Hence, it is proposed to adopt this strategy as part of the team's ceremonies by codifying it in the team's working agreements.
  2. “Red teaming” as a commitment. Related to the Devil's Advocate but shown to be more effective (Sunstein & Hastie, 2014), red teaming could work better at a larger scale, when the impact of a decision goes beyond the team and could affect a whole department or the entire organisation - for example, a new strategy the organisation is about to commit to. Similar to the Devil's Advocate, a group of people is appointed to act as a team whose task is to construct the strongest possible case against a proposal or plan. To prevent conflict between the groups, I would strongly recommend anonymising the process, much like academic double-blind peer review, where authors do not know the reviewers and the reviewers do not know the authors. This anonymity lets reviewers give honest feedback and lets authors take that feedback without attributing it to a specific member of the organisation.


Overcome Sunk-Cost Fallacy

As with groupthink, there are not many empirical studies testing behavioural interventions against the sunk-cost fallacy. The fallacy may arise from loss aversion (Kahneman & Tversky, 1979): the pain of losing is psychologically about twice as powerful as the pleasure of gaining, so most of the time we avoid losses rather than seek equivalent gains. In a business environment, we often keep investing in projects that are preconditioned to fail. Even when we realise this, we tend to stay passive and avoid taking the necessary actions, because acting would confirm our failure - and the pain of that failure is roughly twice the joy of an equivalent gain.

The pain of losing is psychologically about twice as powerful as the pleasure of gaining. (Kahneman & Tversky, 1979)

I propose a behavioural intervention that implements an automated cost-benefit analysis and recommendation engine.

This engine would provide a potential forecast for reallocating the budget to other ongoing projects or new initiatives when a certain threshold is reached. For example, if a P1 project's ROI is less than "x%" over a period of "y" months, the system should trigger the following actions:

  1. Alerting the leadership about the project that has gone off track.
  2. Providing an alternative scenario for resource allocation to other initiatives.
  3. Taking action: freezing the initial project that is clearly underperforming and reallocating resources.

Such a system can offer salient and clear signals to the organisation's leadership about projects suffering from the sunk-cost fallacy. A comprehensive analysis like this could run automatically every quarter before the organisation plans for the next quarter. This would ensure that the leadership is constantly aware of the performance of projects under development or maintenance.
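The quarterly threshold check described above could be sketched roughly as follows. The ROI formula, the thresholds, and the project figures are hypothetical placeholders for whatever metrics an organisation actually tracks, and the recommendation text stands in for the alert-and-reallocate workflow.

```python
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    invested: float      # total spend to date
    returns: float       # value generated to date
    months_running: int

def sunk_cost_review(projects, roi_threshold=0.10, min_months=6):
    """Flag projects whose ROI has stayed below the threshold for long
    enough to warrant a freeze-and-reallocate recommendation."""
    actions = []
    for p in projects:
        roi = (p.returns - p.invested) / p.invested
        if p.months_running >= min_months and roi < roi_threshold:
            actions.append({
                "project": p.name,
                "roi": round(roi, 2),
                "recommendation": "alert leadership; freeze and reallocate budget",
            })
    return actions

# Hypothetical portfolio reviewed before quarterly planning
portfolio = [
    Project("legacy-migration", invested=500_000, returns=400_000, months_running=9),
    Project("new-checkout", invested=200_000, returns=260_000, months_running=7),
]
flagged = sunk_cost_review(portfolio)
```

Here only the underperforming project is flagged; surfacing the computed ROI alongside the recommendation keeps the signal salient, which is the MINDSPACE lever this intervention relies on.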


Conclusion

Leo Tolstoy begins "Anna Karenina" with one of the most famous opening lines:

Happy families are all alike; every unhappy family is unhappy in its own way.

This essay's exploration of the complexities of business decision-making - particularly the delusional optimism at the root of many of the cognitive biases described here - echoes a similar sentiment. Just as every unhappy family faces its own unique challenges, each business and its leaders struggle with their own set of biases and decision-making pitfalls. At the same time, behavioural science has uncovered that, despite our differences, we have one thing in common: our decisions and judgments are heavily influenced by a shared set of cognitive biases and heuristics. This knowledge unlocks the opportunity to design behavioural interventions with frameworks such as MINDSPACE. It is important to note that the interventions proposed in this essay, while theoretically sound, lack empirical evidence and therefore offer theoretical insight rather than proven practical recommendations. To learn what works and what does not, the next step is to test them in experiments, both in the lab and in real-life settings.


References

1. Lovallo, D., & Kahneman, D. (2003). Delusions of success: How optimism undermines executives' Decisions. Harvard Business Review. https://bit.ly/3LTyvWO

2. Sharot, T. (2011). The optimism bias. Current Biology, Elsevier. https://doi.org/10.1016/j.cub.2011.10.030.

3. Cooper, A. C., Woo, C. Y., & Dunkelberg, W. C. (1988). Entrepreneurs’ perceived chances for success. Journal of Business Venturing, 3(2), 97–108. https://doi.org/10.1016/0883-9026(88)90020-1

4. Finder. (2023). UK small business statistics 2023. Finder.com. https://www.finder.com/uk/small-business-statistics

5. Wagner, E. T. (2013). Five reasons 8 out of 10 businesses fail. Forbes. https://www.forbes.com/sites/ericwagner/2013/09/12/five-reasons-8-out-of-10-businesses-fail/?sh=6f5dfe206978

6. U.S. Bureau of Labor Statistics. (2016). Entrepreneurship and the U.S. Economy. https://www.bls.gov/bdm/entrepreneurship/bdm_chart3.htm

7. Behavioural Insights Team. (2010). MINDSPACE. https://bit.ly/3RQC3ga

8. Lovallo, D., & Sibony, O. (2010). The case for behavioural strategy. Journal of Direct, Data and Digital Marketing Practice, 12(1), 98–.

9. Roessler, M., Velamuri, V. K., & Schneckenberg, D. (2019). Corporate entrepreneurship initiatives: Antagonizing cognitive biases in business model design. R&D Management, 49(4), 509–533. https://doi.org/10.1111/radm.12340

10. Hodgkinson, G. P., Burkhard, B., Foss, N. J., Grichnik, D., Sarala, R. M., Tang, Y., & Van Essen, M. (2023). The heuristics and biases of top managers: Past, present, and future. Journal of Management Studies, 60(5), 1033–1063. https://doi.org/10.1111/joms.12937

11. Svenson, O. (1981). Are we all less risky and more skillful than our fellow drivers? Acta Psychologica, 47(2), 143–148. https://doi.org/10.1016/0001-6918(81)90005-6

12. Montier, J. (2006, February 2). Behaving badly. SSRN. https://ssrn.com/abstract=890563

13. Russo, J. E., & Schoemaker, P. J. H. (1992). Managing overconfidence. Sloan Management Review, 33(2), 7.

14. Barder, S., & Robbennolt, J. K. (2023). Optimistic overconfidence: A study of law student academic predictions. University of Illinois Law Review Online, 2023. University of Illinois College of Law Legal Studies Research Paper No. 23-16. https://ssrn.com/abstract=4524497

15. McCray, G. E., Purvis, R. L., & McCray, C. G. (2002). Project management under uncertainty: The impact of heuristics and biases. Project Management Journal, 33(1), 39–61.

16. Kahneman, D., & Tversky, A. (1979). Intuitive prediction: Biases and corrective procedures. TIMS Studies in Management Science, 12, 313–327.

17. Janis, I. L. (1972). Victims of groupthink: A psychological study of foreign-policy decisions and fiascoes. Houghton Mifflin.

18. Schwartz, J., & Wald, M. L. (2003, March 9). The nation: NASA's curse; Groupthink is 30 years old and still going strong. The New York Times. https://www.nytimes.com/2003/03/09/weekinreview/the-nation-nasa-s-curse-groupthink-is-30-years-old-and-still-going-strong.html

19. Dijkstra, K. A., & Hong, Y. Y. (2019). The feeling of throwing good money after bad: The role of affective reaction in the sunk-cost fallacy. PLoS ONE, 14(1), e0209900. https://doi.org/10.1371/journal.pone.0209900

20. Arkes, H. R., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35(1), 124–140. https://doi.org/10.1016/0749-5978(85)90049-4

21. Sellier, A.-L., Scopelliti, I., & Morewedge, C. K. (2019). Debiasing training improves decision making in the field. Psychological Science, 30(9), 1371–1379. https://doi.org/10.1177/0956797619861429

22. Ludolph, R., & Schulz, P. J. (2018). Debiasing Health-Related Judgments and Decision Making: A Systematic Review. Medical Decision Making, 38(1), 3–13. https://doi.org/10.1177/0272989X17716672

23. Schwartz, A., & Wilde, L. L. (1983). Imperfect Information in Markets for Contract Terms: The Examples of Warranties and Security Interests. Virginia Law Review, 69(8), 1387–1485. https://doi.org/10.2307/1072774

24. Jolls, C., & Sunstein, C. R. (2006). Debiasing through Law. The Journal of Legal Studies, 35(1), 199–242. https://doi.org/10.1086/500096

25. Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232. https://doi.org/10.1016/0010-0285(73)90033-9

26. ’t Hart, P. (1991). Irving L. Janis’ Victims of Groupthink. Political Psychology, 12(2), 247–278. https://doi.org/10.2307/3791464

27. HM Government. (2016). Understanding the behavioural drivers of organisational decision-making: Rapid evidence assessment. HM Government, London.

28. Sunstein, C. R., & Hastie, R. (2014). Making dumb groups smarter. Harvard Business Review. Retrieved from https://hbr.org/2014/12/making-dumb-groups-smarter

29. Sims, R. R. (1992). Linking Groupthink to Unethical Behavior in Organizations. Journal of Business Ethics, 11(9), 651–662. https://www.jstor.org/stable/25072319

30. Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263–291. https://doi.org/10.2307/1914185



