The Fallacy of the Planning Fallacy

The Planning Fallacy is well documented in many domains. Bent Flyvbjerg reports on it in his book Megaprojects and Risk: An Anatomy of Ambition. But the Planning Fallacy is more complex than the optimism bias alone. Many of the root causes of cost overruns lie in the politics of large projects.

The planning fallacy is ...

...a phenomenon in which predictions about how much time will be needed to complete a future task display an optimistic bias (underestimate the time needed). This phenomenon occurs regardless of the individual's knowledge that past tasks of a similar nature have taken longer to complete than generally planned. The bias only affects predictions about one's own tasks; when outside observers predict task completion times, they show a pessimistic bias, overestimating the time needed. The planning fallacy requires that predictions of current tasks' completion times are more optimistic than the beliefs about past completion times for similar projects and that predictions of the current tasks' completion times are more optimistic than the actual time needed to complete the tasks.

The critical notion here is that the bias applies to one's own estimates. This is the crucial reason for bringing an outside view, reference classes of similar past work, into the estimating process.

With all that said, there is still a large body of evidence that estimating is a major problem.

I have a colleague who is the former Cost Analysis Director of NASA. He gives three reasons projects get into cost, schedule, and technical trouble:

  1. We couldn't know - we're working in a domain where discovery is the norm. We're inventing new physics or developing drugs that have never existed before. We're doing unprecedented development. Most people using the term "we're exploring" likely don't know what they're doing, and those paying are paying for that exploring. Ask yourself whether you're in the education business or the research and development business.
  2. We didn't know - we could have known, but we didn't. We couldn't afford to learn. We didn't have time to know. We were incapable of knowing because we were outside our domain. Would you hire someone who didn't do their homework to provide the solution you're paying for? Probably not. Then why accept "we didn't know" as an excuse?
  3. We don't want to know - we could have known, but if we did, that information might cause the project to be canceled.

The Planning Fallacy

Daniel Kahneman (Princeton) and Amos Tversky (Stanford) describe it as “the tendency to underestimate the time, costs, and risks of future actions and overestimate the benefit of those actions.” The results are time and cost overruns as well as benefit shortfalls. The concept is not new: they coined the term in the 1970s, and much research has taken place since. See the Resources below.

So the challenge is not to fall victim to this optimism bias and become a statistic in the Planning Fallacy.

How do we do that?

Here's our experience:

Start with a credible systems architecture showing the topology of the delivered system:

  • By credible, I mean not a paper drawing on the wall but a SysML description of the system and its components. SysML tools can be had for free, alongside commercial products.
  • Defining the interactions between the components is the critical issue in discovering where the optimism hides. The Big Visible Chart (BVC) generated from SysML needs to hang on the wall so everyone can see where these connections occur.
  • Without this BVC, the optimism goes unchallenged: it's not that complicated, so what could be the issue with our estimates?
  • It's at the interfaces where the project goes wrong. Self-contained components have problems for sure, but once they're connected to other components, this becomes a system of systems, resulting in an N^2 (n-squared) problem (see the sketch after this list).
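
To make the N^2 point concrete, here is a minimal sketch (my illustration, not from the original article) that counts the possible pairwise interfaces among n components and builds a simple N-squared matrix from a made-up connection list. The component names and connections are hypothetical.

```python
def potential_interfaces(n: int) -> int:
    """Number of possible pairwise interfaces among n components: n*(n-1)/2."""
    return n * (n - 1) // 2

def n2_matrix(components, connections):
    """Build a simple N-squared matrix: 1 where two components share an interface."""
    index = {name: i for i, name in enumerate(components)}
    size = len(components)
    matrix = [[0] * size for _ in range(size)]
    for a, b in connections:
        matrix[index[a]][index[b]] = 1
        matrix[index[b]][index[a]] = 1
    return matrix

# Hypothetical components and interfaces, for illustration only.
components = ["GNC", "Power", "Thermal", "Comms", "Payload"]
connections = [("GNC", "Power"), ("GNC", "Comms"), ("Power", "Thermal"),
               ("Comms", "Payload"), ("Power", "Payload")]

print(potential_interfaces(len(components)))  # 10 possible interfaces for 5 components
for name, row in zip(components, n2_matrix(components, connections)):
    print(f"{name:8s}", row)
```

Even this toy system has 10 possible interfaces for 5 components; the count grows quadratically with the number of components, which is exactly where unexamined optimism hides.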

Look for reference classes for the components

  • Has anyone here done this before?
  • No? Do we know anyone who knows anyone who's done this before?
  • Is there no system like this system anywhere in the world?
  • If the answer is NO, we need another approach - we're inventing new physics, and this project is a research project - act appropriately. (When reference classes do exist, a minimal sketch of how to use them follows this list.)
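
One simple way to use a reference class, sketched below under the assumption that you have actual-versus-estimated cost ratios from similar completed projects (the numbers here are invented): instead of taking your single-point estimate at face value, report it at percentiles of the historical overrun distribution.

```python
def reference_class_adjust(point_estimate: float, overrun_ratios, percentiles=(0.5, 0.8)):
    """Adjust a point estimate using actual/estimate ratios from a reference class."""
    ratios = sorted(overrun_ratios)
    adjusted = {}
    for p in percentiles:
        # Simple empirical percentile of the historical overrun ratios.
        idx = min(int(p * len(ratios)), len(ratios) - 1)
        adjusted[f"P{int(p * 100)}"] = point_estimate * ratios[idx]
    return adjusted

# Invented reference class: actual cost / estimated cost for similar past projects.
past_ratios = [1.05, 1.10, 1.20, 1.25, 1.40, 1.60, 1.80, 2.10]
print(reference_class_adjust(1_000_000, past_ratios))
# {'P50': 1400000.0, 'P80': 1800000.0}
```

This is the essence of reference class forecasting: the inside-view estimate is the anchor, and the outside view of similar projects supplies the adjustment.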

Do we have any experience doing this work in the past?

  • If no, why would we get hired to work on this project?
  • If yes, but we've failed in the past:
  • No problem. Did we learn anything?
  • Did we find the Root Cause of the past performance problems and take corrective actions?
  • Did we follow a known process (e.g., Apollo Root Cause Analysis) for Root Cause Analysis and corrective actions?
  • If not, you're being optimistic that the problems won't come back.

Do we have any sense of the Measures of the system that will drive cost?

  • Effectiveness - the operational measures of success, closely related to the achievement of the mission or operational objectives, evaluated in the operational environment under a specific set of conditions.
  • Performance - measures that characterize physical or functional attributes relating to the system's operation, measured or estimated under specific conditions.
  • Key Performance Parameters - capabilities and characteristics so significant that failure to meet them can cause reevaluation, reassessment, or termination of the program.
  • Technical Performance Measures - determine how well a system or system element is satisfying, or is expected to satisfy, a technical requirement or goal.
  • All the ...ilities
  • We need to understand these measures to have a fundamental understanding of where the problems will be and where the natural optimism creeps in.
  • Do we know what technical and programmatic risks will be encountered in this project?
  • Do we have a risk register?
  • Do we know the reducible and irreducible risks to the project's success?
  • Do we have mitigation plans for the reducible risks?
  • For reducible risks without mitigation plans, do we have Management Reserve?
  • For irreducible risks, do we have cost and schedule margin? (A minimal risk-register sketch follows this list.)
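
To show what Management Reserve for unmitigated reducible risks and margin for irreducible risks can look like in numbers, here is a minimal sketch of my own (not the article's method). The risks, probabilities, and impacts are invented, and sizing reserve as probability-weighted exposure is only one of several legitimate choices.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    reducible: bool          # True: event-based risk we can buy down; False: natural variability
    probability: float       # chance the risk occurs (1.0 for ever-present variability)
    cost_impact: float       # cost hit if it occurs
    mitigated: bool = False  # is there a funded mitigation plan?

# Invented risk register, for illustration only.
register = [
    Risk("Vendor slips interface spec", reducible=True,  probability=0.4, cost_impact=250_000, mitigated=True),
    Risk("Test facility unavailable",   reducible=True,  probability=0.2, cost_impact=400_000, mitigated=False),
    Risk("Labor-rate variability",      reducible=False, probability=1.0, cost_impact=150_000),
]

# Reducible risks without mitigation plans are covered by Management Reserve,
# sized here as probability-weighted exposure.
management_reserve = sum(r.probability * r.cost_impact
                         for r in register if r.reducible and not r.mitigated)

# Irreducible (aleatory) risks are covered by cost margin.
cost_margin = sum(r.cost_impact for r in register if not r.reducible)

print(f"Management Reserve: ${management_reserve:,.0f}")  # $80,000
print(f"Cost margin:        ${cost_margin:,.0f}")         # $150,000
```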

Do we have a Plan showing the increasing maturity of the project deliverables?

  • Do we know what Accomplishments must be performed to increase the maturity of the deliverable?
  • Do we know the Criteria for each Accomplishment, so we can measure progress to plan?
  • Have we arranged the work to produce the deliverables in a logical network, or another method like Kanban, that shows the dependencies between the work elements and the deliverables?
  • This notion of dependencies must not be underrated.
  • The Kanban paradigm assumes, upfront, that these dependencies don't exist.
  • Verifying there are NO dependencies is critical to all the processes that are based on having NO dependencies.
  • It seems rare that this verification takes place.
  • This is an Optimism Bias in the agile software development world.
  • Do we have a credible, statistically adjusted cost and schedule model for assessing the impact of any changes?
  • "I'm confident our costs will not be higher than our revenue." Then show me your probabilistic model.
  • No model? We're likely being optimistic and don't even know it.
  • Show Me The Numbers. (A minimal Monte Carlo sketch follows this list.)
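
To show what a credible, statistically adjusted cost and schedule model can look like at its simplest, here is a Monte Carlo sketch (my illustration, far simpler than a commercial tool like Risky Project). Task names, durations, and dependencies are invented: durations are drawn from triangular distributions over a small dependency network, and the output is the probability of finishing by a target date.

```python
import random

# Invented network: task -> ((optimistic, most likely, pessimistic) days, list of predecessors).
tasks = {
    "design":    ((10, 15, 30), []),
    "build":     ((20, 30, 60), ["design"]),
    "integrate": ((10, 15, 40), ["design"]),
    "test":      ((15, 20, 45), ["build", "integrate"]),
}

def finish_times(draw):
    """Earliest finish of each task for one draw of durations (forward pass).
    Assumes tasks are listed so predecessors appear before their successors."""
    finish = {}
    for name, (_, deps) in tasks.items():
        start = max((finish[d] for d in deps), default=0.0)
        finish[name] = start + draw[name]
    return finish

def probability_of_finishing_by(target_days: float, trials: int = 20_000) -> float:
    """Fraction of Monte Carlo trials in which the whole network finishes by target_days."""
    hits = 0
    for _ in range(trials):
        draw = {name: random.triangular(lo, hi, ml)
                for name, ((lo, ml, hi), _) in tasks.items()}
        if max(finish_times(draw).values()) <= target_days:
            hits += 1
    return hits / trials

print(f"P(finish <= 80 days)  = {probability_of_finishing_by(80):.2f}")
print(f"P(finish <= 110 days) = {probability_of_finishing_by(110):.2f}")
```

The point is not the arithmetic; it is that "I'm confident" becomes a number that can be challenged, and the impact of any change can be re-run through the same model.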

So, with these and others, we can remove the fallacy of the Planning Fallacy.

This doesn't mean our project will be successful. Nothing can guarantee that. But the Probability of Success will be increased.

In the end, we MUST know the Mission we are trying to accomplish and the units of measure of that Mission in terms meaningful to the decision makers. We need that to understand what DONE looks like. Without that, only our optimism will carry us along, until it is too late to turn back.


Anyone using the Planning Fallacy as the excuse for project failure, while not planning, not estimating, and not doing their job as a project and business manager, will likely succeed in the quest for project failure and get what they deserve: late, over budget, and a gadget that doesn't work as needed.

Please note that estimating being a problem in all domains is NO reason not to estimate, just as planning being a problem is no reason NOT to plan. Any suggestion that estimating or planning is unnecessary in the presence of an uncertain future - as it is on all projects - willfully ignores the principles of Microeconomics: making choices in the presence of uncertainty based on opportunity cost. To suggest otherwise confirms this ignorance.
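
As a toy illustration of making choices under uncertainty based on opportunity cost (mine, with invented numbers): compare two candidate features whose costs and values are only known as ranges. The comparison cannot be made at all without estimates, and the foregone expected net value of the rejected option is the opportunity cost of the choice.

```python
# Two candidate features with uncertain cost and value ranges (all numbers invented).
options = {
    "Feature A": {"cost": (40_000, 80_000), "value": (100_000, 180_000)},
    "Feature B": {"cost": (30_000, 50_000), "value": (70_000, 150_000)},
}

def expected_net(option):
    """Expected net value using the midpoint of each range (a crude but explicit estimate)."""
    cost = sum(option["cost"]) / 2
    value = sum(option["value"]) / 2
    return value - cost

nets = {name: expected_net(o) for name, o in options.items()}
chosen = max(nets, key=nets.get)
opportunity_cost = max(v for name, v in nets.items() if name != chosen)

print(nets)  # {'Feature A': 80000.0, 'Feature B': 70000.0}
print(f"Choose {chosen}; opportunity cost (foregone alternative) = {opportunity_cost:,.0f}")
```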

Resources

These references provide background on the Planning Fallacy problem from the anchoring-and-adjustment point of view; I've used them over the years to inform our estimating processes for software-intensive systems. After reading through them, I hope you come away with a better understanding of the many misconceptions about estimates and the fallacies of how estimating is done in practice.

Interestingly, there is a poster on Twitter in the #NoEstimates thread who objects when people post links to their own work or the work of others. Please do not fall prey to the notion that everyone has an equally informed opinion unless you yourself have done all the research needed to cover the foundations of the topic. Outside resources are the very lifeblood of informed experience and of the opinions that come from that experience.

  1. Kahneman, Daniel; Tversky, Amos (1979). "Intuitive prediction: Biases and corrective procedures." TIMS Studies in Management Science 12: 313–327.
  2. "Exploring the Planning Fallacy" (PDF). Journal of Personality and Social Psychology. 1994. Retrieved 7 November 2014.
  3. Estimating Software Project Effort Using Analogies.
  4. Cost Estimation of Software Intensive Projects: A Survey of Current Practices.
  5. "If You Don't Want to Be Late, Enumerate: Unpacking Reduces the Planning Fallacy." Journal of Experimental Social Psychology. 15 October 2003. Retrieved 7 November 2014.
  6. A Causal Model for Software Cost Estimating Error, Albert L. Lederer and Jayesh Prasad, IEEE Transactions on Software Engineering, Vol. 24, No. 2, February 1998.
  7. Assuring Software Cost Estimates: Is It An Oxymoron? 2013 46th Hawaii International Conference on System Sciences.
  8. A Framework for the Analysis of Software Cost Estimating Accuracy, ISESE'06, September 21–22, 2006, Rio de Janeiro, Brazil.
  9. "Overcoming the Planning Fallacy Through Willpower." European Journal of Social Psychology. November 2000. Retrieved 22 November 2014.
  10. Buehler, Roger; Griffin, Dale; Ross, Michael (2002). "Inside the planning fallacy: The causes and consequences of optimistic time predictions." In Thomas Gilovich, Dale Griffin, & Daniel Kahneman (Eds.), Heuristics and Biases: The Psychology of Intuitive Judgment, pp. 250–270. Cambridge, UK: Cambridge University Press.
  11. Buehler, Roger; Griffin, Dale; Ross, Michael (1995). "It's about time: Optimistic predictions in work and love." European Review of Social Psychology 6: 1–32. doi:10.1080/14792779343000112.
  12. Lovallo, Dan; Kahneman, Daniel (July 2003). "Delusions of Success: How Optimism Undermines Executives' Decisions." Harvard Business Review: 56–63.
  13. Buehler, Roger; Griffin, Dale; Ross, Michael (1994). "Exploring the 'planning fallacy': Why people underestimate their task completion times." Journal of Personality and Social Psychology 67 (3): 366–381. doi:10.1037/0022-3514.67.3.366.
  14. Buehler, Roger; Griffin, Dale; Peetz, Johanna (2010). "The Planning Fallacy: Cognitive, Motivational, and Social Origins" (PDF). Advances in Experimental Social Psychology (Academic Press) 43: 9.
  15. "Hourglass Is Half Full or Half Empty: Temporal Framing and the Group Planning Fallacy." Group Dynamics: Theory, Research, and Practice. September 2005. Retrieved 22 November 2014.
  16. Pezzo, Stephanie P.; Pezzo, Mark V.; Stone, Eric R. "The social implications of planning: How public predictions bias future plans." Journal of Experimental Social Psychology, 2006, 221–227.
  17. "Underestimating the Duration of Future Events: Memory Incorrectly Used or Memory Bias?" American Psychological Association. September 2005. Retrieved 21 November 2014.
  18. "Focalism: A source of durability bias in affective forecasting." American Psychological Association. May 2000. Retrieved 21 November 2014.
  19. Jones, Larry R.; Euske, Kenneth J. (October 1991). "Strategic misrepresentation in budgeting." Journal of Public Administration Research and Theory (Oxford University Press) 1 (4): 437–460. Retrieved 11 March 2013.
  20. Taleb, Nassim Nicholas (2012). Antifragile: Things That Gain from Disorder. ISBN 9781400067824.
  21. "Allocating time to future tasks: The effect of task segmentation on planning fallacy bias." Memory & Cognition. June 2008. Retrieved 7 November 2014.
  22. "No Light at the End of His Tunnel: Boston's Central Artery/Third Harbor Tunnel Project." Project on Government Oversight. 1 February 1995. Retrieved 7 November 2014.
  23. "Denver International Airport" (PDF). United States General Accounting Office. September 1995. Retrieved 7 November 2014. Lev Virine and Michael Trumper, Project Decisions: The Art and Science, Vienna, VA: Management Concepts, 2008. ISBN 978-1-56726-217-9. Michael and Lev provide the risk management tool we use, Risky Project. Risky Project is a Monte Carlo simulation tool for reducible and irreducible risk, working from probability distribution functions of the uncertainty in the project. By the way, it is an actual MCS tool, not one based on bootstrapping a small number of past samples many times over.
  24. Overcoming the planning fallacy through willpower: Effects of implementation intentions on actual and predicted task-completion times.
  25. Anchoring and Adjustment in Software Estimation, Jorge Aranda and Steve Easterbrook, ESEC-FSE'05, September 5–9, 2005, Lisbon, Portugal.
  26. Anchoring and Adjustment in Software Estimation, Jorge Aranda, Ph.D. Thesis, University of Toronto, 2005.
  27. Anchoring and Adjustment in Software Project Management: An Experimental Investigation, Timothy P. Costello, Naval Postgraduate School, September 1992.
  28. Anchoring Effect, Thomas Mussweiler, Birte Englich, and Fritz Strack.
  29. Anchoring, Non-Standard Preferences: How We Choose by Comparing with a Nearby Reference Point.
  30. Reference points and redistributive preferences: Experimental evidence, Jimmy Charité, Raymond Fisman, and Ilyana Kuziemko.
  31. Anchoring and Adjustment (YouTube), Daniel Kahneman. This anchoring-and-adjustment discussion is critical to how we ask how much, how big, and when.
  32. Anchoring Unbound, Nicholas Epley and Thomas Gilovich.
  33. Assessing Ranges and Possibilities, Decision Analysis for the Professional, Chapter 12, Strategic Decision and Risk Management, Stanford Certificate Program. This book should be mandatory reading for anyone suggesting that decisions can be made in the absence of estimates. They can't, and don't accept claims that they can.
  34. Attention and Effort, Daniel Kahneman, Prentice Hall, The Hebrew University of Jerusalem, 1973.
  35. Availability: A Heuristic for Judging Frequency and Probability, Amos Tversky and Daniel Kahneman.
  36. On the Reality of Cognitive Illusions, Daniel Kahneman, Princeton University, and Amos Tversky, Stanford University.
  37. Efficacy of Bias Awareness in Debiasing Oil and Gas Judgments, Matthew B. Welsh, Steve H. Begg, and Reidar B. Bratvold.
  38. The Framing Effect and Risky Decisions: Examining Cognitive Functions with fMRI, Cleotilde Gonzalez, Jason Dana, Hideya Koshino, and Marcel Just, The Journal of Economic Psychology, 26 (2005), 1–20.
  39. Discussion Note: Review of Tversky & Kahneman (1974): Judgment under Uncertainty: Heuristics and Biases, Micheal Axelsen, UQ Business School, The University of Queensland, Brisbane, Australia.
  40. The Anchoring-and-Adjustment Heuristic: Why the Adjustments Are Insufficient, Nicholas Epley and Thomas Gilovich.
  41. Judgment under Uncertainty: Heuristics and Biases, Amos Tversky and Daniel Kahneman, Science, New Series, Vol. 185, No. 4157 (Sep. 27, 1974), pp. 1124–1131.

This should be enough to get you started and set the stage for rejecting any half-baked ideas about anchoring and adjustment, planning fallacies, claims that there's no need to estimate, and the collection of other cockamamie ideas floating around the web about how to make credible decisions with other people's money.

Adam Cherrill, CMC, PfMP

President at Cherrill Consulting Group

2y

Excellent article Glen. Critical Chain Project Management with feeder buffers and schedule margin help to address this fundamental problem.

Dr. John Malget ARCS MAPM MCMI FSaRS

Capability-based Programme Delivery, System Thinker, Digital Integration Planning, Operating Model Development and Optimisation

2y

Good article Glen Alleman, wanting it to be true is a major contributor to failure.
