Making Decisions in the Presence of Uncertainty
https://executiveeducation.wharton.upenn.edu/thought-leadership/wharton-at-work/2021/08/decision-making-under-uncertainty/

Maurice Schweitzer, Wharton professor of Operations, Information and Decisions and academic director of Effective Decision Making: Thinking Critically and Rationally, says there needs to be more awareness about dealing with uncertainty. "Leaders constantly make decisions absent complete information and often underappreciate how random and uncertain the world is. When you fail to account for uncertainty appropriately, you can make some serious errors." [1]

Decision theory is concerned with the problem of making decisions. Statistical decision theory addresses decision-making in the presence of statistical knowledge that sheds light on some of the uncertainties involved in the problem.

Decision theory deals with situations where decisions have to be made in the presence of uncertainty, and its goal is to provide a rational framework for dealing with such situations. To make good choices, we must calculate and manage the resulting risks from those choices. Today, we have the tools to perform these calculations.

A few hundred years ago, decision-making in the presence of uncertainty and the resulting risk had only the tools of faith, hope, and guesswork. This is because risk is a numbers game, and before the 17th century our understanding of numbers did not provide the tools needed to make choices in the presence of uncertainty.

A good book about the history of making choices in the presence of uncertainty - risk management - is Against the Gods: The Remarkable Story of Risk by Peter Bernstein. Bernstein traces how these efforts culminated in Bernoulli's focus not on probabilistic events themselves but on the human beings who desire or fear certain outcomes to a greater or lesser degree.

Bernoulli showed how to create mathematical tools to allow anyone to "estimate his prospects from any risky undertaking in light of [his] specific financial circumstances." This is the basis of the Microeconomics of decision-making, in which the opportunity cost of a collection of choices can be assessed by estimating both the cost of a decision and the resulting beneficial outcome or loss.
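
A minimal sketch of Bernoulli's idea in Python, assuming his logarithmic utility of wealth and an invented risky undertaking (all numbers are illustrative, not from the source):

```python
import math

def expected_log_utility(wealth, outcomes):
    """Expected log-utility of final wealth over probabilistic outcomes.

    outcomes: list of (probability, gain_or_loss) pairs whose probabilities sum to 1.
    """
    return sum(p * math.log(wealth + delta) for p, delta in outcomes)

# Hypothetical undertaking: 50% chance of gaining 50, 50% chance of losing 40.
venture = [(0.5, +50.0), (0.5, -40.0)]

for wealth in (100.0, 1_000.0):
    u_take = expected_log_utility(wealth, venture)
    u_pass = math.log(wealth)  # utility of declining the venture
    decision = "take it" if u_take > u_pass else "decline"
    print(f"wealth={wealth:>7.0f}  E[u|venture]={u_take:.4f}  u(no venture)={u_pass:.4f}  -> {decision}")
```

The same gamble is declined at low wealth and accepted at higher wealth - the "specific financial circumstances" doing the work.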

In 1921, Frank Knight distinguished between risk, when the probability of an outcome is possible to calculate — or is knowable — and uncertainty, when the probability of an outcome is not possible to determine — or is unknowable.

This is the distinction that rendered insurance attractive and entrepreneurship tragic. Twenty years later, John von Neumann and Oskar Morgenstern established the foundation of game theory, which deals with situations where people's decisions are influenced by the unknowable decisions of live variables — in the gaming world, this means other people.

Decision-making in the presence of uncertainty is a normal business function and a normal part of the technical development process. The world is full of uncertainty.

Those seeking certainty will never be satisfied. Those conjecturing that decisions can't be made in the presence of uncertainty are woefully misinformed.

Along with all this woefulness is the boneheaded notion that estimating is guessing and that decisions can actually be made in the presence of uncertainty in the absence of estimating.

Here's why. When we are faced with a choice among multiple decisions, each with multiple outcomes, each outcome is probabilistic. If it were not - that is, if we had 100% visibility into the consequences of our decision, the cost involved in making that decision, and the cost or benefit impact from that decision - it's no longer a decision. It's merely picking among several options based on something other than time, money, or benefit.

Buying an ERP system, or funding the development of a new product, or funding the consolidation of the data center in another city is a much different choice process than picking apples. These decisions have uncertainty. Uncertainty of the cost. Uncertainty of the benefits, revenue, savings, increase in reliability, and maintainability. Uncertainty in almost every variable.
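
As a hedged illustration (the probabilities and dollar figures below are invented), here is how such probabilistic choices can be compared on expected cost and benefit; a real decision would also weigh risk tolerance and opportunity cost:

```python
# Compare hypothetical options on expected monetary value (EMV).
# Each option has an uncertain cost and an uncertain benefit, expressed as
# (probability, value) pairs in $M. All numbers are illustrative, not real data.

def emv(scenarios):
    """Expected value of (probability, value) pairs; probabilities sum to 1."""
    return sum(p * v for p, v in scenarios)

options = {
    "Buy ERP system": {
        "cost": [(0.6, 2.0), (0.4, 3.5)],
        "benefit": [(0.5, 4.0), (0.5, 2.5)],
    },
    "Build new product": {
        "cost": [(0.7, 1.5), (0.3, 2.5)],
        "benefit": [(0.4, 5.0), (0.6, 1.0)],
    },
    "Consolidate data center": {
        "cost": [(0.8, 1.0), (0.2, 1.8)],
        "benefit": [(0.9, 1.6), (0.1, 0.5)],
    },
}

for name, option in options.items():
    net = emv(option["benefit"]) - emv(option["cost"])
    print(f"{name:<26} expected net value = {net:+.2f} $M")
```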

Managing in the presence of uncertainty and the resulting risk is called business management. It's also, as Tim Lister puts it, how adults manage projects.

The Presence of Uncertainty Is One of the Most Significant Characteristics of Project Work

Managing in the presence of uncertainty is unavoidable. Ignoring this uncertainty doesn't make it go away - it's still there, even if you ignore it. Uncertainty comes in many forms.

  • Statistical uncertainty - created by aleatory uncertainty from stochastic processes (weather is a good example of stochastic uncertainty; unless you're a deity, you're not going to correct the source of the risk). Only margin can address the outcomes of this uncertainty.
  • Probabilistic uncertainty - incomplete knowledge created by epistemic uncertainty. This lack of knowledge can be reduced with effort, using modeling to understand the range and likelihood of outcomes and to place quantitative bounds on uncertainty in results (see the sketch after this list).
  • Subjective judgment - bias, anchoring, and adjustment.
  • Systematic error - the need for an understanding of the reference model.
  • Temporal variation - instability in the observed and measured system.
  • Inherent stochasticity - instability between and within collaborating system elements.
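
Here is the sketch referenced above - a minimal Monte Carlo cost roll-up in Python, with invented element ranges and sample counts. It shows how modeling places quantitative bounds on an outcome (here, total cost) and how the gap between the P50 and P80 values becomes the margin that protects against the aleatory spread:

```python
import random

random.seed(1)

# Hypothetical work elements, each with (low, most_likely, high) cost in $K.
# The spread within each triple stands in for aleatory variability; the choice
# of the triples themselves is the epistemic part that modeling effort improves.
elements = [(40, 55, 90), (20, 30, 60), (70, 85, 140), (10, 15, 35)]

def total_cost_once():
    # random.triangular(low, high, mode) draws one cost per element.
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in elements)

samples = sorted(total_cost_once() for _ in range(20_000))
p50 = samples[len(samples) // 2]
p80 = samples[int(len(samples) * 0.80)]

print(f"P50 total cost        : {p50:7.1f} $K")
print(f"P80 total cost        : {p80:7.1f} $K")
print(f"margin to protect P80 : {p80 - p50:7.1f} $K")
```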

So Back To the Problem at Hand

If credible decisions are to be made in the presence of uncertainty (aleatory, epistemic, and ontological), then we need the information to address the sources of that uncertainty in the bulleted list above.

This information can be obtained through a variety of means: modeling, sampling, parametric models, past performance, and reference classes. Each of these sources has an inherent uncertainty of its own.
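
For instance, a reference class can be sketched as follows (Python, with invented cost-growth ratios and a hypothetical raw estimate): the observed actual-to-estimate ratios of similar completed projects supply an uplift at a chosen confidence level, and the scatter in those ratios is exactly the inherent uncertainty of this source.

```python
# Reference-class uplift: a minimal sketch with invented historical ratios.
# ratio = actual cost / original estimate, for past projects in the class.
reference_class = [1.05, 1.10, 1.20, 1.25, 1.30, 1.45, 1.60, 1.90]

raw_estimate = 2_400  # $K, the point estimate for the new project (illustrative)

def uplift_at(confidence, ratios):
    """Return the cost-growth ratio at the given confidence level (0..1)."""
    ordered = sorted(ratios)
    index = min(int(confidence * len(ordered)), len(ordered) - 1)
    return ordered[index]

for confidence in (0.50, 0.80):
    factor = uplift_at(confidence, reference_class)
    print(f"P{int(confidence * 100)} estimate: {raw_estimate * factor:,.0f} $K (uplift x{factor:.2f})")
```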

So, in the end, it comes down to this...

To make a credible decision in the presence of uncertainty, we need to estimate the factors that go into that decision.

Aleatory and Epistemic Uncertainty Create Project Risk

All projects operate in the presence of Uncertainty

This uncertainty is unavoidable. One might actually make the case in the Agile paradigm that uncertainty is desirable. Otherwise, how can Emergence add value to the deliverables?

In the presence of this uncertainty, the design, development, and deployment of software must rely on estimates, forecasts, and predictions based on an idealized understanding of what is an unknown (but knowable) future desired outcome. Guiding the work toward that future is based on a Product Roadmap, no matter the development method - be it Agile or Traditional. Without such a Roadmap, the developers and managers of the development effort are just wandering around looking for a solution to an ill-defined problem.

If this future reality is unknowable, you have a bigger problem and are headed for failure. Emergence plays a role in all development processes, but emergence without a goal is called Research, not Development. And Development has specific business goals for breakeven, ROI, IRR, cash flow, and other financial performance measures needed to run the business successfully.

So let's focus on the impacts of uncertainty in the development paradigm and leave the research alone for now.

There are two broad types of uncertainty on all projects, and on software projects, these two types drive very different responses.

  • Uncertainty associated with the natural randomness of the underlying processes of writing software.
  • Uncertainty associated with our model of the real world the software operates in, arising from insufficient or imperfect knowledge of reality.

These two types have fancy names.

  • Aleatory uncertainty.
  • Epistemic uncertainty.

The two types of uncertainty may be combined and analyzed as a total uncertainty or treated separately. In either case, the principles of probability and statistics apply equally.

Aleatory Uncertainty


The alea in aleatory is Latin for a die - the single die the ancients used to gamble.

This means aleatory uncertainty is an Inherent Randomness.

This data-based uncertainty is associated with the inherent variability of the basic information of the real-world development processes.

These uncertainties cannot be reduced; they are just part of the development process. They are irreducible, and the only approach to dealing with them is to have margin: schedule margin, cost margin, and performance margin.

By data-based, we mean the randomness is in the data generated by stochastic processes. A stochastic process refers to a family of random variables indexed against another variable or set of variables. For example, the duration of a work activity is a statistical quantity. That duration can take on many values depending on the underlying model of the work - a narrow range of values for the duration, or a wide range, depending on the underlying processes.
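
As a hedged illustration of that last point (Python, with invented duration ranges), sampling the same activity under a narrow and a wide model shows how the underlying spread drives the schedule margin needed to protect a P80 commitment:

```python
import random

random.seed(7)

def percentile(samples, q):
    """q-th percentile of a list of samples (0 < q < 1), nearest-rank style."""
    ordered = sorted(samples)
    return ordered[min(int(q * len(ordered)), len(ordered) - 1)]

# Two hypothetical models of the same work activity: (low, most_likely, high) in days.
models = {"narrow": (8, 10, 13), "wide": (6, 10, 25)}

for label, (lo, mode, hi) in models.items():
    durations = [random.triangular(lo, hi, mode) for _ in range(20_000)]
    p50 = percentile(durations, 0.50)
    p80 = percentile(durations, 0.80)
    print(f"{label:<6} P50={p50:5.1f}d  P80={p80:5.1f}d  schedule margin={p80 - p50:4.1f}d")
```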

Many software project phenomena or processes of concern to developers contain randomness. The expected outcomes are unpredictable (to some degree). Such phenomena can be characterized by field or experimental data containing significant variability representing the natural randomness of an underlying phenomenon. The observed measurements are different from one experiment (or one observation) to another, even if conducted or measured under identical conditions.

There is a range of measured or observed values in these experimental results, and, within this range, certain values may occur more frequently than others. The variability inherent in this data or information is statistical in nature, and the realization of a specific value (or range of values) involves probability.

This is why measures like velocity are very sporty, since past performance is rarely like future performance in the presence of the aleatory uncertainties (as well as the epistemic uncertainties) of actual project work.

Epistemic Uncertainty

The term epistêmê in Greek means?knowledge.

Image: ΑΡΕΤΗ ΚΕΛΣΟΥ (Arete, or Virtue, of Celsus), from the Library of Celsus at Ephesus - https://tarihvearkeoloji.blogspot.com/2015/07/celsus-kutuphanesi-efes.html

Epistemic uncertainty reflects our lack of knowledge of the processes that influence events.

This lack of knowledge is expressed as a probabilistic assessment of some outcome, usually an event-based outcome.

"There is a 40% chance of rain in the forecast area for tomorrow" is an aleatory uncertainty.

We assign probabilities to events, probabilities to the work activities that create the knowledge needed to assess the uncertainty, and probabilities of the residual uncertainties after our new knowledge has been acquired.

In practice, we can assign a mean or a median value to this uncertainty. That's what the weather forecast does. That 40% chance of rain is usually a mean value. Where we live in Boulder County, when we hear a 40% chance, we know we have a lower probability because of our micro-climate. That weather forecast covers the forecast area and may be much different depending on where you live in that area.

This forecast also includes inaccuracies and imprecisions in the prescribed forms of the probability distributions and in all the parameters of the estimates. This is why forecasting the weather in some parts of the world is a very sporty business. In places like Los Angeles, it's easy - as shown in the movie L.A. Story, where Steve Martin plays the bored weatherman. Here in Colorado, with our mountain weather, making a forecast for a few days from now is likely to be a challenge. As they say, don't like Colorado weather? Wait a few hours, it'll change.

Some Challenges to Managing in the Presence of Uncertainty

The primary issue with all uncertainties is the communication of the accuracy and precision of the risk created by the aleatory and epistemic uncertainty.

  • What is the scope of the uncertainty?
  • What risks does it create to the success of the software development effort?
  • Is the uncertainty time-dependent?
  • At what level of decomposition of the project is the uncertainty applicable?

This is a Risk Communication issue. So let's restate the two forms of uncertainty:

  • Aleatory uncertainty: the uncertainty inherent in a nondeterministic (stochastic, random) phenomenon… is reflected by modeling the phenomenon in terms of a probabilistic model… Aleatory uncertainty cannot be reduced by accumulating more data or additional information.
  • Epistemic uncertainty: the uncertainty attributable to incomplete knowledge about a phenomenon that affects our ability to model it, reflected in ranges of values for parameters, a range of viable models, the level of model detail, multiple expert interpretations, and statistical confidence. The accumulation of additional information can reduce this uncertainty (a minimal sketch of this reduction follows the list).
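
Here is the sketch referenced in the second bullet - a minimal Beta-Binomial example in Python with invented observations. It assumes a pass/fail process with an unknown rate and a uniform prior; the point is only that the posterior spread (the epistemic part) narrows as information accumulates, while individual trials remain random (the aleatory part):

```python
import math

def beta_posterior(successes, trials, prior_a=1.0, prior_b=1.0):
    """Posterior Beta(a, b) after Bernoulli trials, starting from a uniform prior."""
    a = prior_a + successes
    b = prior_b + (trials - successes)
    mean = a / (a + b)
    std = math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    return mean, std

# Hypothetical evidence arriving over time: (successes, trials) observed so far.
for successes, trials in [(3, 5), (12, 20), (55, 100), (230, 400)]:
    mean, std = beta_posterior(successes, trials)
    print(f"after {trials:>3} trials: estimated rate = {mean:.3f} +/- {std:.3f}")
```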

What Does This Mean for Software Development Working in the Presence of Uncertainty?

If you accept that all software development work operates in the presence of Aleatory and Epistemic uncertainty, then ...

No decisions can be made in the presence of these two types of uncertainties without estimating the impact of your decision on the project.

This is a simple, clear, concise principle of managing in the presence of uncertainty. Anyone suggesting that decisions can be made without estimating has to willfully ignore this principle, OR the project is de minimis - meaning it's of no consequence to those paying if the project is late, over budget, or the delivered outcomes don't meet the performance level needed for the project to earn its Value in exchange for the Cost to produce that Value.

For Those Interested in the Underlying Mathematics, Here Are Some Gory Details


In All Cases of Uncertainty, We Need To Estimate

There's no way out of it. We can only make a credible decision of importance with an estimate of the impact of that decision, the cost incurred from making that decision, the potential benefits from that decision, and the opportunity cost of NOT selecting an outcome from a decision.

Anyone suggesting we can make decisions without estimating needs to provide clear, concise, actionable information, with examples, of how this can be done in the presence of the uncertainty created by the underlying statistical processes of project work and the resulting probabilistic outcomes of those processes.

If you have certainty, you don't need to estimate. Measure your empirical performance to date, and use the Most Likely value from that performance and its variance to project future performance. Ignore risk, ignore the naturally occurring (aleatory) and event-based (epistemic) uncertainties in the underlying processes, and spend your customers' money by applying faith, hope, and guesswork, just like before the 17th century.
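
Even that "just measure" path is an estimate. A minimal sketch in Python, with invented velocities and backlog, resamples past performance to project completion; the gap between the P50 and P80 answers is the uncertainty that still has to be estimated and communicated:

```python
import random
import statistics

random.seed(11)

# Hypothetical observed velocities (story points per sprint) to date.
velocities = [21, 34, 18, 27, 30, 22, 25, 19]
backlog = 240  # remaining story points, illustrative only

def sprints_to_finish():
    """Resample past velocities with replacement until the backlog is burned down."""
    remaining, sprints = backlog, 0
    while remaining > 0:
        remaining -= random.choice(velocities)
        sprints += 1
    return sprints

runs = sorted(sprints_to_finish() for _ in range(10_000))
p50 = runs[len(runs) // 2]
p80 = runs[int(len(runs) * 0.80)]

print(f"observed velocity      : {statistics.mean(velocities):.1f} +/- {statistics.stdev(velocities):.1f}")
print(f"sprints to finish, P50 : {p50}")
print(f"sprints to finish, P80 : {p80}")
```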

[1] The New Leadership: Decision Making Under Uncertainty, Wharton Executive Education, August 2021

[2] Understanding Aleatory and Epistemic Parameter Uncertainty in Statistical Models, N. W. Porter and V. A. Mousseau, Sandia National Laboratories, PO Box 5800, Albuquerque, NM 87185-0748
