Don't look up - 7 lessons for communicating risk & uncertainty

For those who haven’t seen it, “Don’t Look Up” is a movie about how the human race responds when they discover that a large comet is on a collision course with Earth. You can find the movie on Netflix. One aspect that I enjoyed about the movie was how it dealt with risk and uncertainty in decision-making. The following analysis relates to a scene in which the scientists who discovered the comet first met the US President and her advisers. Could the scientists adequately convey the gravity of the situation (pun intended)? There are lessons for people who communicate risk and uncertainty with colleagues, clients, shareholders or boards.

Lesson 1: clearly articulate your main point first

The starting point for the scene was that the scientists were almost certain that the comet would impact the Earth and that the results would be catastrophic. In contrast, the President appeared to have come to the meeting having had only a cursory briefing. She was also clearly distracted by other things – a dodgy Supreme Court nomination among them. For there to be a meeting of the minds, a large gap had to be bridged.

Early in the discussion one of the scientists started describing how they had incorporated uncertainty into their calculations by using ‘average astronomic uncertainty of 0.04 arc seconds’. (Presumably this refers to measurement error when observing an object through a telescope). Shortly after making this reference the scientist was cut off by the President and her Chief of Staff. There was a time to discuss this detailed assessment of uncertainty, and now was not that time; the scientist hadn’t yet conveyed his main point. He needed to first tell the President that this comet was a ‘planet killer’, not to regale her with the finer details of astrophysical calculations.

Perhaps this point was exaggerated in the movie by the scientist being hopelessly devoid of communication skills. However, the need to convey the main point clearly and quickly aligns with decision-making research and with real-world experience, both of which highlight the benefits of ‘layering’ detailed and complex information so that the audience doesn’t get overwhelmed and lost in the detail.

Lesson 2: be conscious of the risk-related messages you unintentionally convey

Having failed at their first attempt to convey the small amount of uncertainty that they perceived in their calculations, the scientists then provided a number of mixed messages. For example, they told the President that the comet would hit 62 miles west of the coast of Chile. In the context of the large distances that the comet still needed to travel, 62 miles sounded quite precise. Could the President safely assume that the comet might hit 70 miles off the coast of Chile, but that it certainly wouldn’t strike New York? In any event, given that they believed that the whole planet would be destroyed, the precise location of impact was academic.

The scientists continued: the comet would create mile-high tsunamis and impact with the force of a billion Hiroshima bombs. In contrast to the impact location, these apparently rounded numbers conveyed fuzziness. Would the tsunamis really be a mile high, or could they be 5 miles, or maybe just 100 feet? As a general rule, precise numbers suggest certainty whereas rounded ones suggest the opposite.

In the real world, these types of inherent messages can create both problems and opportunities. For example, long-term investment projections that are provided to two decimal places could lead investors to under-estimate the risk that they face. In contrast, those same two decimal places in a price negotiation could convey a potentially helpful message that the bidder has undertaken a detailed calculation from which they are not inclined to budge.

Lesson 3: be careful about suggesting that anything is 100% certain

Eventually, the President asked a straightforward question: ‘how certain is this?’ Unfortunately, one of the scientists responded ‘there’s 100% certainty of impact’. This appeared out of character; the same scientist had previously mentioned the arc second uncertainty and clearly had a penchant for sticking to the detail. And, as one of the other characters later noted, ‘scientists never like to say 100%’. Nonetheless, it’s a good reflection of other real-world scenarios in which outcomes are sometimes presented as being certain to occur when they are not. These assessments can be dangerous. As Mark Twain reportedly quipped, ‘it ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so.’ Decision-making research confirms that, as a general rule, the higher your level of subjective certainty, the greater the risk of overconfidence.

In the movie the President was riled at the suggestion of certainty and the scientist was forced to quickly clarify: the chance was not 100%, by their calculations it was actually 99.78%. In real life, beset as it is with its earthly uncertainties and complexities, even this level of confidence would warrant rigorous processes of checks and balances to ensure it wasn’t a gross over-estimate.

Lesson 4: clarify vague risk-related terms

After reflecting on the fact that the comet strike was not completely certain, the President and her advisers considered how best to tell the general population about the comet. One suggestion was to refer to the comet strike as a ‘potentially significant event’. Perhaps not surprisingly, the use of this type of vague language has been shown to be ineffective at communicating risk. Research has demonstrated that people have widely varying views about the probability conveyed by words such as ‘possibly’. They even have different interpretations of seemingly unambiguous words such as ‘always’.

In the movie, one of the scientists protested that the comet impact wasn’t ‘potentially going to happen’; rather, it was going to happen. This reflected one interpretation of the proposed wording, but others were also possible. For example, perhaps the comet strike itself was almost certain, but its effect on the planet was less certain. Viewed through this lens, the point wasn’t that there was some potential for an event to occur where that event would certainly be significant; rather, it was that an event would certainly occur that had some potential to be significant. Furthermore, being able to assess the likelihood of a significant event occurring requires a shared definition of significance. To your typical non-Chilean voter, does Chile being wiped off the map by a giant tsunami count as ‘significant’?

While in the movie obscuring the facts might have been intentional, for anyone who is genuinely trying to have a meaningful conversation about risks, more precise terminology (or quantification) is required. It’s too easy for people to come away with different interpretations of vague terms.

Lesson 5: counter the 3-bucket interpretation of risk

One of the potential problems associated with vague wording and lack of quantification is that it can allow people to default to thinking about risk in terms of three buckets. Rather than conceptualising risk as a probabilistic continuum, a simplistic 3-bucket approach categorises risks into 1) the things that definitely won’t happen, 2) the things that definitely will happen and, 3) in between, all the things that might happen. The result is that, while the difference between a 100% risk of impact and a 99.78% risk might be tiny, the recategorisation from one bucket to another creates a material change in perceived risk. The Chief of Staff’s exclamation (‘so it’s not 100%’) suggested that he had fallen victim to this simplistic thinking.

What could the scientists have done to counteract this effect? Perhaps nothing. Or perhaps they could have articulated the risk in terms of frequencies instead of percentages. What if the scientists had said something like ‘if we had discovered 10,000 comets like this one then 9,978 would impact the Earth’? Frequencies are typically easier to conceptualise and more concrete than percentages, making it easier to think about risk and uncertainty.
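
As a rough illustration of that re-framing, the sketch below (a hypothetical helper, not anything from the movie or a particular library) converts a probability into a natural-frequency statement:

```python
def as_natural_frequency(probability: float, reference_class: int = 10_000) -> str:
    """Express a probability as 'X out of N', which is often easier to grasp than a percentage."""
    hits = round(probability * reference_class)
    return f"{hits:,} out of {reference_class:,}"

# A 99.78% chance of impact, framed as a frequency
print(as_natural_frequency(0.9978))  # -> 9,978 out of 10,000
# ... and the complementary chance of a miss
print(as_natural_frequency(0.0022))  # -> 22 out of 10,000
```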

Lesson 6: see risk from the other person’s perspective

From the President’s perspective, the certainty of impact was lower than the 99.78% chance that the scientists calculated. Much to the scientists’ chagrin, the President assessed the risk as being only 70%. It would be easy to believe that this discounting was simply a deliberate failure on the part of the President to recognise the real risk the scientists had articulated. This might be part of the truth. However, a more sympathetic interpretation is also possible; from the President’s perspective perhaps the real probability was actually closer to 70%.

From the President’s perspective the main risk was not that the telescope measurements were incorrect by 0.04 arc seconds. For her, other risks were more important. How could she know whether these scientists could be relied on? To her, they were ‘just two people that walked in here’. Taking the ‘outside view’, she assessed that the ‘base rate’ of world-is-ending warnings turning out to be true was low (although she didn’t express the decision-making concepts in those terms). Rather, she asked ‘do you know how many “the world is ending” meetings we’ve had over the years?’. They listed a few: economic collapse, loose nukes, car exhaust killing the atmosphere, rogue AI, drought, famine, plague, population growth, hole in the ozone. While the scientists’ inside view suggested that the risk was 99.78%, the outside view suggested a much lower value.

The President would be justified in weighing the outside view in her considerations, and balancing that view against the inside view presented by the scientists. Given that the outside view is often overlooked or under-represented in people’s decision-making, perhaps she should even be applauded for doing so ... at least until she had time to gather more information (at which point she should have moved more toward the scientists’ inside view).
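
For readers who like to see the arithmetic, here is a minimal sketch of how an outside-view base rate might be combined with the scientists’ warning using Bayes’ rule. Every number in it is an illustrative assumption (none comes from the movie); the point is simply that a sceptical prior can pull a near-certain inside view down towards something like the President’s 70%.

```python
def posterior_probability(base_rate: float,
                          p_warning_if_true: float,
                          p_warning_if_false: float) -> float:
    """Bayes' rule: probability the threat is real, given that a warning was raised."""
    true_positives = base_rate * p_warning_if_true
    false_positives = (1 - base_rate) * p_warning_if_false
    return true_positives / (true_positives + false_positives)

# Illustrative assumptions (not from the movie):
# - outside view: roughly 1 in 20 'world is ending' warnings turn out to be real
# - credible scientists would almost always raise the alarm about a real threat
# - false alarms from unvetted visitors arrive about 2% of the time
print(posterior_probability(base_rate=0.05,
                            p_warning_if_true=0.95,
                            p_warning_if_false=0.02))  # ~0.71
```

Under these made-up inputs the blended estimate lands near 70%, close to the President’s figure, even though none of the individual inputs is 70%.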

Lesson 7: establish credibility & engagement

It was evident from the dialogue that the scientists didn’t have much credibility with the President. Presumably, in the real world, the pre-vetting process for people who obtain a 20-minute audience with the President would ordinarily confer some credibility. Not just anyone can meet the President, right? However, in the movie the conversation devolved into discussions about titles and the standing of one university’s astronomy department over another. One of the scientists even quipped that she would provide her SAT scores if that helped.

What might have helped would have been a slick computer simulation like the one used by tech guru Peter Isherwell later in the movie (when describing his approach to the comet). Peter’s presentation was the opposite of the scientists’: intellectually flawed but dripping with engagement and apparent credibility. For example, Peter was able to name-drop a Nobel and Polonsky prize-winning scientist as part of his presentation, while at the same time failing to mention that the Polonsky prize is awarded for ‘creativity and originality in the humanistic disciplines’ and therefore seems to have little relevance to astrophysics or astronomy.

Communicators need to ensure that they create credibility when it is warranted; recipients of that communication need to be able to see through it when it is not.


To receive more articles like this, you can subscribe here.

Simon Russell

Behavioural finance - author, speaker, consultant

2y

On a similar theme, check out my podcast recording with Mike van De Graaf from earlier today, "How to communicate risk - reflections from real life and Hollywood" https://soundcloud.com/simonrussellbfa/how-to-communicate-risk-with-mike-van-de-graaf https://podcasts.apple.com/au/podcast/behavioural-finance-with-simon-russell/id1584079998

Simon Russell

Behavioural finance - author, speaker, consultant

2y

For those interested in this topic, please join us on Friday at 12:30 when Mike van de Graaf and I will discuss lessons from real life and from recent Hollywood movies about how to communicate risk in a way that engages key stakeholders. The discussion will hopefully be both fun and useful and should be relevant for risk managers, as well as for professional investors and corporate decision-makers. You can register here: https://www.dhirubhai.net/events/howtocommunicaterisk-reflection6893708941268787200/about/

Trevor Hunt FAICD

Deputy Chair and Chair of Finance Risk and Audit Committee genU/ Chair Police and Nurses Limited Risk Committee/ Chair AICD WA Regional Forum/ Chair Denmark Futures

2y

Some interesting points in this for risk professionals and relevant to communication around the pandemic. Avoid the rookie errors!

Rebecca Trepezanov

Risk Management | Strategy | Operations

2y

Deborah Knight something for us to check out!

Just one small observation. In your great piece you open with "...For those who haven’t seen it, “Don’t Look Up” is a movie about how the human race responds when they discover that a large comet is on a collision course with Earth." It is not a movie about how the human race responds to the existential threat; it is about how a populist, quasi-democratic government that is controlled by corporate donor money and caught in an election battle with bad public opinion polls responds to an existential crisis. That movie would look quite different if it were made in France, China, Russia, Iran, Peru or any other country. To me, as a non-US citizen, this movie shows the weakness of US civil society and government institutions. I still recommend that everyone watch it.
