Science Communication for Business and Non-Technical Audiences

If you have followed me for long, you will know that in the past I have shared a number of posts about machine learning and causal inference and the important gaps between theory and application.

In this post I am going to talk about another important gap related to communication: how do we communicate the value of our work to a non-technical audience?

We can learn a lot from formal coursework, especially in good applied programs with great professors. But if we are not careful, we can also pick up mental models and habits of thinking that weigh us down, particularly those of us who end up working in very applied business or policy settings. How we deal with these issues is important for career professionals and critical for anyone involved in science communication, whether we are trying to influence business decision makers, policy makers, or consumers and voters.

In this post I want to discuss communicating with intent, paradigm gaps, social harassment costs, and mental accounting.

As stated in The Analytics Lifecycle Toolkit: "no longer is it sufficient to give the technical answer, we must be able to communicate for both influence and change."

Communicating to Business and Non-Technical Audiences - or - The Laffer Curve for Science Communication

For those who plan to translate their science backgrounds to business audiences (like many data scientists coming from scientific backgrounds), what are some strategies for becoming better science communicators? In their book Championing Science: Communicating Your Ideas to Decision Makers, Roger and Amy Aines offer plenty of advice.

Two important themes they discuss are paradigm gaps and intent. Scientists can be extremely efficient communicators through the lens of the paradigms they work in.

A paradigm is all the knowledge a scientist has in their head specific to their field of study and research. Unfortunately, there is a huge gap between this paradigm, with its vocabulary, and what non-technical stakeholders can relate to. Scientists have to meet stakeholders where they are, not where the audience at a conference or research seminar would be. In my experience, different stakeholders and audiences across different industries have different gaps. If you work for a consultancy with external pharma clients, they might have different expectations about statistical rigor than, say, a product manager in a retail setting. Even within the same business or organization, the tactics used to bridge the gap for one set of stakeholders might not work at all for a new set of stakeholders. In other words, know your audience. What do they want, need, or expect? What are their biases? What is their level of analytic or scientific literacy? How risk averse are they? Answering these questions is a great place to start in terms of filling the paradigm gaps and, importantly, speaking with intent.

One challenge discussed by Roger and Amy: "many scientists don't approach conversations or presentations with a real strategic intent in terms of what they are communicating...they don't think in terms of having a message....they need to elevate and think about the point they are trying to make when speaking to decision makers."

As Bryan Caplan states in his book The Myth of the Rational Voter, when it comes to speaking to non-economists and the general public, economists should apply the Laffer curve of learning: "they will retain less if you try to teach them more."

He goes on to discuss that it's not just what we say, but how we position it, especially when dealing with resistance rooted in misinformation, disinformation, and systemic biases:

"irrationality is not a barrier to persuasion, but an invitation to alternative rhetorical techniques...if beliefs are in part consumed for their direct psychological benefits then to compete in the marketplace of ideas, you need to bundle them with the right emotional content."

In Flawless Consulting, Peter Block discusses how we have a tendency to "cling to the fantasy that if our thinking is clear and logical, our wording eloquent, and our convictions solid, the strength of our arguments will carry the day...[but] clear arguments will help but they are not enough. The client will experience doubts and dilemmas that will block commitment." In other words, it is not enough to get the technical details, the facts, or the science right.

In the Science Facts and Fallacies podcast (May 19, 2021) Kevin Folta and Cameron English discuss:

"We spend so much time trying to convince people with scientific principles....it's so important for us to remember what we learn from psychology and sociology (and economics) matters. These are turning out to be the most important sciences in terms of forming a conduit through which good science communication can flow."

Torsten Slok offered great advice in his 2018 discussion with Barry Ritholtz on the Masters in Business podcast about working in the private sector as a PhD economist:

"there is a different sense of urgency and an emphasis on brevity....we offer a service of having a view on what the economy will do, what the markets will do - lots of competition for attention...if you write long winded explanations that say that there is a 50/50 chance that something will happen many customers will not find that very helpful."

So there are a lot of great data science and science communicators out there offering great advice. A big problem is that this advice is often not part of the training that many of those with scientific or technical backgrounds receive, and an even bigger problem is that it is often looked down upon and even punished! I'll explain more below.

The Negative Stigma of Science Communication in the Data Science and Scientific Community

Here is an example you will commonly see on social media when someone is trying their best to communicate effectively in the analytical space (and improve their own communication skills). They might share a post that attempts to describe a complicated statistical concept in layman's terms, only to be rewarded with harassing and trolling comments. Usually these are about how the author failed to capture some particular nuance of the theory, failed to include a statement about certain critical assumptions, or oversimplified the complex thing they were trying to explain in simple terms to begin with. This kind of negative social harassment seems to be par for the course when attempting to communicate statistics and data science on social media platforms like LinkedIn and Twitter.

Similarly, in science communication, academics can be shunned by their peers when attempting popular writing or communication for the general public.

In The Stoic Challenge, author William Irvine discusses Daniel Kahneman's challenges with writing a popular book:

"Kahneman was warned that writing a popular book would cause harm to his professional reputation...professors aren't supposed to write books that normal people can understand."

He describes how, when Kahneman's book Thinking, Fast and Slow made the New York Times best seller list, Kahneman "sheepishly explained to his colleagues that the book's appearance there was a mistake."

In an EconTalk interview, Russ Roberts asks economist Steven Levitt about writing his popular book Freakonomics:

"What was the reaction from your colleagues in the profession...You know, I have a similar route. I'm not as successful as you are, but I've popularized a lot of economics...it was considered somewhat untoward to waste your time speaking to a popular audience."

Levitt responded that the reaction was not so bad, but the fact that Russ had to broach the topic at all is evidence of the toxic culture we might face when doing science communication. The negative stigma associated with good science communication is not limited to economics, the social and behavioral sciences, or academia.

In his Talking Biotech podcast episode Debunking the Disinformation Dozen, scientist and science communicator Kevin Folta discusses his persistent efforts to face down these toxic elements:

"I have always said that communication is such an important part of what we do as scientists but I have colleagues who say you are wasting your time doing this...Folta why are you wasting your time doing a podcast or writing scientific stuff for the public."

Some of this is just bad behavior, some of it is gatekeeping done in the name of upholding the scientific integrity of a field, some of it is an attempt by others to prove their competence to themselves or their peers, and maybe some of it is the result of people genuinely trying to provide peer review to colleagues they think have gone astray. But most of it is unhelpful when it comes to influencing decision makers or improving general scientific literacy. It doesn't matter how great the discovery or how impactful the findings: we have all seen from the pandemic that effective science communication is critical for overcoming the effects of misinformation and disinformation. A culture that is toxic toward effective science communication becomes an impediment to science itself and leaves a void waiting to be filled by science deniers, activists, policy makers, decision makers, and special interests.

This can be even more challenging when you add the Dunning-Kruger effect to the equation. Those who know the least may be the most vocal, while scientists and those with expertise sit on the sidelines. As Bryan Caplan states in his book The Myth of the Rational Voter:

"There are two kinds of errors to avoid. Hubris is one, self abasement is the other. The first leads experts to over reach themselves; the second leads experts to stand idly by while error reigns."

How Do Culture and Mental Accounting Impact Science Communication?

As I've written above, there is something of a toxic culture in the scientific community that inhibits good science communication. In the Two Psychologists Four Beers podcast (warning: the intro of this episode contains vulgarity), behavioral scientist Nick Hobson makes an interesting comparison between MBAs and scientists.

"as scientists we need to be humble with regards to our data...one thing we are learning from our current woes of replication (the replication crisis) is we know a lot less than we think. This has conditioned us to be more humble....vs. business school people that are trained to be more assertive and confident."

I'd like to propose an analogy relating to mental accounting. It seems like when scientists get their degrees, they come with a mental account called scientific credibility. MBAs, in comparison, don't have a mental account called scientific credibility. They aren't long on academic credibility, so they don't need to put on the communication hedges the way scientists often do. They come off as more confident and better communicators, while scientists risk being stereotyped as unable to communicate effectively.

To protect their integrity and avoid social harassment from their peers, scientists may tend to speak with caveats, hedges, and qualifications. This may also mean a delayed or confusing response. In many cases, before even thinking about communicating results, they need to do in-depth, rigorous analysis, sensitivity checks, and so on. It requires doing science, which is by nature slow, while the public wants answers fast. Faster answers mean less time for analysis, which calls for even more caveats. All of this can be detrimental to effective communication with non-technical audiences. Answers become either too slow or too vague to support decision making (recall Torsten Slok's comments above). It gives the impression of a lack of confidence and relevance and reinforces a stereotype that technical people (economists, scientists, data scientists, etc.) fail to offer definitive or practical conclusions. As Bryan Caplan notes, discussing the role of economists in The Myth of the Rational Voter:

"when the media spotlight gives other experts a few seconds to speak their mind, they usually strive to forcefully communicate one or two simplified conclusions....but economists are reluctant to use this strategy. Though the forum demands it they think it unseemly to express a definitive judgement. This is a recipe for being utterly ignored."

Students graduating from highly technical programs may inherit these mental accounts and learn these 'hedging strategies' from their professors, their programs, and the seminar culture that comes with them.

Again, Nick Hobson offers great insight about how to deal with this kind of mental accounting in his own work:

"what I've wrestled with as I've grown the business is maintaining scientific integrity and the rigor but knowing you have to sacrifice some of it....you have to find and strike a balance between being data driven and humble while also being confident and strategic and cautious about the shortcuts you take."

In Thinking, Fast and Slow, Kahneman argues that new leaders can sometimes produce better results because fresh thinkers can view problems without the mental accounts holding back incumbents. The solution isn't to abandon scientific training and the value it brings to the table in terms of rigor and statistical and causal reasoning. The solution is to learn how to view problems in a way that avoids the kind of mental accounting I have been discussing. This also calls for cultural change in the educational system. As Kevin Folta stated in the Talking Biotech podcast episode mentioned above:

"Until we have a change in how the universities and how the scientific establishment sees these efforts as positive and helpful and counts toward tenure and promotion I don't think you are going to see people jump in on this."?

Note: A prior version of this article, including links to more references, appeared on Econometric Sense: https://econometricsense.blogspot.com/2021/06/science-communication-for-business-and.html

References:

Championing Science: Communicating Your Ideas to Decision Makers. Roger D. Aines and Amy L. Aines. University of California Press. 2019.

The Myth of the Rational Voter: Why Democracies Choose Bad Policies. Bryan Caplan. Princeton University Press. 2007.

Flawless Consulting: A Guide to Getting Your Expertise Used. Peter Block. 3rd ed.

The Stoic Challenge: A Philosopher's Guide to Becoming Tougher, Calmer, and More Resilient. William B. Irvine. W. W. Norton & Company. 2019.

The Analytics Lifecycle Toolkit: A Practical Guide for an Effective Analytics Capability. Gregory S. Nelson. 2018.
