Op-Ed: Put Friction on Liars, Not Just Their Lies

Image Credit: Saturday Evening Post https://www.saturdayeveningpost.com/2018/05/8-historys-destructive-lies/

Author: Marshall Van Alstyne

If truth is the first casualty of war, then lies should not surprise us when politics become a battlefield. The year 2020 is a low point in the integrity of our political discourse. Partisans cannot hear one another. If they listen, it is not to comprehend but to cut down. When they speak, it is not to communicate but to conquer. Fact checks bear no weight: liars only shift tactics. Rather than support their “alternative” facts, they attack the credibility of those who would check them. Unchecked, their new lies compound the old. A miasma of misinformation afflicts America, Brazil, England, France, India, and Venezuela among others. Well-functioning societies depend on honest information. Facts must flow freely, available to all, uncluttered by noise, with means of validation.

What is to be done? If solutions were easy, we would have them. They are not. To begin our conversation, I propose one.

Platforms must put social friction on liars and not just their lies.

One clear cause of our information pollution problem is that lies spread “farther, faster, deeper and more broadly than truth in all categories of information.”[1] Platforms spread a wildfire of lies in order to build businesses based on engagement. If the goal is to attract eyeballs, then flames will do the trick.

Platforms must stop promoting blowhards in the name of newsworthiness. A Cornell study of 38 million articles found that the single greatest source of coronavirus misinformation – including that disinfectant is a cure, that an anti-malaria drug is a cure,[2] and that masks are not effective at reducing viral spread – was the US president.[3] False claims undermining the integrity of the 2020 U.S. election reached the point of rebuke from members of the president's own party.[4]

Partisanship is not the problem when we ask our institutions to reject falsehoods that contradict basic science. Yet, again, we should not be surprised that liars, as liars, complain of bias when they cannot spread lies. Their argument is disingenuous and self-sealing: because a liar can always cry "Bias!" regardless of the filter used, the charge implies that no lie should ever be policed. Laws already reject this argument in cases of fraud, libel, incitement to violence, and voting disinformation. If laws reject it, platforms should too.

An easy way to implement a "social friction" policy is to selectively reverse platform amplification. Platforms announce publicly that anyone caught lying will have their social networks trimmed and their messages delayed. Does a person have 100,000 followers? Following a lie, it becomes 50,000, and messages go out next week. Shaming should be part of the solution. Penalties can lapse with good behavior but double with bad behavior: lying again means 25,000 followers and messages that go out every other week. Liars can still say what they wish, even to the point of lying, but followers would then need to go looking for the misinformation rather than having the platform promote it.
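To make the escalation concrete, here is a minimal Python sketch of the penalty schedule just described. The names (Account, apply_strike, clear_strike) and the one-week base delay are hypothetical, not any real platform's API; the halving of reach and doubling of delay follow the examples above.

```python
# Hypothetical sketch of the escalating "social friction" penalty.
from dataclasses import dataclass
from datetime import timedelta


@dataclass
class Account:
    handle: str
    amplified_reach: int              # followers the platform delivers to
    message_delay: timedelta = timedelta(0)
    strikes: int = 0


def apply_strike(account: Account) -> None:
    """Each verified lie halves amplified reach and doubles the delay."""
    account.strikes += 1
    account.amplified_reach //= 2
    account.message_delay = timedelta(weeks=1) * 2 ** (account.strikes - 1)


def clear_strike(account: Account) -> None:
    """Sustained good behavior walks the penalty back: it is temporary."""
    if account.strikes == 0:
        return
    account.strikes -= 1
    account.amplified_reach *= 2
    account.message_delay = (timedelta(weeks=1) * 2 ** (account.strikes - 1)
                             if account.strikes else timedelta(0))


# Example from the text: 100,000 followers, caught lying twice.
acct = Account("blowhard", amplified_reach=100_000)
apply_strike(acct)   # 50,000 followers, messages delayed one week
apply_strike(acct)   # 25,000 followers, messages delayed two weeks
```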

This has three benefits. First, social friction directly addresses the problem that lies spread faster, farther, and more broadly than truth. It filters pollution at the source, reducing and delaying the channels through which lies spread. Second, social friction motivates liars to change their behavior. If the goal of a blowhard or ideologue is to attract attention or to move an audience, what better motivator is there than limiting audience access? Labeling and deleting do not work: unmotivated by truth and integrity, liars simply rephrase and repost. Undeterred and unpenalized, ideologues with large networks volley and amplify each other's false claims.[5] When social friction applies to liars, ideologues shrink their own audiences every time they lie. Networks of echo chambers that willfully propagate lies then take themselves down. What we have needed is a mechanism that weeds out untruths faster than truths; until now, we have had the opposite. Going forward, liars render themselves impotent.

This highlights a third benefit: the policy places the burden on the proper source, the liar, rather than on the platform or the reader. Too many proposed solutions insist that platforms mediate 500 million daily messages[6] that they do not author. Do we want them judging every message? Can they? Other proposals ask readers to sift through mountains of manure to find the truth. Will they? Who knows better that a claim is false than its author? Putting social friction on liars makes authors think twice before pushing what they know to be false.

Sadly, because platforms need engagement, they amplify liars and clog our communications with noise. We must ask them to apply social friction to liars even as they socially amplify others.  Our societies, our sanities, and our democracies depend on it.


[1] Vosoughi, S., Roy, D., and Aral, S., "The spread of true and false news online," Science 359(6380), 2018. https://science.sciencemag.org/content/359/6380/1146.full

[2] This was never seriously in doubt. Malaria is caused by a unicellular parasite, whereas COVID-19 is caused by a virus, and cellular organisms and viruses operate in different ways. Unsurprisingly, antiviral drugs are more effective than antiparasitic drugs in treating COVID-19: https://www.cidrap.umn.edu/news-perspective/2020/10/new-covid-studies-remdesivir-yes-hydroxychloroquine-no

[3] Evanega, S., Lynas, M., Adams, J., and Smolenyak, K., "Coronavirus misinformation: quantifying sources and themes in the COVID-19 'infodemic'," Cornell Alliance for Science, 2020. https://allianceforscience.cornell.edu/wp-content/uploads/2020/10/Evanega-et-al-Coronavirus-misinformation-submitted_07_23_20.pdf

[4] Dorman, S., "Trump's voter fraud remarks draw criticisms from some Republicans," Fox News, Nov. 5, 2020. "STOP spreading debunked information… This is getting insane," said Representative Adam Kinzinger of Illinois. There is "no defense for the President's comments tonight undermining our democratic process," said Governor Larry Hogan of Maryland.

[5] Zakrzewski, C., "Trump's Twitter feed is covered in warning labels," Washington Post, Nov. 5, 2020. https://www.washingtonpost.com/politics/2020/11/05/technology-202-trump-twitter-feed-is-covered-warning-labels/

[6] Internet Live Stats, Twitter usage statistics. https://www.internetlivestats.com/twitter-statistics/

Nikolai Makaranka

Director, Manufacturing Analytics at Bristol Myers Squibb

3y

Prescient thoughts! Hopefully this kind of mechanism will become the norm for social media, because (self-)regulation is increasingly becoming an existential question for them.

Frederic H.

Corporate Enterprise Architect @ Belfius | Legal Engineering Enthusiast | Digital Assets Engineering

4y

I am afraid opinions cannot be regulated; that is a political regime I do not dream of. Knowing that social networks can easily manipulate our moods and opinions (see "Experimental evidence of massive-scale emotional contagion through social networks," Adam D. I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock, https://www.pnas.org/content/111/24/8788), I would be worried about placing any authority above free opinions, even if they are lies. People have free will and should be able to choose what they engage with. If humans decide they would rather have a society of liars, that is up to them. By the way, we have already had such a society since the rise of personal branding, fake gurus (Tony Robbins, Tim Ferriss, Robert Kiyosaki, Shia LaBeouf...), and lobbies, and no one complained because it was only in the B2C world. I think the B2B world is worried by micro-marketing plus viral marketing and by being served a taste of its own medicine. My solution would be to increase investment and research in agnotology.

Marshall Van Alstyne

Allen & Kelli Questrom Chair Professor in IS, Boston University | Digital Fellow MIT | Thinkers 50

4y

For anyone who's interested, here is another idea: a "Market for Truth" that uses information economics to screen fact from fiction. See "The Price of Lies" https://thinkers50.com/blog/the-price-of-lies/

Paul King

Sr. Data Science Manager at Apple; fmr Computational Neuroscientist, Software Technologist

4y

Online incentive structures that attenuate dishonesty are a great direction; the problem is how to make honesty determinations. One idea would be to create a set of non-gameable mechanisms for building a reputation as a reliable source. This could be done by identifying politically left and right cohorts and giving extra reputation points to people whose comments are "liked" by both sides (as representing likely consensus views). To generalize this beyond American politics, an algorithm could identify pockets of division in the social graph -- echo-chamber social groups that exhibit bipartite segmentation of opinion -- and reward the reputations of people supported by otherwise opposing echo-chamber populations. This reputation signal could then be used in content weighting for distribution amplification or attenuation. A minimal sketch of that scoring idea follows below.
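Here is a minimal Python sketch of the cross-cohort scoring idea described above, assuming a prior community-detection step has already assigned each user to an opinion cluster; the function name, inputs, and bonus weight are all illustrative, not a production design.

```python
# Illustrative sketch: reward authors endorsed across opposing clusters.
from collections import defaultdict
from typing import Dict, Hashable, Iterable, Tuple


def reputation_scores(
    likes: Iterable[Tuple[Hashable, Hashable]],
    cluster_of: Dict[Hashable, Hashable],
    cross_cluster_bonus: float = 2.0,
) -> Dict[Hashable, float]:
    """Score authors from (liker, author) pairs; authors liked by more
    than one opinion cluster earn a consensus bonus."""
    scores: Dict[Hashable, float] = defaultdict(float)
    endorsing_clusters: Dict[Hashable, set] = defaultdict(set)

    for liker, author in likes:
        scores[author] += 1.0
        endorsing_clusters[author].add(cluster_of[liker])

    for author, clusters in endorsing_clusters.items():
        if len(clusters) > 1:        # endorsed by otherwise opposed cohorts
            scores[author] *= cross_cluster_bonus

    return dict(scores)
```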

Kevin Morrell

Rowlands Chair in Transformational Strategy at Cranfield School of Management

4y

Very thought-provoking
