Does social media censorship cause extremism?
John Koetsier
VP Insights @ Singular, senior contributor @ Forbes, host @ Growth Masterminds podcast & the TechFirst podcast. Angel investor & advisor. I talk to smart people and learn from them.
Why are we so divided? Whether it’s the war in Ukraine or Covid or the 2020 U.S. election or Black Lives Matter or abortion, it feels like there have never been such great divisions in society.
I recently had an opportunity to speak with Daryl Davis, a blues, jazz, rock, and swing musician who played for Chuck Berry for 32 years. He’s also a black man who has convinced 200 members of the KKK that racism just doesn’t make sense. And Davis, whom I spoke to along with Bill Ottman, CEO of the alternative social network Minds.com, has some ideas about what allows extremism to flourish.
“It’s when the conversation ceases that the ground becomes fertile for violence,” Davis says on the TechFirst podcast. “A missed opportunity for dialogue is a missed opportunity for conflict resolution ... if you spend five minutes with your worst enemy, you’ll find something in common. And that chasm, that gap begins to narrow. Spend another five minutes, you find more in common and it closes in more.”
There’s a strong perception among people who identify with the right side of the political spectrum that the major social platforms from big tech companies censor or limit their political speech. Former president Donald Trump launched a class action lawsuit against Facebook, Twitter, and YouTube last year, and tens of thousands of Americans submitted examples of what they considered to be evidence. Elon Musk has slammed Twitter’s alleged “strong left wing bias.”
Whether they’re right or not, there’s no doubt that Facebook and other social media giants are intervening more and more in the content they publish, whether Second Amendment gun-ownership posts or information about how to access abortion pills in a post-Roe v. Wade world.
A Facebook friend who doesn’t seem insane regularly shares instances where Facebook deletes or hides her content.
In many cases the reasons seem silly or arbitrary, like an AI that doesn’t really understand the content or get the joke. One shows a floating tent, captioned “Floating tent sleeps 4 and offers a cool new way to die while camping.” Other deletions seem more understandable, like the thumb with a face on it and a string tied around it in the shape of a noose: it’s not explicitly about lynching, but it’s clearly intended to evoke that imagery. Poor taste, likely offensive, a bad joke, but is it censor-worthy?
Facebook also often just gets it wrong:
“My account has been restricted,” another friend recently said. “Someone posted how cockroaches were under the benches in HB and I wrote ‘Burn them all down.’ I meant the bugs, but okay Facebook. Lol.”
But while there’s the mistaken and the comical, there’s also the Covid deniers and the anti-vaxxers and the election conspiracy theorists. Deciding where to draw the line seems agonizingly hard, if not impossible.
Elon Musk, whose deal to “save free speech” and hunt the bots on Twitter by buying the platform has fallen through thanks to — according to Musk — the bots on Twitter, had a different standard. As the legal wrangling over the terms of his extrication from his legal obligations begins, it’s worth considering that standard: the law.
That’s persuasive to a degree, but it also has risks. One of the reasons Facebook implemented Covid misinformation policies was to save lives. As we can see in the recent Highland Park shooting and January 6 violence, misinformation about political realities can also cost lives. And that misinformation is created and spread far faster than any law could actually be codified and enforced. So it’s understandable that social media networks have felt it necessary to take action.
But the question is: does social media censorship feed extremism?
In other words, by banning things they consider false or dangerous, do the big social platforms actually make the social problem worse, perhaps like a gated community creating an island of privilege in an ocean of poverty?
Bill Ottman thinks so, in spite of the fact that he believes some unlawful content should be censored.
“What do you expect if you throw someone off a website, where do they go?” the Minds.com CEO asks. “Well, you just have to follow them and you see that they go to other smaller forums with less diversity of ideas, and their ideas get reinforced and they compound.”
That makes intuitive sense, of course.
People are inherently social, most of the time, and if they can’t speak their minds on Twitter or Facebook or YouTube, they’ll find Truth Social or Rumble or Gab or Gettr. Or a Telegram channel that can’t easily be censored, or any of dozens of right-wing or conservative outlets ... or left-wing, if that’s their persuasion.
The problem is that when they get there, they may just arrive in an echo chamber of ideas that lead them down the rabbit hole of more and more extremism.
“On Minds, we do have pretty strong diversity of thought,” Ottman says. “And so we are an alternative forum where people do go sometimes when they get banned. But I wouldn’t say their views are necessarily amplified when they come because we do have diversity of opinion.”
I believe that’s the goal, but I have to say I haven’t personally seen it on Minds.
In trending tags around #humor, I see a meme about why Biden hasn’t been assassinated yet: “In case you wondered why someone shot Shinzo Abe but not Sleepy Joe ... Professionals have standards.” A recommended account has a meme about Trump Towers being the new Florida Guidestones offering suggestions about how to depopulate government, playing on the recent Georgia Guidestones monument destruction. And in my brief experience on the site, anything not pro-Trump is met with significant anger and invective.
But perhaps that just proves the point.
Keeping different, offensive, or even flat-out wrong people on platforms like Facebook and YouTube and Twitter might be a way to ensure that they at least occasionally glimpse alternative reality bubbles, and it offers us a chance to communicate. Especially if the algorithms that run social platforms are redesigned not just to show us more of what we like so we stay on the platform and earn more ad revenue for its owners, but also to show us different viewpoints.
Which runs the risk, of course, of making the platforms a living hell for those who don’t want to be confronted by extremist, nasty, or just ill-informed opinions all the time. (Anyone else significantly decrease their time on Facebook pre- and post-2020 U.S. election?)
Davis thinks that discomfort might be a worthwhile sacrifice ... if we can adjust our point of view on what offends us.
“I’m of the mindset that I cannot offend you. You can only allow yourself to be offended,” he says. “People say a lot of offensive things. And whether I want to be offended by it or not is up to me.”
Will permitting that offensiveness, and trying not to be offended by it, heal some of the divisions in society?
It might at least help reduce extremism, Davis thinks.
“I don’t think kicking people off of Twitter or Facebook, whatever, causes extremism. I think what it does is it causes them to perhaps follow a path that may lead to extremism. The extremism already exists, and they’re on different platforms and different areas. And, you know, when you get kicked off of something, you go somewhere else. And it’s quite possible that you might go in that direction to somewhere where it already exists, and it embraces you and welcomes you and amplifies you.”