Did social media echo chambers lead to the US Capitol riot?

With everything that happened at the US Capitol on 6 January, I immediately began to wonder how much social media echo chambers may have contributed to the riot. It comes as no surprise that the riot was organised on online platforms and social media (I mean, they were organised enough to make 'MAGA Civil War' shirts). But what led these people to become so caught up in what they read and engaged with online that they would even consider storming the Capitol building?

Last year, as part of my Master of Digital Communications, I completed the subject "Debates in Digital Culture". My final assignment was on whether echo chambers and filter bubbles exist on social media; my gut reaction was, 'well, of course they do'. I was then surprised that the majority of the academic readings provided downplayed the impact of such echo chambers, largely putting it down to media hype around social media. However, the more I read, the more I found a clear pattern in the research: the people we should be most worried about, those on the political margins, are the most likely to fall prey to echo chambers and filter bubbles.

What are echo chambers and filter bubbles?

Echo chambers are defined by Bruns (2019) as groups that consciously choose to connect while excluding outsiders. In contrast, filter bubbles have been defined by Pariser (2011) as the result of internet filters tracking people’s online activity to create a “unique universe of information... which fundamentally alters the way we encounter ideas and information”. Similarly, Sumpter (2018) states that the difference between the two lies in whether they are created by algorithms or by people. It's important to distinguish the two, because the research shows that both human agency and algorithms shape the formation of people’s views.

While there's a lot to explore about filter bubbles as the product of algorithms, this article is focussed on the human agency side of the topic: echo chambers.
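
To make the distinction concrete, here is a deliberately simplified sketch in Python. It is purely illustrative: the leaning scale, the similarity-based scoring rule and all the numbers are my own assumptions, not any platform's actual algorithm. The first function narrows a feed algorithmically, with no choice by the user (a filter bubble); the second has the user prune their own network (an echo chamber).

```python
# Toy model of the filter bubble vs echo chamber distinction.
# All values are invented for illustration.
import random

random.seed(42)

# Each post leans somewhere on a -1.0 (left) to +1.0 (right) spectrum.
posts = [{"id": i, "lean": random.uniform(-1, 1)} for i in range(200)]

def filter_bubble_feed(posts, engagement_history, k=10):
    """Algorithmic: rank posts by closeness to the user's average
    engaged lean, so the feed narrows without the user choosing it."""
    avg = sum(engagement_history) / len(engagement_history)
    return sorted(posts, key=lambda p: abs(p["lean"] - avg))[:k]

def echo_chamber_follows(follows, my_lean, tolerance=0.3):
    """Human agency: the user unfollows anyone whose lean differs
    from their own by more than their tolerance."""
    return [f for f in follows if abs(f - my_lean) <= tolerance]

# A user who has so far only engaged with right-leaning content.
history = [0.7, 0.8, 0.9]
feed = filter_bubble_feed(posts, history)
print("Feed leans:", [round(p["lean"], 2) for p in feed])

# The same user pruning their own network.
follows = [random.uniform(-1, 1) for _ in range(20)]
kept = echo_chamber_follows(follows, my_lean=0.8)
print(f"Kept {len(kept)} of {len(follows)} follows")
```

Run it and the feed ends up clustered around the user's past leanings even though they never asked for that, while the follow list only shrinks because the user chose to shrink it. That is the agency distinction the research turns on.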

What does the research say about their existence?

The consensus among researchers is that echo chambers may exist, but that their effect on society is often overstated by the media. Much of the literature finds that they only affect a small percentage of the population (Bruns, 2019; Dubois & Blank, 2018; Sumpter, 2018; Kitchens et al., 2020).

What I found important is that the research consistently identifies polarised or marginalised people on the fringes, and groups at the extremes of the ideological spectrum, as the most likely to be affected. Hull (2017) noted that social media adds a layer to the existing real-world echo chambers in society, offering a rich environment for those who want to increase polarisation. Mark Zuckerberg himself has pointed out that the real bias of social networks is toward the extremes.

Some research found that echo chambers may act as a source of polarisation and confirmation bias, and that the concern is they may widen the gap between those who are politically informed and engaged and those who are not (Dubois et al., 2020).

Therein lies the problem.

On the media people choose (or choose to ignore)

It’s well established that biased media sources such as partisan news and television can sway opinions and views (Epstein & Robertson, 2015), and that people select media that matches their political and partisan preferences (Dubois & Blank, 2018). This can be seen in conservatives choosing to engage with Fox News while liberals choose the Huffington Post.

There are also those who distrust mainstream media altogether and are more likely to believe ‘fake news’. Dubois et al. (2020) name opinion-avoiders (people who avoid media coverage and communication on a given topic) and opinion-seekers (people who place importance on their peers’ views) as the most likely to fall into this category. Sumpter (2018) argues these people are also often undecided voters: the exact people likely to decide election outcomes. Further, marginalised groups are likely to reject mainstream media in favour of hyperpartisan content; the ‘Alternative for Germany’ (AfD) clearly inhabits echo chambers in this way (Bruns, 2019).

The concern here, then, is not just those who use biased media, but those who are disconnected from traditional news. If people in online echo chambers are reinforcing their views with biased media, ignoring traditional news altogether, or, like 2% of the US population, using social media as their sole source of news, it challenges the researchers' theory that exposure to a wide range of media prevents echo chambers from forming.

Is there hope in fact-checking?

Dubois et al. (2020) note that opinion-seekers and opinion-avoiders are also unlikely to take part in fact-checking. Further, Bruns (2019) points out that fact-checking initiatives are often ignored by polarised groups, and that when these groups do consume different media, it may be with the goal of “inoculating [themselves] against its rhetoric”. Some echo chamber opinion-leaders recommend exactly this: learn ‘the other side’s’ arguments in order to rebut them. One example is anti-feminist commentator Daisy Cousens’ (2018, 1:48) ‘Five Point Plan for Thrashing Leftists in any Debate Situation’, which tells followers to “arm yourself with their facts as well as your own... read everything, watch everything”. Indeed, O’Hara (2014) found that 34% of voters read opposing views simply to reinforce their own viewpoints.

Integrated marketing for the far-right

Social media usage is dynamic, with people changing or migrating from one service to another (Serrano et al., 2020). Polarised groups such as the far-right are not restricted to a single platform but generate content across a wide range of technologies, much like an integrated marketing campaign (Weimann & Masri, 2020). If someone is in a far-right echo chamber on one platform, they can easily migrate into echo chambers on other platforms, seeking out reinforcing content from a range of social sources on top of the other media they consume. Arguably, consuming the same rhetoric across a range of platforms will strengthen their confirmation bias.

Mute, ignore, unfriend or block - dealing with opposing views on social media

Some research focuses on how people’s social connections point to the non-existence of echo chambers. However, Bruns (2019) states that those on the fringes still hear, but are increasingly less willing to listen to, opposing views. A 2016 Pew survey supports this, showing that 79% of people have never changed their views on a social or political issue because of something they saw on social media (Duggan & Smith, 2016).

Sumpter (2018) found that the more opposing comments a conspiracy theorist encounters, the more likely they are to continue to share, comment and argue over them; in fact, attacks from the outside only act to reinforce the bubble, with anti-conspirators becoming proof that the conspiracy exists. Serrano et al.’s (2020) research on TikTok shows that cross-partisan interactions do happen but tend to have a polarising effect, to the point of users ‘dueting’ others’ content to fight against them.

Some people take it further than simply ignoring opposing views; the 2016 Pew survey found that when ignoring content has failed, 39% of social media users have muted, unfriended or blocked people with differing political views. In all likelihood, people are attempting to create their own echo chambers by removing these connections.
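
As a rough illustration of that last point, here is a toy sketch in Python. The 100-friend network, the agree/disagree views and the pruning probability are invented for illustration, not drawn from the Pew data; it simply shows how blocking and unfriending raises the share of like-minded voices in what remains.

```python
# Toy sketch: pruning dissenting connections raises homophily,
# the fraction of remaining connections who already agree.
# All numbers are invented for illustration.
import random

random.seed(1)

my_view = 1                                   # 1 = agrees with me, -1 = disagrees
friends = [random.choice([1, -1]) for _ in range(100)]

def homophily(connections, view):
    """Fraction of connections sharing the user's view."""
    return sum(1 for c in connections if c == view) / len(connections)

print("Before pruning:", homophily(friends, my_view))

# The user blocks or unfriends a chunk of those who disagree
# (keeping each dissenter with only 40% probability).
pruned = [f for f in friends if f == my_view or random.random() > 0.6]
print("After pruning:", homophily(pruned, my_view))
```

No algorithm is involved here at all; the network becomes more like-minded purely through the user's own choices, which is exactly the human-agency mechanism this section describes.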

When social media platforms take a stand

With people choosing biased media and attempting to limit their exposure to opposing views on social media, what happens when the platforms themselves move to break up echo chambers?

Since 6 January we've seen:

- Twitter permanently suspend Donald Trump's account;
- Facebook and Instagram suspend Trump's accounts indefinitely;
- Apple and Google remove Parler from their app stores, and Amazon pull its hosting, taking Parler offline.

Will these measures help break down the echo chambers? Or could they potentially increase polarisation?

Looking at what Reddit did in 2018 may answer that question: it introduced quarantines to cordon off problematic subreddits, reduce access to hateful material and create a more positive direction for the platform (Copland, 2020).

Engagement in the quarantined subreddits dropped by 50%, which on the surface seems like a success. However, the quarantines drove polarised users further into echo chambers, such as their own self-created ‘Red Pill’-focussed platforms, and Gab, Parler and Voat. While the echo chambers may have shrunk on one platform, they simply strengthened elsewhere, where their content is not challenged.

Where does this leave us now? With Parler going offline, other "free speech" platforms like Gab and MeWe are seeing a surge in new users. Are we simply pushing people further to the margins and increasing their distrust in mainstream media and opinion? Perhaps.

Where to from here?

While I agree the majority of people are not completely stuck in echo chambers without exposure to opposing viewpoints, their existence and impact should not be discounted. Human agency and algorithms (which I could write an entire article on!) are both likely responsible for the creation of echo chambers and filter bubbles, with those on the fringes of society most susceptible to them and increasingly polarised through them.

On the surface, exposure to a wide range of media and connections may appear to reduce their impact, but no single study can show us the full effect of people using that exposure for further confirmation bias. It can be agreed, however, as Kitchens et al. (2020) state, that the tendency to interact with like-minded people and biased media and platforms may encourage the adoption of more extreme ideological positions.

While these people are on the margins, the impact on them should not be underestimated; the effect on the formation of their societal and political views may have wider implications in an increasingly polarised world. Margins matter: undecided voters who are susceptible to echo chambers could make or break an election... or become so extreme that they storm the United States Capitol. We should not simply discount echo chambers and filter bubbles as unimportant; there need to be wider discussions about their effect on polarised people, and how to get through to these groups rather than push them further to the margins.

What do you think? Have you witnessed echo chambers on social media? What steps have you taken to break out of your own?

References

Bruns, A. (2019). Are Filter Bubbles Real? Polity Press.

Copland, S. (2020). Reddit quarantined: can changing platform affordances reduce hateful material online? Internet Policy Review, 9(4). https://doi.org/10.14763/2020.4.1516

Cousens, D. (2018, April 18). Daisy's Five Point Plan for Thrashing Leftists [Video]. Facebook. https://www.facebook.com/watch/?v=2006202419639870.

Dubois, E., & Blank, G. (2018). The echo chamber is overstated: the moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729-745. https://doi.org/10.1080/1369118x.2018.1428656

Dubois, E., Minaeian, S., Paquet-Labelle, A., & Beaudry, S. (2020). Who to Trust on Social Media: How Opinion Leaders and Seekers Avoid Disinformation and Echo Chambers. Social Media + Society, 6(2). https://doi.org/10.1177/2056305120913993

Duggan, M., & Smith, A. (2016). The political environment on social media. Pew Research Center. https://assets.pewresearch.org/wp-content/uploads/sites/14/2016/10/24160747/PI_2016.10.25_Politics-and-Social-Media_FINAL.pdf

Eng, J., Dong, M., Schaul, K., & Fischer-Baum, R. (2020). How turnout and swing voters could get Trump or Biden to 270. The Washington Post. https://www.washingtonpost.com/graphics/2020/politics/voter-turnout-270-trump-biden/

Epstein, R., & Robertson, R. E. (2015). The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections. Proceedings of the National Academy of Sciences, 112(33), E4512-E4521. https://doi.org/10.1073/pnas.1419828112

Hull, G. (2017). Why social media may not be so good for democracy. The Conversation. https://theconversation.com/why-social-media-may-not-be-so-good-for-democracy-86285

Kitchens, B., Johnson, S., & Gray, P. (2020). Understanding Echo Chambers and Filter Bubbles: The Impact of Social Media on Diversification and Partisan Shifts in News Consumption. MIS Quarterly, 44(4). https://doi.org/10.25300/MISQ/2020/16371

O'Hara, K. (2014). In Worship of an Echo. IEEE Internet Computing, 18(4), 79-83. https://doi.org/10.1109/mic.2014.71 

Pariser, E. (2011). The Filter Bubble: What the Internet is Hiding from You. Penguin.

Serrano, J., Papakyriakopoulos, O., & Hegelich, S. (2020). Dancing to the Partisan Beat: A First Analysis of Political Communication on TikTok. 12th ACM Conference on Web Science, 257-266. https://doi.org/10.1145/3394231.3397916

Sumpter, D. (2018). Outnumbered: From Facebook and Google to Fake News and Filter-Bubbles - The Algorithms That Control Our Lives. Bloomsbury Publishing.

Weimann, G., & Masri, N. (2020). Research Note: Spreading Hate on TikTok. Studies in Conflict & Terrorism, 1-14. https://doi.org/10.1080/1057610x.2020.1780027
