Tech Giants, Political Winds, and the Erosion of Digital Accountability
The podium crowd at the latest US presidential swearing-in ceremony does not bode well for keeping hate, vitriol, and fakes out of the digital information space. The presence of the tech-bro billionaires hints at one thing: acquiescence to the new mode of play. And taking stock of what has happened to X (formerly Twitter) since the takeover by the tech-bro-in-chief, Elon Musk, serves as a harbinger of similar chaos likely to unfold on other platforms, particularly those owned by Meta.

Even before the inauguration, Meta had announced plans to withdraw its support for fact-checking organisations and to make major changes to the moderation guidelines on its platforms. The closest thing we can look at to figure out what Facebook will turn into under these changes is the metamorphosis of X since 2022. It was once my go-to platform for information, breaking news, and curated feeds. Not anymore. I access it perhaps twice a day, except when major news breaks. Then, yes, I do stay on it, because it is still the place to be for beat-by-beat breaking news, provided you can deal with the fakes, the hate speech, the marketing gimmicks, and the porn.
What was once a curated, clean feed of my choosing is now, every time I open it, mostly full of all of the above and very little news. Meta's reliance on community moderation and its relaxation of hate speech guidelines are likely to transform its feeds into something resembling X.

In the days following the announcement of the new operational framework, my Facebook feed featured friends and others experimenting with how far they could go using the name Zuckerberg in their content. Most of what they posted would get censored, even by this newspaper. What this experiment revealed was how far fakes, hate, and misogyny can now pervade the platform.

This is not good news for countries like Sri Lanka and Myanmar, steeped in mass digital usage but with very low levels of digital hygiene and ground-level expertise at the community level. We have seen in the past how Facebook and WhatsApp were used in both countries to inflame racial tensions and coordinate riots and attacks. It feels like we have come full circle. In terms of moderation, Facebook is probably back where it was pre-2018, when it was an incubator for hate speech and racially laced vitriol.

Facebook has never shown a particular interest in dealing specifically with smaller markets. The organisation's investment in fact-checking and partnerships with Sri Lankan entities was driven more by its desire to counter bad publicity and growing criticism on its home turf. It is that same reasoning that has informed and prompted the latest volte-face: the changing political winds in the US. The Intercept described the decision as "a shameless act of genuflection toward the incoming Trump administration."

The fact-checking organisations working with Facebook across the world had no idea what was in store until the news was announced on Fox. "There were no rumours, no news, no such thing, I just woke up to this news this morning," a fact-checker working for an international organisation based in South Asia told me the morning after Facebook made the announcement.

What makes this reversal by Facebook even more striking is that it is not an isolated incident but part of a broader trend among tech companies eager to appease the new administration.

Meta has not been shy about leveraging its position as the most accessed social media platform. Here is one example. In late 2019, just as I was starting my postgraduate research, I received several messages from Facebook's Asia office. They were trying to work with journalists in Sri Lanka. At the time, Facebook's Journalism Project was becoming active in countries like Sri Lanka and Myanmar. A few years earlier, Facebook's role in fuelling racial riots and spreading racial hate in both countries had been highlighted by journalists and researchers. The Facebook Journalism Project's Asia office, based in Singapore, was working to counterbalance this negative publicity.

I shared the details of several Sri Lankan colleagues and organisations and also reached out to them, encouraging them to seize the opportunity to hold Facebook accountable and to question its track record in smaller, less financially lucrative markets like Sri Lanka. Unfortunately, this effort yielded no meaningful results. There was no in-depth reporting on the issue.
This was the result of Facebook setting the ground rules of these engagements to suit its own goals and local dynamics. Initially, attendees, including journalists and researchers, were instructed not to report on the content of these engagements. While these rules were later relaxed, they did not result in more thorough reporting. Most of Facebook's local partners in Sri Lanka behaved like deer in headlights, blinded by the platform's influence and brand value. Others were only interested in accessing Facebook's monetisation plans. The lack of awareness among local journalists of Facebook's impact made all of this easier.

One minor pushback against this incoming torrent of hate, mis- and disinformation has been the emergence of robust fact-checking organisations over the past five years. They are capable of excellent work, but a big question mark hangs over whether their efforts can gain traction without Meta's support.

A few days before Meta's announcement, I had an extensive conversation with a fact-checker based in India. Her assessment was that fact-checking had become an essential part of journalism. "Because there is so much manipulation and fakes, it is just impossible to counter the effect without actively and consistently pushing back and taking them down," she explained.

Given the propensity in Sri Lanka for gossipy and influencer-centric content to go viral, it would be prudent for journalism educators and experts to set up a structured framework to monitor the impact of Meta's decision. They should also consider actions that can promptly and effectively combat hate, mis- and disinformation.

Amantha Perera is a Research Network Member at Verité Research and a PhD candidate at Creative, University of South Australia.