Facebook’s moment of truth
Enrique Dans
Senior Advisor for Innovation and Digital Transformation at IE University. Changing education to change the world...
Donald Trump’s victory in the recent US elections is having an unexpected impact on Facebook, a company that, through its ambitious strategy, has become the largest media outlet in the world, unmatched in reaching almost 1.8 billion people.
The company’s initial response to accusations of having played a key role in Donald Trump’s victory has satisfied nobody. It has attempted to shrug them off by saying it hosted relatively little false news and that its influence was minimal. This denial of reality also calls its own business into question. In fact, Zuckerberg’s position is so weak that he can’t even convince his own employees, who have responded by setting up a task force to fight fake and sensational news, against the founder’s opinion.
There is no denying that Facebook exercises an influence over its users: this was demonstrated in 2012 with its controversial and immoral mood-altering experiment, and every advertiser who spends money there knows it. Even Donald Trump has made clear not only that Facebook and Twitter were fundamental to spreading his message, but that Facebook campaigns became his main source of direct funding.
Sorry Mark, but you can’t shrug this one off. If you can boast about having changed the world during the Arab Spring, you cannot then deny your influence in this election.
The reality is that Facebook is now a key part of the machinery of election campaigns, a place where the use of false, biased and sensational so-called news is not only habitual but encouraged. Cutting off advertising revenue to pages dedicated to producing and circulating this kind of news, following Google’s lead, is an attempt to discourage the behavior, but it has its consequences: what is true and what is false? Who decides? Reporting on scientific advances, which can also be done in a sensationalist or tendentious way, is not the same as reporting on politics. The danger in asking social networks and search engines to eliminate false and sensational news is precisely that: what happens if they actually take on that role, and what criteria will they use? In fact, if you look at the criteria of Snopes, undoubtedly the biggest experts in identifying false news and debunking myths on the internet, the problem is not so much the news itself as the outlets that publish it.
In this sense, Google seems to be ahead of the game. Its concern about the effects of its algorithms, and its impression that the excessive weight given to social factors was encouraging sensationalism, led the company to rethink many things and to initiate a whole line of development around the concept of Knowledge-Based Trust, or KBT, resulting in the most ambitious modification of PageRank since its inception. Everything indicates that Facebook is still far from this kind of reasoning: for the moment, its attempts to quell the controversy surrounding the manipulation of its trending topics were held back by the fear of offending its more conservative users.
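To make the contrast concrete, here is a minimal Python sketch of the idea behind KBT: a popularity-based score rewards whichever source attracts the most links, while a trust-based score rewards the source whose claims agree with what is already known to be true. The source names, claims and tiny “knowledge base” below are invented purely for illustration; this is in no way Google’s actual implementation.

```python
# Toy contrast between a link-based (popularity) score and a KBT-style (accuracy) score.
# All data below is invented for the example.

# A tiny "knowledge base" of facts assumed true, keyed by (subject, attribute).
KNOWLEDGE_BASE = {
    ("earth", "shape"): "round",
    ("water", "formula"): "H2O",
    ("paris", "country"): "france",
}

# Claims extracted from two hypothetical sources.
SOURCES = {
    "popular-but-sloppy.example": [
        (("earth", "shape"), "flat"),
        (("water", "formula"), "H2O"),
        (("paris", "country"), "france"),
    ],
    "obscure-but-accurate.example": [
        (("earth", "shape"), "round"),
        (("water", "formula"), "H2O"),
        (("paris", "country"), "france"),
    ],
}

# Hypothetical inbound-link counts: the sloppy source is far more popular.
INBOUND_LINKS = {
    "popular-but-sloppy.example": 9_000,
    "obscure-but-accurate.example": 120,
}


def link_score(source: str) -> float:
    """Popularity-style score: share of all inbound links pointing at this source."""
    total = sum(INBOUND_LINKS.values())
    return INBOUND_LINKS[source] / total


def kbt_score(source: str) -> float:
    """KBT-style score: fraction of a source's claims that match the knowledge base."""
    claims = SOURCES[source]
    correct = sum(1 for key, value in claims if KNOWLEDGE_BASE.get(key) == value)
    return correct / len(claims)


if __name__ == "__main__":
    for source in SOURCES:
        print(f"{source}: link score {link_score(source):.2f}, KBT score {kbt_score(source):.2f}")
```

Run as written, the popular source wins on links but loses on trust, which is exactly the reordering a KBT-style signal is meant to produce.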
Donald Trump, a man rooted intellectually in the nineteenth century, is a politician of the twenty-first. His grandiloquent and exaggerated campaigning style, which disregards truth and instead emphasizes sensationalism designed to be shared on social networks, is, in fact, responsible for the term post-truth becoming Oxford Dictionaries’ word of the year.
He has been able to read the weaknesses of a nascent ecosystem such as the social networks, full of contradictions and half-developed protocols, and has turned it into a very powerful electoral weapon. Millions of people have, voluntarily or involuntarily, indoctrinated themselves via pages full of false news that would not withstand the most basic scrutiny, but which they have read in an environment, a veritable hall of mirrors, that made them think that everyone around them thought the same.
Facebook had its moment of truth, and failed the test. It decided that its truth was that there is no truth, that each of us has our own, or that truth is established by consensus. It wanted to become the world’s editor, but has shown itself to be an editor with an unclassifiable system of vague and amorphous values, one that simply offers everybody what they want to hear and reinforces their existing beliefs. In other words, our own personalized bubble, tailored to our friendships, our environment and our beliefs, which some have exploited to great success. We now have four years to think about how we got here.
(In Spanish, here)