Facebook and neutrality: people versus algorithms
Enrique Dans
Senior Advisor for Innovation and Digital Transformation at IE University. Changing education to change the world...
Following on from last week's allegations in a Gizmodo article about trending-topics bias at Facebook, a subject I have discussed a couple of times recently, the British daily The Guardian published a story citing leaked internal Facebook documents that purported to show the company's trending topics were subject to an editorial line decided not by algorithms, but by staff. The Guardian's conclusion was that Facebook, just like any traditional media outlet, has an editorial line.
Facebook's response has been to deny the accusation and to point out that the documents mentioned in The Guardian's article relate to an earlier approach to deciding on trending topics, and that the process has since changed. Mark Zuckerberg himself weighed in, denying any editorializing and offering to correct any bias created by editors. Meanwhile, Justin Osofsky, VP for global operations, provided a detailed account of how trending topics are generated, and the company invited conservative politicians to discuss the matter.
The sequence of events illustrates the gravity of the situation: in four days, three statements, one of them from the company's founder, innumerable articles in the media, and even a request from the US Congress for Facebook to appear and explain how it decides which news stories to distribute. After I appeared on Spanish television to discuss the issue, Facebook's Spanish subsidiary sent me information on the matter.
Whether Facebook has an editorial line or not is undoubtedly relevant: its roughly 1.6 billion users around the world increasingly use it as a source of information and news, rather than simply as a way to share things with family and friends, so any decision by the company to try to influence how we vote or feel about things has major repercussions, even though in theory the First Amendment of the US Constitution defends its right to do so.
It seems we are prepared to accept bias from traditional media such as radio and television, but we want our social networks to be a reflection of us, built on what we and those we follow decide to share, rather than on what the social network decides for us.
Out of this debate emerges an interesting question: is whatever bias there might be generated by people or by algorithms? Conceptually, it's pretty simple to create trending topics that tell people what everybody is talking about: either by designing algorithms to find those things and publish them in a list, or by assigning editors to filter more carefully and avoid mistakes, duplicates or manipulation.
But in practice, things are different. Twitter, which has proven to be far more algorithm-driven than Facebook, must be vigilant at all times against accusations of manipulation, because trending topics stay in the system not on the basis of how many times they have been mentioned, but on the rate at which those mentions grow: in effect, a rate of change rather than a raw count. Such a measure, while it makes perfect sense for surfacing what is newly popular, is not easily understood by users. This generates a clear conflict: if people are talking a lot about a topic, but do so at a more or less constant or diminishing rate, it will tend to disappear from the trending topics, even though users still see it as prominent. On the other hand, there are limits to what can be done without humans, and Twitter has been obliged to remove trending topics from its list when they contain terms or language that incite hatred, racism or violence.
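To make the difference concrete, here is a minimal sketch of my own (not Twitter's actual algorithm; the topic names and counts are invented) comparing ranking by growth rate with ranking by raw volume:

```python
def trending_by_growth(prev_counts, curr_counts, top_n=3):
    """Rank topics by how fast mentions grow between two time windows,
    rather than by raw volume. Purely illustrative: a stand-in for the
    rate-of-change ranking described above."""
    scores = {}
    for topic, curr in curr_counts.items():
        prev = prev_counts.get(topic, 0)
        # A steadily-mentioned topic scores around 1.0 and fades from
        # the list; a topic whose mentions are accelerating scores higher.
        scores[topic] = curr / max(prev, 1)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Hypothetical mention counts for two consecutive hours.
previous_hour = {"elections": 10_000, "football": 9_000, "new_meme": 50}
current_hour = {"elections": 10_500, "football": 9_100, "new_meme": 2_000}

print(trending_by_growth(previous_hour, current_hour))
# -> ['new_meme', 'elections', 'football']: the small but fast-growing
#    topic outranks the huge but steady ones, which is exactly why a
#    much-discussed story can vanish from trending while still popular.
```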
In the case of Facebook, the company has increasingly shifted from editors to algorithms to select topics, suggesting this makes it less prone to bias. We tend to assume that editors cannot be fully trusted not to introduce bias: an editorial line does not have to be officially recognized or sanctioned by the company as such. On occasion, editorial line and bias can be rooted in a company's culture, or reflect a political climate created by influential groups or the founder, as well as subtler matters such as whom it hires.
We all know that humans tend toward bias, even when we are not aware of it. That said, I think it is dangerous to assume that trending topics are somehow free of bias simply because they have been selected by an algorithm supposedly free of sin. There is nothing magical about algorithms: they are simply a collection of rules decided by their programmers. In other words, it is meaningless to claim that a company doesn't editorialize because it selects its news stories via an algorithm, unless of course we are able to take that algorithm apart, which normally isn't possible, precisely to avoid manipulation.
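A deliberately simple sketch makes the point (every name and constant below is hypothetical, not anything from Facebook's actual system): even a fully automated selection rule encodes editorial decisions in its parameters.

```python
# Hypothetical illustration: each constant below is an editorial choice
# made by a programmer, even though no human touches individual stories.

TRUSTED_SOURCES = {"bignews.com", "statewire.org"}  # who counts as "credible"?
BLOCKED_TERMS = {"hoax", "conspiracy"}              # what counts as "unsafe"?
SOURCE_BOOST = 2.0                                  # how much does trust weigh?

def score_story(story):
    """Score a story dict like {'title': ..., 'source': ..., 'shares': ...}.
    The rules are fixed and 'neutral' in form, yet each one embeds a
    judgment about what deserves visibility."""
    if any(term in story["title"].lower() for term in BLOCKED_TERMS):
        return 0.0
    score = float(story["shares"])
    if story["source"] in TRUSTED_SOURCES:
        score *= SOURCE_BOOST
    return score

stories = [
    {"title": "Election results confirmed", "source": "bignews.com", "shares": 500},
    {"title": "Viral hoax spreads online", "source": "smallblog.net", "shares": 5000},
]
for story in sorted(stories, key=score_story, reverse=True):
    print(story["title"], score_story(story))
# The more-shared story is suppressed outright by a term filter:
# a rule someone wrote, not a neutral fact of the data.
```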
And as algorithms increasingly come out of "black boxes" shaped by case-by-case analysis and earlier results processed through machine learning, we will have to accept that they are not necessarily impartial.
This is a fascinating subject for me, but for Facebook it's a major problem that could damage its reputation and put a lot of people off using it, which explains the flurry of statements and the willingness to clear up any doubts. If Facebook, like Caesar's wife, wants not only to be above suspicion but to appear so, then it has its work cut out.
(In Spanish, here)