Facebook and neutrality: people versus algorithms

Following on from the allegations made last week in a Gizmodo article about trending-topics bias at Facebook, a subject I have discussed a couple of times recently, the British daily The Guardian published a story citing leaked internal Facebook documents that purported to show that the company’s trending topics were subject to an editorial line decided not by algorithms but by staff. The Guardian’s conclusion was that Facebook is just like any other traditional media outlet in having an editorial line.

Facebook’s response has been to deny the accusation and to point out that the documents mentioned in The Guardian’s article relate to an earlier approach to deciding on trending topics, one that has since been changed, while Mark Zuckerberg himself weighed in to deny any editorializing and to offer to correct any bias created by editors. Meanwhile, Justin Osofsky, VP for global operations, provided a detailed account of how trending topics are generated, and the company issued an invitation to conservative politicians to discuss the matter.

The sequence of events illustrates the gravity of the situation: in four days, three statements, one of them from the company’s founder, innumerable articles in the media, and even a request from the US Congress for Facebook to appear and explain how it decides which news stories to distribute. After I appeared on Spanish television to discuss the issue, the company’s Spanish subsidiary sent me information.

Whether Facebook has an editorial line is undoubtedly relevant: its roughly 1.6 billion users around the world increasingly use it as a source of information and news rather than simply as a way to share things with family and friends, so any decision to try to influence how we vote or how we feel about things has major repercussions, even if, in theory, the First Amendment of the US Constitution defends its right to do so.

It seems that we are prepared to accept bias from traditional media such as radio and television, but we want our social networks to be a reflection of ourselves, based on what we and the people we follow decide to share, rather than on what the social network decides for us.

Out of this debate emerges an interesting question: is whatever bias there might be generated by people or by algorithms? Conceptually, it’s pretty simple to create trending topics that tell people what everybody is talking about: either by designing algorithms to find those things and publish them in a list, or by assigning editors to filter more carefully in order to avoid mistakes, duplication or manipulation.

But in practice, things are different: Twitter, which has proven to be far more algorithm-based than Facebook, must be constantly vigilant against accusations of manipulation, because trending topics stay in the system not on the basis of how many times they have been mentioned, but on the rate at which those mentions grow (a logarithmic value). Logarithms, while they may make perfect sense here, are not that easily understood by users. This generates a clear conflict: if people are talking a lot about a topic, but at a more or less constant or diminishing rate, it will tend to disappear from trending topics, even though users still see it as prominent. On the other hand, there are limits to what can be done without humans, and Twitter has been obliged to remove trending topics from its list when they contain terms or language that incite hatred, racism or violence.
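
To make the difference concrete, here is a minimal sketch in Python of the general idea of growth-based trending. It is not Twitter’s actual algorithm; the topic names and figures are invented for illustration.

    # Minimal sketch of rate-based trending (not Twitter's real algorithm).
    from math import log

    def trend_score(previous_hour: int, current_hour: int) -> float:
        """Score a topic by the growth in its mentions, not by absolute volume."""
        # The log of the ratio rewards sudden spikes and ignores steady chatter.
        return log((current_hour + 1) / (previous_hour + 1))

    topics = {
        "steady_big_story": (50_000, 50_000),  # huge but flat conversation
        "sudden_small_story": (100, 1_000),    # small but growing fast
    }

    ranked = sorted(topics, key=lambda t: trend_score(*topics[t]), reverse=True)
    print(ranked)  # ['sudden_small_story', 'steady_big_story']

The big, steady conversation scores zero because it is not growing, which is exactly the counter-intuitive behaviour users complain about.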

Facebook, in contrast, has increasingly shifted from editors to algorithms to select topics, the implication being that the process is therefore less prone to bias. We tend to assume that editors cannot be fully trusted not to introduce bias: an editorial line does not have to be officially recognized or sanctioned by the company as such. On occasion, an editorial line and its biases can be rooted in a company’s culture, reflect a certain political climate created by influential groups or by the founder, or show up in subtler matters such as whom the company hires.

We all know that humans tend toward bias, even if we may not be aware of it. That said, I think it is dangerous to assume that trending topics are somehow free of bias simply because they have been selected by an algorithm supposedly free of sin. There is nothing magical about algorithms; they are simply a collection of rules decided by their programmers. In other words, it is meaningless to suggest that a company doesn’t editorialize because it selects its news stories via an algorithm, unless of course we are able to take that algorithm apart, which normally isn’t the case, precisely to avoid manipulation.
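
As an illustration of that point, here is a hypothetical sketch in Python; the source names, boost factors and keyword lists are all invented. Every constant in it is an editorial decision made by whoever wrote the rules, even though the function itself looks perfectly mechanical.

    # Hypothetical example: editorial judgement hidden inside a "neutral" ranker.
    BOOSTED_SOURCES = {"partner_outlet"}      # someone chose which sources to boost
    DEMOTED_KEYWORDS = {"hoax", "clickbait"}  # someone chose which words to demote

    def rank_story(mentions: int, source: str, headline: str) -> float:
        """Score a story for the trending list; every constant is a human choice."""
        score = float(mentions)
        if source in BOOSTED_SOURCES:
            score *= 1.5                      # the boost factor is also a choice
        if any(word in headline.lower() for word in DEMOTED_KEYWORDS):
            score *= 0.3
        return score

    stories = [
        ("Candidate rally draws record crowd", "partner_outlet", 8_000),
        ("Viral hoax spreads across the network", "random_blog", 12_000),
    ]
    for headline, source, mentions in stories:
        print(headline, rank_story(mentions, source, headline))

The second story has more mentions, yet the first ends up on top: the code is mechanically enforcing an editorial line that was never written down as such.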

As algorithms increasingly emerge from “black boxes” shaped by case-by-case analysis and by earlier results processed through machine learning, we will have to accept that they are not necessarily impartial.
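
A toy example of the same idea, with entirely made-up engagement data: nobody writes a biased rule, yet the “learned” weights reproduce whatever skew the historical data contained.

    # Toy illustration, not any real system: a ranker whose only training
    # signal is past engagement reproduces the skew of its training data.
    from collections import defaultdict

    # Hypothetical engagement history: (source, was_clicked) pairs.
    history = ([("tabloid", 1)] * 80 + [("tabloid", 0)] * 20 +
               [("public_broadcaster", 1)] * 30 + [("public_broadcaster", 0)] * 70)

    clicks, shown = defaultdict(int), defaultdict(int)
    for source, clicked in history:
        shown[source] += 1
        clicks[source] += clicked

    # "Learned" weight = each source's historical click-through rate.
    weights = {source: clicks[source] / shown[source] for source in shown}

    def rank(source: str, mentions: int) -> float:
        """Boost stories from sources that were clicked more in the past."""
        return mentions * weights.get(source, 0.5)

    print(rank("tabloid", 1_000))             # 800.0
    print(rank("public_broadcaster", 1_000))  # 300.0

No programmer wrote a rule favouring the tabloid; the skew came entirely from the data the weights were derived from.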

This is a fascinating subject for me, but for Facebook it’s a major problem that could damage its reputation and put a lot of people off using it. Which explains the flurry of statements and a willingness to clear up any doubts. If Facebook wants, like Caesar’s wife, not only to be above suspicion, but to appear so, then it has its work cut out.


(In Spanish, here)

John Robinson

Instrument Technician at Colgate University

8y

Another reason not to bookface

Junindah Hutasoit

Driver Operation Support at Lalamove Indonesia || Verifikator || Contact Center Bank Central Asia || Loand Collateral Insurance at Radana Bhaskara Finance

8y

me: junindah hutasoit come join us/

Kevin Brown - they will be niche 'community centric entities' that are secure, serve the purpose of their members - a devolved approach where people manage their own news, content, members. Already built this capability. Centralisation and monolithic is ultimately a dead end as it does not reflect or support the way we operate in the real world.

Jordan Panayotov

Founder I Strategic Advisor I AI 4 Good, Wellbeing Economy, SDGs, CSR, ESG, Sustainability, Impact Assessment

8y

When FB / Mark Zuckerberg says something, it always means the opposite. How long will it take people to understand this?
