DATA CAN DISCRIMINATE BUT WE CAN AVOID THAT!

We live in the era of big data, and obviously we are not going to turn back.

Big data means data management, and with it the risk of algorithmic bias, i.e. the application of an inappropriate or even unethical algorithm to that data.

The amount of information on the internet and social networks is growing exponentially. Every minute, users upload 400 hours of content to YouTube. It would take one person 323 days to view all the content posted on Facebook in a single minute. In other words, this massive production of user content requires distribution rules, and those rules take algorithmic form.

But which criteria does the algorithm apply when it decides to show certain videos to Peter, specific advertising to Paul and other content to John? These are criteria that Facebook, YouTube and the other content distribution platforms have determined for you and me, sometimes arbitrarily.

Try it yourself. Go to your Facebook profile, open Settings and then Ad Preferences, where your interests are listed. Mine are typical for a woman of my age: beauty, shopping and wellness (even though these are subjects I have little interest in, or at least ones whose posts I rarely consult on social media).

The algorithm relies on these criteria, defined in advance and somewhat arbitrarily by age and gender group, to target the advertising sent to each recipient. But that's not all. The algorithm also decides which friends you should interact with, identifying the people who like your posts the most and the social interactions under your profile (haven't you noticed that you always end up interacting with the same people on Facebook, without quite understanding why? And that if you suddenly want news from one of your other "friends", you have to go and get it by clicking on their profile?).

The first negative effect of the algorithm is that it locks you in a glass cage. While you think you have access to a vast amount of information, you actually find yourself trapped in a world the algorithm has arbitrarily decided to confine you to. You become a passive user again, fed by a continuous flow (just as you once were by the flow of TV programming, or by the newspaper that decided which news you should read each day).

This is only the beginning... of what promises to be a real threat to users' free will and privacy. The threat is called algorithmic bias. The criteria that govern data collection by the various service providers and social media platforms are defined arbitrarily. So how can they be used to predict a user's purchasing, consumption, social or even criminal behavior? By accumulating huge amounts of data and cross-referencing it, it becomes possible to predict what each person feels and wants, and how each person will act, react and consume.

Algorithmic bias assumes a given behavior without necessarily being based on objective facts. It is the result of the training performed by the machine on the data it is fed. Through iterations, the machine deduces rules for solving the problem it faces. Take facial recognition software as an example. It analyzes all the data it receives: happy faces, sad faces, angry faces, psychopathic faces and so on. By analyzing each face, it puts you in the category to which you seem to belong. The machine thus establishes a norm founded on the data collected, on the basis of predefined criteria (supervised learning). What happens when the machine encounters a situation that deviates from that norm? It fails, producing an unhandled situation that can lead to injustice.
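To make the mechanism concrete, here is a minimal sketch in Python. Everything in it is invented for illustration (the features, the labels, the nearest-centroid rule); it stands in for no real platform's system. The point it shows: a model trained only on examples matching its learned norm has no way to say "I don't know", so an input that deviates is silently forced into one of the existing categories.

```python
# Toy supervised learner: a nearest-centroid classifier.
# All data below is made up for illustration.
from statistics import mean

# Training set: (feature vector, label). Note the narrow "norm":
# only two clearly separated kinds of faces are ever seen.
train = [
    ((0.9, 0.1), "happy"),
    ((0.8, 0.2), "happy"),
    ((0.1, 0.9), "angry"),
    ((0.2, 0.8), "angry"),
]

# Supervised learning step: compute one centroid per label.
centroids = {}
for label in {lbl for _, lbl in train}:
    points = [vec for vec, lbl in train if lbl == label]
    centroids[label] = tuple(mean(dim) for dim in zip(*points))

def classify(vec):
    """Return the label of the nearest centroid; there is no 'unknown' option."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist(vec, centroids[lbl]))

print(classify((0.85, 0.15)))  # 'happy' -- close to the training norm
print(classify((0.30, 0.50)))  # 'angry' -- far from everything seen in
                               # training, yet forced into a category anyway
```

The bug is not in the arithmetic: the code does exactly what it was trained to do. The injustice comes from the training set's narrow definition of "normal".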

That's where the danger becomes real. The algorithm can discriminate. In other words, algorithmic bias can have a negative social impact on a particular category of the population. Take predictive justice as an example. The algorithm may determine that defendants of foreign or ethnic origin are likely to commit more crimes, and sentence them more heavily based on an alleged risk of reoffending that does not exist. Worse still, facial recognition software could falsely identify individuals believed to belong to an ethnic group thought to commit more offenses than average; these people could find themselves wrongly accused of a crime. The algorithm can dictate who gets a loan, who gets hired, who gets laid off and who gets low-risk insurance coverage...
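The audit itself is not technically hard once the data is accessible, which is precisely the point of the transparency demands below. Here is a hedged sketch, with entirely invented records, of the classic check used in the predictive-justice debate: comparing false positive rates across groups.

```python
# Hypothetical fairness audit: all records are invented for illustration.
# Each record: (group, flagged_high_risk_by_algorithm, actually_reoffended)
records = [
    ("A", True,  False), ("A", False, False), ("A", False, False), ("A", True, True),
    ("B", True,  False), ("B", True,  False), ("B", True,  False), ("B", True, True),
]

def false_positive_rate(group):
    """Share of people in `group` who did NOT reoffend but were flagged anyway."""
    negatives = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in negatives if r[1]]
    return len(flagged) / len(negatives)

for g in ("A", "B"):
    print(f"group {g}: false positive rate = {false_positive_rate(g):.0%}")
# group A: false positive rate = 33%
# group B: false positive rate = 100%
```

With these made-up numbers, both groups contain the same share of actual reoffenders, yet group B bears three times the burden of wrongful flags. That asymmetry is invisible without access to the data, which is exactly the problem raised next.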

The main issue is that we lack the tools to question and verify these algorithms. Obviously, no digital platform will reveal its algorithms, if only for competitive reasons.

If this is the case, how can we verify that the algorithmic formula is based on objective data and not biased assumptions?

How can we prevent our lives from being controlled and directed by the algorithm?

It is essential not only to impose strict obligations on those who develop algorithms, so that they do not impede or interfere with users' privacy, but also to allow users to verify, at any time, the data that will lead to an algorithm-dictated decision with a concrete impact on their lives. Otherwise, we will be dealing with a black box against which we have no recourse and no means of protection. And that is only the beginning of the nightmare that awaits us all…

#algorithmicbias #datatransparency #datapolicy #usersright #datadiscrimination