"Algorithmic Love" Reveals the Dark Side of Tinder - (Future of AI Recruitment in question?)
Michael Mandic
Recruiter for Private Banking & Wealth Management | Founding Partner @ Anderson Wise | Executive Search Consultant | Head-Hunting RMs and C-Level Executives since 2011
This article was published in "Le Temps", and the reason I am sharing it is simple: I believe that most AI recruitment apps in the near future will have the same flaws and play in favour of a select few, whether through gender bias or something similar...
Read it and draw your own conclusions:
"Algorithmic Love" Reveals the Dark Side of Tinder
Do you believe that encounters happen by chance, even on Tinder? Think again: algorithms govern your relationships, right into your bed. Journalist Judith Duportail spent a year investigating.
Eight hundred pages. That is what Judith Duportail received when she asked Tinder to hand over her data. Every copied-and-pasted joke, every conversation, every man "matched" since she signed up for the application, all laid out before her eyes.
The French journalist set out to learn her own "desirability score" after discovering its existence in the press. The score is based on income level, education, success on the application and supposed "intelligence": information derived directly from the data users themselves share, which allows Tinder, to put it bluntly, to match the beautiful with the beautiful, the "desirable" with the "desirable", while continuing to make us believe that only chance and geographical proximity govern our encounters.
Read also: Bumble and Tinder, rivals all the way to court
She first recounted her astonishment in an article for the Guardian, one of the most-read pieces of 2017, then decided to write a book, "Love Under Algorithm", enough to disabuse any user who still believes in chance. We discover, often with astonishment, a wealth of detail about the application's technology, and especially about its sexism.
Le Temps: Why is it so difficult for Tinder to admit the existence and the workings of this "desirability score" by which it ranks users?
Judith Duportail: They always answer that it is a trade secret. I like to compare this situation with Coca-Cola: the recipe is secret, but someone, a competent authority, was able to verify it and confirm the drink was fit for consumption. Why should we take Tinder at its word when it plays with neuroscience, our dignity, our equality? Whom we have the right to meet, to love, to touch: that is the foundation of our freedom.
Are users sufficiently aware of how dating applications work?
There is a staggering asymmetry of information. You have to understand that Tinder relies on the economics of addiction, on the principle of random reward, which acts as powerfully as cocaine and is inspired by slot machines. As in a magic show, if you know the trick, you are less easily fooled. When we log on, we have no idea of everything I discovered during this investigation, even though it is an extremely sophisticated system, able, for example, to count how many syllables you use per word and how many words per sentence, and to analyse your writing to estimate your IQ. We realise that these results may be used differently depending on gender, but when we ask Tinder, all we receive are letters from lawyers. It is urgent that this become known.
The Tinder system can count how many syllables you use per word and how many words per sentence, and analyse your writing to estimate your IQ.
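To make that claim concrete, here is a minimal sketch, not Tinder's actual code, of how such surface metrics could be computed from a user's messages. The syllable count is a rough vowel-group heuristic and the function names are purely illustrative.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count groups of consecutive vowels (illustrative only)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def text_metrics(messages: list[str]) -> dict:
    """Average syllables per word and words per sentence across a set of messages."""
    words, sentences, syllables = 0, 0, 0
    for msg in messages:
        # Split on sentence-ending punctuation and ignore empty fragments
        parts = [s for s in re.split(r"[.!?]+", msg) if s.strip()]
        sentences += len(parts)
        for w in re.findall(r"[A-Za-z']+", msg):
            words += 1
            syllables += count_syllables(w)
    return {
        "syllables_per_word": syllables / words if words else 0.0,
        "words_per_sentence": words / sentences if sentences else 0.0,
    }

# Example usage on two made-up messages
print(text_metrics(["Hey, how are you?", "I really enjoyed the exhibition yesterday."]))
```

How such metrics would map to an "IQ" estimate, if at all, is exactly the kind of detail the company does not disclose.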
What was the most troubling moment of your investigation?
Reading the Tinder patent. It does not exactly describe the technical reality, but reading it gives a sense of the company's state of mind. There is the example of Harry and Sally, which I recount in my book: if Harry has a good job and earns a lot of money, he gets a bonus on his desirability score, while Sally, in the same situation, gets a penalty. If this sexist rule is really applied, it raises serious ethical questions! The other important moment was when I understood the algorithm's influence on my own life: on Tinder, it is as if I were at a party where I am not even allowed to see the men who are too rich, too handsome or too young "for me", because the algorithm discards them immediately.
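As an illustration only, assuming the patent's Harry-and-Sally example works roughly as described in the interview and nothing more, the gendered adjustment could be expressed in a few lines. The weights and field names below are invented for the sketch.

```python
def adjust_desirability(base_score: float, gender: str, high_income: bool) -> float:
    """Illustrative sketch of the gendered bonus/penalty described in the interview:
    the same attribute (a high income) raises a man's score but lowers a woman's.
    Weights and behaviour are hypothetical, not taken from Tinder's code."""
    if not high_income:
        return base_score
    return base_score * (1.10 if gender == "male" else 0.90)

# Harry and Sally have identical profiles apart from gender
print(adjust_desirability(50.0, "male", True))    # Harry: 55.0 (bonus)
print(adjust_desirability(50.0, "female", True))  # Sally: 45.0 (penalty)
```

Whether any rule of this shape is actually in production is precisely what Tinder declines to confirm.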
Perhaps, in the future, lawsuits will be brought against the inventors of algorithms, so that they must account for the consequences of their inventions.
You include very personal stories about feelings that are not always flattering. Was that an obvious choice?
I wanted to reveal intimate moments, to show what really happens inside users, the very dark things these applications trigger in us. I met Bérénice during my investigation because she is the archetype of Tinder's great winner: she is beautiful, she knows all the codes, she is young and rich, and yet even she develops an addiction. She pretends to be a prostitute to find out how much she is worth; she has a slave... Tinder presses on our weaknesses.
Is that what pushed you not to limit yourself to the journalistic investigation?
It would have missed the essential: what it really does to us, from the inside. I wanted to take an uncompromising look and apply the same degree of transparency to all the actors, including myself. On social networks, we spend so much time looking for the right caption, the right filter, to construct an ideal Judith... Our generation is suffering. I wanted to do the opposite, to be as sincere as possible, and say, "Look, I'm completely crazy." It's also a call: "Are you as lost as I am?"
On addiction, what do you think of recommendations for reducing the appeal of these applications, such as switching your phone to black and white or turning off notifications?
I find them interesting and positive, but it seems problematic to place the manufacturer's responsibility on the user. It would be like saying it's your fault if you smoke, when cigarettes are addictive. But the big question is at the company level: why is it so difficult to get answers? Has the "sense of destiny" feature that appears in the patent [editor's note: which seeks to artificially show users "signs of fate", such as sharing the same date of birth] been put into operation or not? And the sexist rankings? Transparency and answers would be more useful than individual measures.
Do new technologies necessarily reinforce social biases?
The first researcher to discover Tinder's patent, the Swiss academic Jessica Pidoux, says it is the product of patriarchal logic. Algorithms are not neutral; they are opinions transformed into code. Since society is predominantly sexist, if algorithms merely reflect society, we will never be able to change the world. But we can also imagine that companies, including Tinder, choose to do otherwise. It is an almost philosophical question. Perhaps, in the future, lawsuits will be brought against the inventors of algorithms, so that they must account for the consequences of their inventions.
Do you think Tinder will still be here in five years?
Sociologists' projections show that it is a tool that will only grow in importance, hence the stakes of knowing how it is built.
#AI #automation #apps #recruitment #wealthmanagement #privatebanking