Robodebt: What are you going to do when an algorithm wipes your account clean?
Norbert Biedrzycki
Microsoft | McKinsey | AI | FinTech | Web3 | Services | CEO | Board Member | Transformation | Change | YPOer
AI-based welfare systems are not flawless. While tech companies announce their cheerful visions of social change, the world’s poorest are being harmed by software errors.
I have previously written at length about the merits of artificial intelligence in business, environmental protection and science. I believe that the balance of our efforts in this increasingly digitized world continues to be positive, delivering a net benefit to humanity.
All-encompassing digitization carries both benefits and risks. Much is being said about technology's potential to aggravate social divisions and set in motion processes that fly in the face of social justice. Many citizens of Australia, the UK and the Netherlands have learned this the hard way, grappling with failing welfare systems whose full automation unleashed a slew of defects. I would therefore like to take a closer look at several aspects of impaired digital systems that can wreak havoc on people's lives.
Welcome to a Kafkaesque dystopia
Flawed digitization of a welfare system can lead to existential dramas. Erroneous credit-history calculations, mistaken debt assessments and trumped-up underpayment claims can ruin the lives of entire families. Attempts to negotiate, talk and get to the bottom of problems tend to fail, because it is the machines that hold the key to the safe. Social workers are increasingly replaced by predictive analytics systems that assess a person's entitlement to benefits and judge whether they are honest or have been defrauding the state. Citizens who try to reach the person responsible for suspending the welfare payments they rely on to live often run up against a bot, or a call-center employee speaking to them from another continent. Such systems are Kafkaesque: their algorithms operate in secrecy, and citizens never suspect they are being scrutinized.
You are not entitled to social benefits
Recently, a fierce debate unfolded in Sweden on the automation of the social security system. The media reported the plight of thousands of unemployed people denied benefits due to system failure. The public officials who examined the algorithm found that 10 to 15 percent of benefit cancellation decisions had been wrong. The system designed to check citizen integrity had failed completely.
Researchers Anne Kaun and Julia Velkova, who have studied the digitization of welfare systems across Europe, concluded that "Swedish public administration has begun to replace people with algorithms to decide on everything from welfare payments to child support and sickness benefits. Citizens do not know if their decision has been taken by an algorithm or an officer."
Robodebt: a new category of debt
In 2011, Australia adopted an algorithm that compared the income welfare recipients reported to Centrelink with the income their employers declared to the tax authorities. Any discrepancies would be investigated by Centrelink employees, who were instructed to contact the individuals concerned when in doubt. Things changed in 2016, when the system was automated and the job of analyzing income discrepancies was taken over by algorithms. On detecting irregularities, the automated system would notify the person concerned by e-mail. Many Australians who received such messages were appalled: they would log into the government website only to find they were allegedly in debt. As it turned out, the algorithms had detected discrepancies between their income declarations and tax assessments. The supposed debts were enormous, amounting to tens of thousands of dollars. While benefit recipients insisted they had broken no rules, the algorithm claimed otherwise. In many cases the actual debt turned out to be not, say, $5,000, but closer to $50!

Following a wave of protests and legal challenges, the government shut the system down. Australian lawyer Philip Alston, the UN Special Rapporteur on extreme poverty, prepared a report for the United Nations on people pushed into extreme poverty and excluded from society by the erroneous decisions of digitized welfare systems. He shares the view of numerous people and institutions that recognize the growing severity and extent of the problem.
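To make the mechanism concrete, here is a minimal Python sketch of the kind of naive income-averaging check that was widely reported as the source of these false debts. It is an illustration under my own assumptions, not the actual Centrelink code; the function name, the 50 percent benefit-reduction rate and the figures are hypothetical.

```python
# Toy reconstruction of a naive data-matching check, NOT the real system.
# It spreads annual tax-office income evenly across the year and treats any
# fortnight where the person declared less than that average as a "debt".

FORTNIGHTS_PER_YEAR = 26

def naive_debt_estimate(annual_income_from_tax_office: float,
                        fortnightly_declarations: list[float],
                        benefit_reduction_rate: float = 0.5) -> float:
    """Flag a 'debt' by averaging annual income over every fortnight,
    ignoring that the income may have been earned in only part of the year."""
    averaged = annual_income_from_tax_office / FORTNIGHTS_PER_YEAR
    debt = 0.0
    for declared in fortnightly_declarations:
        # Hypothetical rule: any shortfall against the average counts as
        # under-declared income that supposedly inflated the benefit paid.
        under_declared = max(0.0, averaged - declared)
        debt += under_declared * benefit_reduction_rate
    return debt

# Someone who earned $26,000 in the first half of the year, then correctly
# declared zero income while on benefits, is still flagged as owing money.
declarations = [2_000.0] * 13 + [0.0] * 13
print(round(naive_debt_estimate(26_000.0, declarations), 2))  # spurious 6500.0
```

The point of the sketch is that a person can report every dollar truthfully and still be flagged, simply because averaging erases when the income was actually earned.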
Fight for the right to privacy
Another example I want to share comes from the Netherlands. Dutch legal organizations and foundations dedicated to protecting civil rights issued a damning opinion on the operation of the government's SyRI system. The system was in essence an algorithm designed to analyze the data of people living in specific areas, covering both urban districts and entire municipalities, in order to profile people who might theoretically defraud the welfare system. Significantly, selected groups of citizens were scrutinized in complete secrecy, which prevented them from defending themselves against being listed as suspects of alleged offenses. These legal challenges led the court to rule that the system, which could retrieve virtually any data on any inhabitant, was illegal as such. It was an unambiguous reaction to an emerging system of social surveillance.
Algorithms cause poverty?
The problem has also been investigated for some time by Human Rights Watch, which analyzed the impact of the automated Universal Credit system in the United Kingdom. "The government has put a flawed algorithm in charge of deciding how much money to give people to pay rent and feed their families. The government's bid to automate the benefits system – no matter the human cost – is pushing people to the brink of poverty," said Amos Toh, senior artificial intelligence and human rights researcher at Human Rights Watch. In June 2020, the UK Court of Appeal ordered the government to remedy the impact of the flawed algorithm on people receiving regular wages. There are many other examples of this kind. Welcome to a dystopia in which the protection of the state and its citizens is becoming complete fiction.
Welcome to Kafka's world
States keep pumping ever more funds into rollouts of technologies meant to replace outdated IT systems. They also hope such technology will add value by improving relations between citizens and institutions. So much for theory. In practice, many people who lack the most basic digital skills, or who have been blacklisted by a malfunctioning system, end up in trouble. Sadly, numerous politicians and lobbyists use technology as their selling point when persuading the authorities to adopt solutions for their alleged social benefits. The benefits never come, while the algorithms themselves become terrifying tools wielded by a heartless system.
So, what are our chances of protecting human autonomy and averting the devastating impacts of technology? I repeat: proper mechanisms to govern the use of modern technologies should be adopted on a timely basis and recognized as central to guiding technological development.
Works cited:
Algorithm Watch, "Sweden: Rogue algorithm stops welfare payments for up to 70,000 unemployed," Link, 2021.
Belot, Henry, "Centrelink's controversial data matching program to target pensioners and disabled, Labor calls for suspension," ABC News, 17 January 2017.
Algorithm Watch, "How Dutch activists got an invasive fraud detection algorithm banned," Link, 2021.
Human Rights Watch, "UK: Automated Benefits System Failing People in Need. Government's Flawed 'Universal Credit' Algorithm Pushing People into Poverty," Link, 2021.