Data Colonialism in Healthcare
Bart De Witte
CEO & Co-Founder at Isaree (in Stealth) | Founder HIPPO AI Foundation | Keynote Speaker, Lecturer | Digital Health | Medical AI | Open Source
Google and its effort to use AI for good is ethical white-washing
The most controversial issue under discussion is the current and future convergence of big health data with artificial intelligence (AI). Most AI proponents, myself included, call it the most transformational technology of our time, one that offers unprecedented opportunities for public health. Applied to medicine, AI has the potential to lower the barriers to accessing diagnostics and care, making them available to all. In my call to action last year, and through the HIPPO AI Foundation, I focus on the opportunity to build medical AI as a global public good. If we don't, we risk becoming the dumbest generation since the invention of Gutenberg's printing press, and I will give some examples of why.
At the beginning of this year, Google announced that it is rolling out its AI-based medical diagnostic service at the Aravind Eye Hospital. The algorithm makes it quicker and easier for doctors to detect the risk of diabetic retinopathy. The Aravind Eye Hospital treats anyone who comes through its doors, with or without money. As nearly 70 million Indians are diabetic, and all of them are at risk of blindness, this might at first sound as if Google really is using AI for good. The collection of scans, and the further development of algorithms that detect anomalies in retina scans, could even help detect the risk of heart disease without having to perform a blood test.
I counter the AI-for-social-good view and call it a case of ethical white-washing. As a society, we are not considering the long-term risks of these interdependencies and of the privatisation of all future medical knowledge. I am surprised that the United Nations even supports these activities under its Sustainable Development Goals, because from my point of view there is nothing sustainable about this development in the long term. The continuous collection of data from human beings by private corporations would have been unthinkable 20 years ago. It is neither natural nor rational; I therefore reject the idea that the results of data processing are a naturally occurring form of social knowledge, rather than a commercially motivated form of extraction that advances particular economic and governance interests.
Data Colonialism
Nick Couldry, Professor of Media, Communications and Social Theory at the London School of Economics, wrote an excellent piece on data colonialism last year and published a book called The Costs of Connection. The statement of his that struck me most:
Just as historical colonialism paved the way for industrial capitalism, data colonialism is paving the way for a new stage of capitalism whose outlines we only partly see: the capitalization of life without limit. There will be no part of human life, no layer of experience, that is not extractable for economic value. Human life will be there for mining by corporations without reserve as governments look on appreciatively. This process of capitalization will be the foundation for a highly unequal new social arrangement, a social order that is deeply incompatible with human freedom and autonomy.
Nathaniel Raymond, an American human rights investigator, lecturer at the Jackson Institute for Global Affairs, and former director of the Signal Program on Human Security and Technology at the Harvard Humanitarian Initiative, formulated his view during the AI Now 2017 conference as follows: "We are engaging in non-consensual human experimentation on the most vulnerable people on the worst day of their life." His definition of colonialism, "the imposition of unequal, often extra-legal, power relationships on one group by another", seems to fit the Google Verily case in India perfectly.
Of course we should not reject data collection and use in all its forms, but we should stop celebrating so-called "AI for social good" and the ethical white-washing schemes of large corporations. We should reject the current form of resource appropriation, and the accompanying social order, that most contemporary data practice represents.
As we are currently building the foundation of our future healthcare systems, we should focus on using AI for the social good without further increasing inequalities in health and length of life, as I argued during my TEDx Talk last June in Berlin.
FIAP, FBCS, Professor, Director
4 yr: Big data is a tale invented by fat cats to sell their equipment. Those who are promoting this are most likely prepaid or used. All we need is knowledge and truth; I have not seen routers and data clouds inside human heads. Besides... the truth is written by letters, not numbers.
CEO & Co-Founder at Isaree (in Stealth) | Founder HIPPO AI Foundation | Keynote Speaker, Lecturer | Digital Health | Medical AI | Open Source
5 yr: Sure - Babajide Owoyele
AI for Finance & Biotech
5 yr: That's cool, I also wrote about data colonialism, although I am more optimistic about data resistance: https://medium.com/hackernoon/how-ai-startups-must-compete-with-google-reply-to-fei-fei-li-35dda19c8a3f
Helping organizations successfully navigate their information technology initiatives
5 yr: Another perspective on the whitewashing topic is the embrace of Open Source as if it implied Open Knowledge by itself. Logic and data are two sides of the same coin that comprise our digital knowledge infrastructure. It is mistaken to think that advocating for or contributing to Open Source alone will move the needle on the permissionless innovation so badly needed in healthcare.