Left behind: how the images and narratives about AI exclude the rest of us

Published by BusinessDay: https://www.businesslive.co.za/bd/opinion/columnists/2023-02-01-johan-steyn-left-behind-how-the-images-and-narratives-about-ai-exclude-the-rest-of-us/

In the modern age, the narratives we learn to live by often come to us from outside our cultural groups. In a techno-globalised world, and an era some call post-truth, we are bombarded with online images, of technology in particular, that shape how we understand our place in the world alongside rapid advances such as artificial intelligence (AI).

I wrote a few weeks ago about OpenAI’s DALL-E text-to-image platform (“AI-generated images reflect its biases”, January 11). I concluded that platforms such as those, because of the inevitable cultural stereotypes and biases unwittingly programmed into them, “focused on producing images of Caucasian men, especially as it relates to concepts around intelligence, leadership and capability”.

The authors of a brilliant and timely book on the subject, AI Narratives: A History of Imaginative Thinking about Intelligent Machines, aim to answer the question of how visual and cinematic representations of AI technologies might reflect and contribute to how people understand and relate to the technology. “Narratives of intelligent machines matter because they form the backdrop against which AI systems are being developed, and against which these developments are interpreted and assessed.”

When we do an internet search on AI, or related terms such as automation, robot, android or cyborg, we are very likely to find images that largely reflect a specific cultural stereotype: that intelligence is associated with “whiteness” and men.

This became evident to me recently when I was helping my eight-year-old son with a school project on the topic of whether robots are our friends or not. I am a Caucasian man and my son, who is adopted, is of African descent.

We were looking to find images of robots through a Google search when he asked me a surprisingly interesting question. “Daddy, why do all the robots look like you and not like me?”

The authors of the article “The Whiteness of AI” say that “stock images of AI, at least when anthropomorphised, are overwhelmingly white … The more realistically humanoid these machines become, the more Caucasian in their features.”

Intelligent technology has been portrayed cinematically in films such as The Terminator, RoboCop, Blade Runner, I, Robot and others as androids resembling Caucasian males.

“We argue that this Whiteness both illuminates particularities of what (Anglophone Western) society hopes for and fears from these machines and situates these effects within long-standing ideological structures that relate race and technology.”

The authors write about decolonising AI, “a process of breaking down the systems of oppression that arose with colonialism and have led to present injustices that AI threatens to perpetuate and exacerbate”.

For AI technologies to be representative of all people, to be democratised, our narratives need to be re-examined and changed. The influence of minority groups, different ethnicities and gender representation is important, especially as we endeavour to minimise biases in the data sets we use.

AI is the most powerful technology we have ever created. Some even call it “our final invention”. It is imperative that our thinking about it encourages a wide variety of people from across the world to work at shaping how it will affect our children. No one should be left behind.

• Steyn is on the faculty at Woxsen University, a research fellow at Stellenbosch University and founder of AIforBusiness.net
