AI Algorithms Objectify Women's Bodies
Photo credit: Christina Morilla via Pexels

In February, a Guardian investigation revealed gender bias in the AI algorithms used by social media platforms, resulting in the suppression of photos featuring women's bodies. The algorithms rate photos of women in everyday situations as more sexually suggestive and racy than comparable photos of men, hurting female-led businesses and perpetuating societal disparities.

The issue extends even to medical images: breast examination photos have been labeled as explicit nudity or sexual content. The result is the suppression of countless images featuring women's bodies, which amplifies existing disparities and raises questions about the accuracy and objectivity of the algorithms used to analyze content.
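Tools of this kind are typically commercial image-classification APIs that return a "raciness" likelihood for a photo. As an illustration only, here is a minimal Python sketch that assumes Google Cloud Vision's SafeSearch feature and the official google-cloud-vision client; the image file names are hypothetical, and this is not necessarily the tooling the Guardian tested.

```python
# Minimal sketch: querying a commercial image-analysis API for "raciness" scores.
# Assumes the official google-cloud-vision client and valid credentials;
# the image paths below are hypothetical.
from google.cloud import vision


def safe_search_scores(image_path: str) -> dict:
    """Return the SafeSearch likelihood labels assigned to one image."""
    client = vision.ImageAnnotatorClient()

    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())

    annotation = client.safe_search_detection(image=image).safe_search_annotation

    # Each field is a Likelihood enum, from VERY_UNLIKELY to VERY_LIKELY.
    return {
        "adult": vision.Likelihood(annotation.adult).name,
        "racy": vision.Likelihood(annotation.racy).name,
        "medical": vision.Likelihood(annotation.medical).name,
    }


if __name__ == "__main__":
    # Hypothetical file names: the reported pattern is that the same everyday
    # scene scores "racier" when the subject is a woman than when it is a man.
    for path in ("woman_workout.jpg", "man_workout.jpg"):
        print(path, safe_search_scores(path))
```

Scores like these feed directly into downstream moderation and reach-limiting decisions, which is why a systematic skew against images of women matters.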

Social media platforms, including Instagram and LinkedIn, leverage these biased AI algorithms to limit the reach of images deemed too racy. Shadowbanning, the practice of suppressing a post or account without the user's knowledge, has been linked to biased algorithms. Meta's Oversight Board has called for clearer guidelines to ensure equal treatment of all individuals after finding that “Meta’s policies on adult nudity result in greater barriers to expression for women, trans, and gender non-binary people on its platforms. For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected.”

The lack of transparency surrounding content moderation and shadowbanning exacerbates the harm caused by biased algorithms. Photographers and content creators, such as Bec Wood, have seen their work suppressed, with devastating consequences for their livelihoods. Individuals may not even be aware that their posts or accounts are being suppressed, leaving them marginalized and with a sense of invisibility.

“This is just wild,” said Leon Derczynski, a professor of computer science at the IT University of Copenhagen, who specializes in online harm. “Objectification of women seems deeply embedded in the system.”


Lack of Diversity in Training

Bias in AI algorithms can arise during their development, training, and decision-making processes. The data used to train algorithms can introduce biases that are replicated and reinforced over time. That photos of women working out are tagged as racy suggests the labelers, possibly straight men, brought subjective views of what counts as racy content. UNESCO reports that only 12 percent of AI researchers are women, and that they face significant obstacles, such as being 13 times less likely than men to file ICT patents. The underrepresentation of women in AI shapes the very technologies they help build.

Biases can arise from data collection, from the algorithms themselves, and from the assumptions of the programmers who build them. Increasing women's participation in STEM education and careers, developing support structures, and enforcing zero-tolerance policies for gender-based violence in the workplace are all essential. AI can also be put to positive use, for example through gender decoders that flag coded language in job postings (a sketch of such a decoder follows below), and by adopting a human-centered AI approach that prioritizes user needs and incorporates fundamental-rights impact assessments.
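A gender decoder of the sort mentioned above scans a job advertisement for masculine- and feminine-coded wording so recruiters can rebalance the text before posting. Here is a minimal sketch; the word lists are short, illustrative excerpts rather than the full research-based lists real tools use.

```python
# Minimal sketch of a gender decoder for job ads: counts masculine- and
# feminine-coded words. Word lists here are illustrative excerpts only.
import re
from collections import Counter

MASCULINE_CODED = {"competitive", "dominant", "assertive", "independent", "ambitious"}
FEMININE_CODED = {"collaborative", "supportive", "nurturing", "interpersonal", "loyal"}


def decode(ad_text: str) -> dict:
    """Count coded words and report which way the ad leans."""
    words = re.findall(r"[a-z']+", ad_text.lower())
    counts = Counter(words)
    masculine = sum(counts[w] for w in MASCULINE_CODED)
    feminine = sum(counts[w] for w in FEMININE_CODED)
    if masculine > feminine:
        verdict = "masculine-coded"
    elif feminine > masculine:
        verdict = "feminine-coded"
    else:
        verdict = "neutral"
    return {"masculine": masculine, "feminine": feminine, "verdict": verdict}


if __name__ == "__main__":
    ad = "We want a competitive, ambitious engineer who is also collaborative."
    print(decode(ad))  # {'masculine': 2, 'feminine': 1, 'verdict': 'masculine-coded'}
```

The point of such a tool is not to police language but to surface patterns a human writer can then review and adjust.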

To ensure ethical AI, it is necessary to take an intersectional approach and consider factors like gender, race, ethnicity, and socioeconomic status. Stakeholders from various sectors, including businesses, tech companies, academia, UN entities, civil society organizations, and the media, should collaborate to develop joint solutions. The proposed Global Digital Compact, scheduled to be agreed upon in September 2024, presents an opportunity to address gender biases perpetuated by AI and establish normative frameworks for ethical AI governance.


Gendering AI Tools

AI tools are often gendered, reinforcing societal stereotypes and hierarchies. Virtual assistants like Amazon's Alexa, Microsoft's Cortana, and Apple's Siri were initially given default feminine voices and designed with submissive and helpful personalities. This feminization of AI perpetuates gendered divisions and can further reinforce occupational roles that align with preassigned gender roles. For example, robots in security-related jobs are often portrayed as "male," while robots in hospitality and customer service industries are frequently depicted as "female."

Feminized and domesticated forms of AI also perform affective labor traditionally associated with women, such as managing emotions and performing tasks like scheduling, reminding, and seeking information. However, the portrayal of virtual assistants as unaffected by stress or external factors perpetuates a fantasy that diverges from the lived realities of women. Moreover, the humanization and objectification of feminized AI can mirror gender-based violence and harassment, as evident from instances of verbal harassment directed at virtual assistants like Alexa.

Addressing gender bias in AI algorithms is critical for promoting diversity, fairness, and inclusivity. By eliminating explicit and implicit biases, AI can become a powerful tool for the common good, fostering gender equality and social progress. But until models are retrained with these biases in mind, AI will continue to exacerbate the very real bias that already exists in our data sets.

#AIAlgorithms #GenderBias #SocialMedia #ContentModeration #Shadowbanning #AlgorithmicBias #TrainingData #Transparency #Marginalization #DigitalInclusivity

Emily Capps

Super senior writer. Human in the mix.

1y

Sigh. Not at all surprising.

Ed Wiley, PhD

Veteran CTO/CIO with >25 years experience in Engineering & AI leadership | Author of forthcoming (Q4 2024) book, "AI: From Buzzword to Business Function. Bringing Traditional and Generative AI to the Enterprise."

1y

Thanks for the share, Sarah Woodward. The talk of "AI -> EXTINCTION" is pretty loud, but problems such as these are the issues that should be receiving the lion's share of our attention right now...
