The Cost of Bias: Another Personal Encounter with AI-Generated Image Tools

If you had told me I would pay for an AI-powered image and photo enhancer and receive such subpar outputs, I would never have believed you. Like many, I assumed a correlation between a paid service and quality. But as I learned at the Stanford University Graduate School of Business LEAD program: “Correlation is not causation.”

This morning, as part of my work on digital colorism (digitalcolorism.com), I tested Remini AI, and I am still struggling to find the right words. Frustration and distress barely scratch the surface.

These are the inputs that I always use to compare those AI-generated image tools:

These are the outputs I received from Remini AI:


For months, I’ve been advocating against the social, societal, and cultural risks of identity erasure posed by biased AI tools—but this experience was painfully personal.

Let me break it down:

Psychological risks: AI outputs like these can perpetuate body dysmorphia, diminish self-esteem, and alienate individuals from their own sense of identity.

Physiological risks: By promoting unattainable beauty standards, tools like these can encourage harmful body alteration practices such as skin bleaching or extreme cosmetic surgery.

Social risks: AI tools often reinforce implicit biases, associating whiteness with beauty, success, and professionalism, further marginalizing people of color.

Societal risks: These outputs contribute to the homogenization of beauty, erasing diversity and individuality. Over time, this dehumanization undermines the very fabric of inclusion.

Cultural risks: They disregard and distort cultural identity by failing to authentically represent diverse features, traditions, and heritage.

I am a dark-skinned Black woman. I was intentional in providing input pictures that accurately reflected my skin tone, yet the outputs I received were shockingly unrecognizable. My features were altered, my skin lightened, and my identity diminished.

What makes this experience even more disheartening is the false advertising. The app prominently features images of Black women in its promotions, creating the illusion of inclusivity. But the reality of its outputs tells a different story—a story of exclusion and erasure.


This experience isn’t just about me. It’s about all of us who deserve to see ourselves authentically reflected in the tools we use. It’s about demanding accountability from AI companies. Tools like these have the power to shape societal norms and cultural perceptions—whether for better or for worse.

At Digital Colorism, my mission is to shed light on these risks and help AI companies measure and mitigate bias through frameworks rooted in fairness and inclusion. This fight is personal, but it’s also necessary. Because if AI tools can’t represent us accurately, they don’t deserve to define us.

Let’s demand better. Together.

Christelle Mombo-Zigah (wife, mother, Black woman in tech, immigrant with an accent, lifelong learner, AI/GenAI/Responsible AI specialist, and Founder of Digital Colorism, an impact audit that evaluates and mitigates bias in AI image-generation tools across six key factors: colorism, texturism, featurism, ageism, sizeism, and accessibility, empowering companies to identify and address bias in their tools).



