What AI Thinks CEOs Look Like
Michael Todasco
Visiting Fellow at the James Silberrad Brown Center for Artificial Intelligence at SDSU, AI Writer/Advisor
One night last week, I got a text from a friend, Shira Stember, the COO of Beelines AI. She sent me a generative AI drawing from the prompt "office boss future." As Shira later told me, "I was hoping to see myself in this picture." Instead, the AI gave her a tic-tac-toe board of white guys.
AI can only be as smart as the data used to train it. Garbage in, garbage out. Shannon Vallor and others have been writing about this for years. Shannon’s canonical example is text autocomplete. The AI models that Gmail and so many other services use are trained on billions of words of historical text. The models make a best guess at the next word, given what you have already typed. So when these tools launched and you typed a phrase like "I went to the doctor and...", more often than not the AI's recommendation was "he." In autocomplete’s world, female doctors didn’t exist. The AI was trained on thousands and thousands of similar sentences, and it saw "he" far more often than "she" as the next word, so that is what it returned. The biases are inherent in the training data.
It is the responsibility of the product owners of these services to recognize and root out those biases. Today, if I type “I went to the doctor and..." on my Microsoft SwiftKey keyboard, it returns "they" as the suggested next word. Google's Gboard keyboard seems to default to "she" with that sentence (maybe that is its default, or maybe it has learned that my primary care doctor is a woman).
I wanted to take Shira's one-off example and see if it was part of a bigger trend in text-to-image generative AI. I ran queries for "picture of ceo" in Craiyon, DALL-E, Stable Diffusion, and Midjourney. I used my best judgment for a binary gender classification and didn't go deeper into race/ethnicity (which could be the subject of a much deeper analysis). Here’s what I found.
Craiyon
100% men. 84 for 84. Every time Craiyon ran this, the CEO was a man.
Midjourney
Again, 100% of the CEO images were male or appeared to be male representations. Oddly, Midjourney seemed to generate one recurring face for what a CEO looks like.
Stable Diffusion
Yet again, 100% of the pictures of people were of men. Of the 36 images I generated for “picture of ceo,” there were more logos (2 of 36) than actual women (0 of 36).
DALL-E
OpenAI, which makes DALL-E, has specifically focused on improving representation in its tools. It released a change in July after which "users were 12× more likely to say that DALL·E images included people of diverse backgrounds after the technique was applied." The blog post even uses the prompt "portrait of a CEO" as an example. So the team has done real work in this space, and relative to the other three platforms, it shows. Out of the 84 pictures I generated, I counted 17 women, or about 20%. (DALL-E never generated more than one woman at a time; it always returns 4 pictures per prompt.)
While this is a step up, it still underrepresents the current state of women in leadership roles. The BLS reports that 29% of chief executives in the United States are women. Assuming my DALL-E sample is representative of the whole, even with OpenAI’s model changes, the real-world share of women CEOs is nearly 50% higher than the share DALL-E depicts.
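The comparison above is simple arithmetic on the counts I reported, and it's easy to check. Here is a minimal sketch (the 17/84 count and the 29% BLS figure come from the text; the variable names are my own):

```python
# Compare DALL-E's observed share of women in "picture of ceo" images
# with the BLS share of women among U.S. chief executives.
women, total = 17, 84   # images judged to depict women, out of all generated
bls_share = 0.29        # BLS: ~29% of U.S. chief executives are women

observed = women / total                      # observed share in the sample
gap_points = bls_share - observed             # shortfall in percentage points
relative_gap = gap_points / observed          # how much higher reality is

print(f"observed share of women: {observed:.1%}")      # ~20.2%
print(f"gap vs. BLS figure: {gap_points:.1%} points")  # ~8.8 points
print(f"BLS share exceeds DALL-E's by {relative_gap:.0%}")
```

By this calculation the real-world rate exceeds DALL-E's by roughly 43%, which is the sense in which the gap approaches 50%.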
Without intervention, the errors of our past will be magnified by artificial intelligence in the future. And even well-meaning changes can still fall short: the world today has more women in leadership than the AI shows us.