This is the result of trying to code away bias. Apparently, when hard-coding Google's image generator to be less racist and to generate more diverse groups of people, no one considered that there are some situations in which it would NOT be appropriate for a group of people to be racially diverse, and in which it might actually be LESS appropriate if they were! The road to hell and all that... https://lnkd.in/gZzyEM3T
We are on the cusp of a technological revolution that is akin to letting the inmates out of the asylum all at the same time. Every image or video will be open to manipulation by almost anyone, at any time, for any reason. We have gone from skilled designers using Photoshop to enhance magazine covers and ads - to a carte blanche technology easily used by anyone, including politicians / criminals / artists / children / etc . . . Buckle up . . . it will be a rough ride.
100%! "Unintended consequences" -- not to mention an enormous waste of energy . . . bias very likely cannot be coded away.
This is the real moral of this story.
Too many edge cases to fix this via hard-coding; this is a design problem.
Tackling bias in AI, especially in image generation, is a complex task. Google's effort to promote diversity shows commitment, yet underscores the importance of context. It's a delicate balance, aiming for inclusivity while respecting historical and cultural accuracy, I'd say.