Advancing AI with an Eye on Bias - Navigating the Complexities
Image: A human figure with robotic hands manipulates a glowing, brain-like network of interconnected nodes and circuits.

OpenAI has just released ChatGPT's latest version, GPT-4o (that's "o" as in Orange). It offers the same functionality as GPT-4 but with faster, more optimized performance. The new automated voice is a pleasant addition; I enjoy the sound and emotive qualities they've built into the system. I've been playing around with it, prompting it to write creatively, review my code, and handle and automate all kinds of administrative labor I try to avoid :)
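For a flavor of what that kind of automation can look like, here's a minimal sketch, assuming the OpenAI Python SDK (v1.x) and an API key in the environment; the helper name and prompt are my own illustration, not something from the release or the demo.

```python
# A minimal sketch of automating an administrative chore with GPT-4o,
# assuming the OpenAI Python SDK (v1.x) and OPENAI_API_KEY in the environment.
# The function name and prompt are illustrative, not from the article.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def draft_status_update(notes: str) -> str:
    """Ask GPT-4o to turn rough meeting notes into a short status email."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "You write concise, friendly status emails."},
            {"role": "user", "content": f"Turn these notes into a short update:\n{notes}"},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(draft_status_update("Demo went well; accessibility review scheduled for Friday."))
```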

If you haven't already, you should watch the Spring Update demo to see how people are using this tech and the assumptions behind its design. One particular incident stood out during the demo. A researcher was doing a breathing exercise and began purposely breathing in an exaggerated manner. ChatGPT told him to calm down, joking that he "wasn't a vacuum." Later, after GPT-4o sang (yes, it sings now), researchers demonstrated the vision component of the LLM by having ChatGPT determine a user's emotion based on their smile. Pretty cool, huh? Sure, but (and it's a big BUTT...er...a...BUT)...

What if your expressions don't fit expected patterns? How will facial recognition, voice detection, and similar functionalities be effective and equitable for everyone, and accurately reflect diversity in user behaviors and characteristics?

Sheri Byrne-Haber, an award-winning values-based engineering, accessibility, and inclusion leader, and one of the most prolific writers I know, wrote about similar concerns in 2019. In her article Disability and AI Bias, Sheri noted that while concerns about gender and race discrimination in facial recognition technology are common, fewer people are examining AI bias against people with disabilities. She further argued that this bias is even more problematic because the characteristics of disabilities are so diverse.

We've programmed these biases into all of our software: large language models (LLMs), self-driving cars, image creators, voice-changing apps, and the like. Furthermore, as we do so, we are actively defining the manner in which people must use technology without always recognizing how people can use technology. Keep in mind, I don't believe people as a whole are going out of their way to be biased or to exclude folks who are not exactly like them. In many cases it's an afterthought for people individually and for companies professionally. That kind of lack of forethought has consequences.

As Sheri mentioned in the same article, in May 2019 San Francisco passed an ordinance banning the city government from using facial recognition technology, mainly due to concerns about gender and race discrimination. One frequently cited reason for the higher rates of false positives for women, especially women of color, is the lack of robust data sets. If we think about these concerns, the issue for people with disabilities is much worse: there are fewer people with disabilities compared to women and people of color, and their characteristics vary widely.
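To make that data-set concern concrete, here's a minimal sketch of the kind of per-group audit a team could run on something like a face-matching model. The group labels, records, and function name below are invented purely for illustration; a real audit would use your own evaluation set and demographic annotations.

```python
# A minimal sketch of a per-group error audit for a classifier such as a
# face-matching model. The records below are made up for illustration only.
from collections import defaultdict

# Each record: (group label, ground truth is a match?, model predicted a match?)
evaluation_results = [
    ("group_a", False, False),
    ("group_a", False, False),
    ("group_a", True,  True),
    ("group_b", False, True),   # a false positive for an under-represented group
    ("group_b", False, False),
    ("group_b", True,  True),
]


def false_positive_rate_by_group(results):
    """Return {group: FP / (FP + TN)}, i.e. how often non-matches are wrongly accepted."""
    false_positives = defaultdict(int)
    negatives = defaultdict(int)
    for group, truth, predicted in results:
        if not truth:
            negatives[group] += 1
            if predicted:
                false_positives[group] += 1
    return {g: false_positives[g] / negatives[g] for g in negatives if negatives[g]}


if __name__ == "__main__":
    for group, rate in false_positive_rate_by_group(evaluation_results).items():
        print(f"{group}: false positive rate = {rate:.0%}")
```

When one group contributes only a handful of records, its error estimate is both noisier and more likely to hide systematic failures, which is exactly the problem for small, highly varied populations such as people with disabilities.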

Now, the good news :) There are so many ways companies can create and implement truly inclusive and equitable technology.

  1. Think equitably. Include people with disabilities when thinking about designing an app. By considering diverse user needs from the start, you create more inclusive and user-friendly applications that cater to a wider audience, increasing user satisfaction and broadening your market reach. The result: apps that are accessible from the outset reduce the need for costly retrofits and redesigns, ensuring smoother user experiences and fostering brand loyalty.
  2. Implement methodologies and processes that include testing tools and software for use by persons with disabilities. Allocate time for accessibility testing, and make it part of UX design (a small automated check is sketched after this list). Use screen readers, magnify the screen, and operate the computer without a mouse. This approach identifies potential barriers early in the development process, allowing for timely adjustments and improvements, and over time it will result in enhanced usability for all users, compliance with legal accessibility standards, and a reputation for social responsibility and inclusivity.
  3. Test often and test diversely. Real-world feedback from diverse users ensures the software meets a wide range of needs and scenarios, resulting in products that are robust, versatile, and capable of serving users equitably, and such products are likely to earn higher adoption rates and positive user reviews.
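As promised in point 2, here's a tiny, illustrative accessibility smoke test that uses only Python's standard library to flag images missing alt text. It's the kind of check you could fold into CI as a first line of defense, not a replacement for testing with screen readers, magnification, and keyboard-only navigation.

```python
# A tiny, illustrative accessibility smoke test: flag <img> tags with no alt text.
# Standard library only; a starting point, not a substitute for human testing.
from html.parser import HTMLParser


class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            if not attr_dict.get("alt"):
                self.missing_alt.append(attr_dict.get("src", "<unknown src>"))


def check_alt_text(html: str) -> list[str]:
    """Return the src of every <img> that has no (or an empty) alt attribute."""
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.missing_alt


if __name__ == "__main__":
    sample = '<img src="chart.png"><img src="logo.png" alt="Company logo">'
    print(check_alt_text(sample))  # -> ['chart.png']
```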

Remember that on the other side of exciting new features and improvements, there are also ongoing concerns about representation. Moving forward, I believe it's crucial to strive for innovations inclusive of the diversity of all users. Doing so not only enhances user experience, but also fosters inclusive and socially responsible implementation of tech across the industry.

ChatGPT-4o, sing that to me in a pleasant baritone voice, please. :)
