Using ChatGPT to improve accessibility and inclusion
Ray Newman
Lead content design consultant at SPARCK | editor | copywriter | content marketing | writes ghost stories for fun
After a long period of experimentation I have found myself using ChatGPT for a small number of tasks – all of which are more administrative than creative.
First, there’s writing alt text. This became possible with GPT-4, which allows images to be uploaded alongside prompts.
I still write most of my alt text, or image descriptions, manually. It’s usually quicker and easier to do so than to fire up ChatGPT. And, with ethics and the environmental costs of generative AI in mind, I try not to use it casually or carelessly.
In a few cases, though, I’ve come across an image sufficiently complex that describing it meaningfully feels like a challenge. Especially if I’m pressed for time.
In the past, that might have caused me not to post the image at all, but now I can upload it to ChatGPT and prompt “Please write alt text for this image.”
Here’s an example – what it came up with for the main image illustrating this article:
A street scene showing two large advertisements for food. The left side of the image features a close-up of a burger with melted cheese, pickles, and a beef patty against a red background. The right side shows a storefront with a poster advertising sandwiches and a cash machine offering free cash withdrawals from Sainsbury's Bank. The pavement in front of the storefront is empty.
You can also prompt it to tweak your alt text in various interesting ways.
Ethical concerns versus accessibility
Alt text is important. It's how we make sure everyone is included and can engage with our content, whether it's important, artistic, or totally trivial.
But the fact is that, at present, many people don’t add alt text because (a) they’re not confident writing descriptions; (b) they regard it as too much trouble; or (c) they simply haven’t formed the habit.
If ChatGPT or similar tools make it easier for people to add alt text to images, the benefits in terms of accessibility and inclusion arguably outweigh the ethical concerns.
My guess is that alt text will soon be automated using generative AI in, for example, social media apps and content management systems. It’s important that a human reviews the alt text, of course – and the more abstract the image, the more likely it is to be wrong.
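To make the idea concrete, here is a minimal Python sketch of how a CMS upload hook might request draft alt text from a vision-capable model. The model name, prompt wording, and helper function are illustrative assumptions, not a description of any existing product; the point is that the suggestion should surface as an editable draft for a human to review, never as a final value.

```python
# Sketch: building a request for draft alt text from a vision-capable model.
# Model name, prompt, and helper are illustrative; a human should always
# review the output before publishing.

def build_alt_text_request(image_url: str, model: str = "gpt-4o") -> dict:
    """Build a chat-completion payload a CMS might send to request alt text.

    The payload pairs a plain-text instruction with the image URL, mirroring
    the manual prompt "Please write alt text for this image."
    """
    return {
        "model": model,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text",
                     "text": "Please write alt text for this image."},
                    {"type": "image_url",
                     "image_url": {"url": image_url}},
                ],
            }
        ],
    }

# A CMS could pass this payload to an API client on image upload, e.g.
#   client.chat.completions.create(**build_alt_text_request(url))
# and present the response as a suggestion in the alt-text field.
payload = build_alt_text_request("https://example.com/street-scene.jpg")
print(payload["messages"][0]["content"][0]["text"])
```

The key design choice is that the model's output pre-fills a field the editor can correct, which lowers the barrier to adding alt text without removing the human review step.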
For more on using ChatGPT to write meaningful alt text, check out a recent post by my friend Tom Goskar exploring its uses in the context of archaeology and museum curation.
Filling in forms
A popular low-key deceptive service pattern is forcing users to enter free text into form fields: the obstacle deters people, reducing the number of requests the service has to handle.
For example, my local council has an excellent service for reporting illegal rubbish dumping on the streets (fly tipping). But there is a form to fill in every single time, for every single discrete instance of rubbish being dumped.
This is a pain and, while I’m sure it’s there for good reasons, including to prevent spam reports, it does make me less likely to complete a report.
Yesterday, I tried an experiment: I took a photo of some rubbish and asked ChatGPT to write the description for the form.
ChatGPT: “The rubbish includes several black bin bags, an orange tarp, and a pile of shopping trolleys. The items are located on the pavement next to the wall of a building, creating an obstruction and an eyesore.”
I don’t intend to do this often – only when describing a particularly complex pile of rubbish temporarily defeats me. But this could open up the option of reporting to people less confident with words, or less confident with English.
And, again, I can imagine this working well as a button on the form itself, perhaps using a model trained on images of incidents of fly tipping.
Opting out of AI
Finally, and slightly mischievously, I’d like to suggest that ChatGPT can help us opt out of having our data used to train AI tools.
Specifically, I’m thinking of another deliberately annoying form used by Meta (Facebook) to make it harder for people to opt out of their data being used to train AI.
It requires users to provide a written statement explaining why they don’t want their data to be used. For many people, this will be a difficult enough task that they just think, “Oh, never mind”, and passively submit.
But ChatGPT can help:
“I don't want my Facebook data used to train AI because it involves my personal information, which I value and want to keep private. I am concerned about how my data might be used, shared, and potentially misused without my explicit consent. Additionally, I am wary of the security risks and potential biases that could arise from my data being included in AI training sets. I believe in maintaining control over my personal information and ensuring that it is not utilised in ways that could affect my privacy and autonomy.”
As it happens, I wrote my own objection, and doing so helped me think about the problem. But in many cases, filling in forms isn't a learning journey – it's a pain in the neck.
If ChatGPT or similar tools can shortcut that, and empower users, that's definitely something for the plus column.