Using ChatGPT to improve accessibility and inclusion
Images without a single obvious subject, like the street scene above, can be difficult to describe in words.

After a long period of experimentation I have found myself using ChatGPT for a small number of tasks – all of which are more administrative than creative.

First, there’s writing alt text. This became possible with GPT-4, which allowed images to be uploaded alongside prompts.

I still write most of my alt text, or image descriptions, manually. It’s usually quicker and easier to do so than to fire up ChatGPT. And, with ethics and the environmental costs of generative AI in mind, I try not to use it casually or carelessly.

In a few cases, though, I’ve come across an image sufficiently complex that describing it meaningfully feels like a challenge. Especially if I’m pressed for time.

In the past, that might have caused me not to post the image at all, but now I can upload it to ChatGPT and prompt “Please write alt text for this image.”

Here’s an example.

Torn posters on a wall creating a sort of collage, described in more detail in the text of the article.
A photo taken in a pedestrian underpass in Bristol, described below in various ways.


  • My draft alt text: “Torn and overlapping posters on a wall. There are jagged strips of white backing paper. On one side are parts of two heads, one with white hair and the other with black hair.”
  • ChatGPT: “A partially torn and layered collage featuring fragments of photographs. The left side of the image is dominated by black and yellow textured patterns, while the right side shows pieces of a human face, including an ear and part of a forehead, with visible signs of wear and tear. The overall effect is abstract and distressed, blending human elements with artistic textures.”

And here's what it came up with for the main image illustrating this article:

A street scene showing two large advertisements for food. The left side of the image features a close-up of a burger with melted cheese, pickles, and a beef patty against a red background. The right side shows a storefront with a poster advertising sandwiches and a cash machine offering free cash withdrawals from Sainsbury's Bank. The pavement in front of the storefront is empty.

You can also tweak the alt text in various interesting ways by adjusting your prompt:

  • Specify fewer than 75 characters: “A torn collage blending textured patterns and fragments of human faces.”
  • Specify that the text should convey why the image is interesting or beautiful: “A torn, layered collage featuring fragments of a photograph of two people. The image is obscured by a dramatic overlay of black and yellow abstract textures, creating a striking contrast between the familiar human elements and the chaotic artistic intervention. The visible details of faces and clothing are partially concealed, adding a sense of mystery and depth to the composition. The juxtaposition of the raw, torn edges with the smooth photographic surface makes the piece intriguing and visually compelling.”
  • Specify plain, simple language: “A ripped photo showing parts of two people's faces. One person has white hair. The photo has black and yellow paint on it. Some parts are missing. The edges are torn.”

Ethical concerns versus accessibility

Alt text is important. It's how we make sure everyone is included and can engage with our content, whether it's important, artistic, or totally trivial.

But the fact is that, at present, many people don’t add alt text because (a) they’re not confident writing descriptions; (b) they regard it as too much trouble; or (c) they simply haven’t formed the habit.

If ChatGPT or similar tools make it easier for people to add alt text to images, the benefits in terms of accessibility and inclusion arguably outweigh the ethical concerns.

My guess is that alt text will soon be automated using generative AI in, for example, social media apps and content management systems. It’s important that a human reviews the alt text, of course – and the more abstract the image, the more likely it is to be wrong.
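To make the automation idea concrete, here’s a minimal sketch of how a CMS might ask a vision-capable model for alt text when an image is uploaded. This only builds a chat-style request payload in the shape used by OpenAI-compatible APIs; the actual API call, and the human review step, are deliberately left out. The function name, parameters, and model name are illustrative assumptions, not a real CMS API.

```python
def build_alt_text_request(image_url, max_chars=None, plain_language=False):
    """Build a chat-style request asking a vision model for alt text.

    This is a sketch: it returns the request payload a CMS might send
    to an OpenAI-compatible endpoint, with optional prompt tweaks
    mirroring the ones discussed above (length limit, plain language).
    """
    instructions = ["Please write alt text for this image."]
    if max_chars:
        instructions.append(f"Keep it under {max_chars} characters.")
    if plain_language:
        instructions.append("Use plain, simple language.")

    return {
        "model": "gpt-4o",  # assumption: any vision-capable model would do
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": " ".join(instructions)},
                    {"type": "image_url", "image_url": {"url": image_url}},
                ],
            }
        ],
    }


# Example: a request for short, plain-language alt text
request = build_alt_text_request(
    "https://example.com/torn-posters.jpg",  # hypothetical image URL
    max_chars=75,
    plain_language=True,
)
```

The point of separating payload construction from the call is that the CMS can show the generated description in an editable field, so a human always signs off before the alt text is published.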

For more on using ChatGPT to write meaningful alt text, check out a recent post by my friend Tom Goskar exploring its uses in the context of archaeology and museum curation.

Filling in forms

A popular low-key deceptive service pattern is forcing users to enter free text into form fields: the obstacle reduces the number of requests the service has to handle.

For example, my local council has an excellent service for reporting illegal rubbish dumping on the streets (fly tipping). But there is a form to fill in every single time, for every single discrete instance of rubbish being dumped.

This is a pain and, while I’m sure it’s there for good reasons, including to prevent spam reports, it does make me less likely to complete a report.

Yesterday, I tried an experiment: I took a photo of some rubbish and asked ChatGPT to write the description for the form.

Black bin bags, abandoned shopping trolleys, and a mysterious pile of waste covered with an orange sheet.
A pile of rubbish on a street corner near my house.

ChatGPT: “The rubbish includes several black bin bags, an orange tarp, and a pile of shopping trolleys. The items are located on the pavement next to the wall of a building, creating an obstruction and an eyesore.”

I don’t intend to do this often – only when describing a particularly complex pile of rubbish temporarily defeats me. But it might open up the option of reporting to people who are less confident with words, or less confident writing in English.

And, again, I can imagine this working well as a button on the form itself, perhaps using a model trained on images of incidents of fly tipping.

Opting out of AI

Finally, and slightly mischievously, I’d like to suggest that ChatGPT can help us opt out of having our data used to train AI tools.

Specifically, I’m thinking of another deliberately annoying form used by Meta (Facebook) to make it harder for people to opt out of their data being used to train AI.

It requires users to provide a written statement explaining why they don’t want their data to be used. For many people, this will be a difficult enough task that they just think, “Oh, never mind”, and give up.

But ChatGPT can help:

“I don't want my Facebook data used to train AI because it involves my personal information, which I value and want to keep private. I am concerned about how my data might be used, shared, and potentially misused without my explicit consent. Additionally, I am wary of the security risks and potential biases that could arise from my data being included in AI training sets. I believe in maintaining control over my personal information and ensuring that it is not utilised in ways that could affect my privacy and autonomy.”

As it happens, I wrote my own objection, and doing so helped me think about the problem. But in many cases, filling in forms isn't a learning journey – it's a pain in the neck.

If ChatGPT or similar tools can shortcut that, and empower users, that's definitely something for the plus column.

Neil Hopkins

Senior strategic communications professional

6 months

This is outstanding, Ray - thank you!
