#68: It’s all about the prompt
Ewan Dalton
I help Microsoft Partners with strategy, GTM, roadmap & sales alignment | ex-MSFT | IAMCP UK board
When internet search engines took off in the mid-90s – remember AltaVista? – and Google exploded into the public consciousness in the early 2000s, it became increasingly apparent that getting good search results was helped by being able to ask your question correctly.
Savvy searchers might use a combination of quotes and other “operators” to specify an exact phrase, or guide the search engine to include only certain terms or results from a particular website (such as site:tipoweek.com onenote). Google and Bing both tend to use the same operators (so, as Scott Hanselman would say, you could “Google with Bing”).
Prompting Today
When using any of the many AI tools – ChatGPT, Copilot, Gemini and so on – you can get very relevant results by being quite specific in what you ask it to do. As an example, one of the best ToW banner images was created using Microsoft Designer with the prompt, "a serene image of a young boy sitting at an old laptop (with Windows 10) but lurking in the dark background is the grim reaper"
Or, getting much more detailed, see Kat Beedim’s detailed 200+ word instructions to create consistently-formatted notes from meeting transcripts.
Being much more verbose and directional than you’d ever be in a regular search engine can give some quite remarkable results. The order in which you ask for things can vary the emphasis given to certain parts of the response, and the general advice is to be positive – i.e. ask for the things you want, rather than telling it what you don’t want.
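That ordering-and-positive-framing advice can be captured in a tiny helper. This is a toy Python sketch – the function and its phrasing rules are illustrative assumptions, not part of any Copilot or Designer API:

```python
def build_prompt(subject: str, must_have: list[str], style: str = "") -> str:
    """Assemble an image prompt that leads with the most important
    element (the subject gets the emphasis) and states only what we
    *want* to appear -- positive instructions, never "do not draw X"."""
    parts = [subject]  # most important element goes first
    parts += [f"featuring {item}" for item in must_have]
    if style:
        parts.append(f"in a {style} style")
    return ", ".join(parts)


# Positive framing: describe what the hooded figures ARE holding,
# rather than asking the model not to draw guns.
prompt = build_prompt(
    "a young woman walking down a city street at dusk",
    ["sinister hooded figures in the background holding lanterns"],
    style="moody, cinematic",
)
print(prompt)
```

The point of the design is simply that the finished prompt never contains a negation for the model to latch onto – the "white bear" never gets mentioned at all.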
It seems that AI can suffer from a variant of Dostoevsky’s “White Bear Problem” – i.e. asking it not to do something increases the likelihood of it doing just that. Not long after Microsoft went big on Copilot and Designer, here’s one example where Copilot was asked to draw an image on a particular topic…
The idea was to convey a background threat with those hooded figures, not the feeling that the poor girl was in imminent peril. The figures lurking in the background might be a mite less sinister if they weren’t armed, so clarification was called for…
Maybe DALL-E 3 at that time was just fixated with firearms, or asking it not to do something was a step too far. We’ve gone from “some guns” to “pointing guns at her”. Hmmm.
Trying the same prompt in Designer today seemingly gets a little less gun-heavy, but still has the odd one creeping in. Trying to be more explicit doesn’t appear to work either – like adding to the end of the prompt, “The sinister hooded figures are not carrying guns of any kind”.
You might think that instruction is simple enough, but no. It seems to be interpreted as “you want more guns? Gotcha”.
Further prompting
See here for some more tips on Copilot, or take a look at some pearls from the Copilot support team. Also, look out for some more in-depth instructions on using ChatGPT.
For business users with Microsoft 365 Copilot, the Copilot Prompt Gallery is worth a play.
For other Copilot ideas, check out Chris Stuart Ridout talking about Prompt Buddy, a Teams app which lets users share good prompts with others in the company.
Healthcare Account Executive (London) at Microsoft
16 hours ago: I did eventually get it to swap out the aforementioned firearm. Make dental cavities, not war...
Healthcare Account Executive (London) at Microsoft
16 hours ago: Fixed it for you. Ran your prompt through my recently created (and somewhat popular) Copilot Promptimiser Agent and asked how to ensure the image did not feature any guns or convey that the lady was in genuine peril. It suggested specifying what the subjects in the picture should be holding. Hey presto, a less violent but somehow equally creepy result... and only one gun!