A.I. doesn't guess, it 'Danotes'
Note: This is an exercise in unpacking our inclination to anthropomorphize, not an attempt to coin the Word of the Year!
Imperfection
Six months into the public release of conversational AI and chatbots, we are still grappling with how to describe their outputs.
People say ChatGPT told them this, thinks that, or is guessing.
However, terms like "said," "told," "thinks," and "guessing" describe human cognitive functions. By using them, we inadvertently attribute more sentience to these systems than is warranted.
AI responses are often incorrect.
Since AI lacks true sentience, it's not accurate to say it's guessing or thinking. We need terminology that reflects the machine's functionality, not human characteristics.
Enter the word 'danote.'
'Danote' encapsulates the computational operation where AI parses a question and provides an answer. The word combines "data" and "connoting."
"Chat GPT danoted three stocks related to lithium."
"Chat GPT danoted the history of the Rohingya incorrectly."
"Chat GPT danoted options for Saudi PIF spending quite competently."
By using terms that differentiate between computer and human functions, we maintain a useful distinction, as the examples above show.
Anyway, Danote.
Word of the Year.
Max Gadney is a design director and product development consultant.
Get in touch.