Rethinking AI Terminology: Is It 'Hallucination' or 'Delusion', or Neither?
Recently, during a conversation with a psychiatrist (Hi, Mom!), I shared some captivating concepts from the AI world, noting how many of them are inspired by our own human psychology. Then, I playfully suggested, "Maybe AI will evolve to mirror humans so closely that it could even develop psychological disorders!" This led to explaining the idea of 'AI hallucination'. After pondering for a moment, she responded thoughtfully, "But doesn't that sound more like a delusion than a hallucination?" Her comment sparked my curiosity, leading me to explore this topic in more depth. I'm eager to share my findings and insights with you, and I would love to hear your thoughts on this intriguing subject.
What is AI Hallucination?
AI hallucination refers to a phenomenon where an artificial intelligence system, such as a language model, generates output that is incorrect, nonsensical, or unrelated to the input it received. [1] For instance, if I were to ask what a "Flufflepuff" is and the AI model answered, "Flufflepuff is a small furry animal with big ears," that would be a hallucination. A proper response would instead clarify that no such animal exists.
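To make the contrast concrete, here is a minimal toy sketch in Python (the function names and the tiny knowledge base are my own illustrative inventions, not a real AI system): a hallucinating responder fabricates a confident answer for any query, while a grounded responder checks its knowledge and admits when it has none.

```python
# Toy knowledge base standing in for what a model actually "knows".
KNOWN_ENTITIES = {
    "cat": "a small domesticated feline",
    "dog": "a domesticated canine",
}

def hallucinating_answer(query: str) -> str:
    # A hallucinating system invents a confident description
    # regardless of whether the entity exists in its knowledge.
    return f"{query.capitalize()} is a small furry animal with big ears."

def grounded_answer(query: str) -> str:
    # A grounded system consults its knowledge and admits uncertainty
    # instead of fabricating an answer.
    description = KNOWN_ENTITIES.get(query.lower())
    if description is None:
        return f"I have no record of a '{query}'; it may not exist."
    return f"{query.capitalize()} is {description}."

print(hallucinating_answer("Flufflepuff"))  # confident but fabricated
print(grounded_answer("Flufflepuff"))       # admits the gap
```

The point of the sketch is only the behavioral difference: the first function produces fluent, plausible-sounding output untethered from any knowledge source, which is exactly what makes AI hallucinations hard to spot.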
Hallucination and Delusion in Psychology
In daily life, our sensory organs are triggered by external stimuli; after complex processing of those stimuli, an output is produced. Here is a quick diagram:
Hallucinations, by contrast, are experiences where someone sees, hears, smells, tastes, or feels something that isn't actually there. The DSM-5 defines a hallucination as "A perception-like experience with the clarity and impact of a true perception but without the external stimulation of the relevant sensory organ."[2] Examples include a person who sees a non-existent cat, or one who hears voices telling them to do things.
Delusion, in the field of psychology, refers to a belief that is not supported by evidence and is often considered irrational or unrealistic.[2] Imagine someone who believes they are the secret CEO of the biggest company in the world; this belief changes the way they behave and think. Notably, a delusion doesn't necessarily involve sensory experiences, as it primarily operates within the realm of cognitive processes.
So, Which One?
My thoughts on "AI Hallucination":
My thoughts on "AI Delusion":
Final Remarks: Navigating the Unknown with Careful Analogies
As humans, we are doing the same thing that we have been doing for centuries; creating analogies for the "unknown" derived from our own experiences. This practice is more than just a habit; it's a fundamental approach that helps us understand and interact with the world around us. However, it's crucial to recognize that this method has its limitations.
Limited Scope of Analogies: While they provide a familiar framework for understanding, analogies may fall short in capturing the true nature of what they aim to represent. This limitation is particularly evident when dealing with complex, non-human entities like AI systems.
The Pitfalls of Derived Terminologies: Analogies often give birth to new terminologies. These terms, while rooted in our effort to understand, can inadvertently lead to misconceptions. They might encapsulate only a fragment of the entity's actual experience or functionality, leading us astray.
In summary, as we continue to explore and define the realm of AI, we must be mindful of the analogies we employ. Our challenge is to use these tools wisely, always being open to the possibility that the true nature of AI may be beyond the scope of our current analogies and terms. Therefore, I believe that introducing a new term for this situation could be beneficial. It would help us avoid limiting our understanding and interpretation.
What do you think? I am eager to hear your perspectives on this topic, which was the whole purpose of this LinkedIn article. Feel free to drop a comment below or send me a direct message. Let's keep the conversation going!
Thank you!
Footnote
[1] IBM. (n.d.). What are AI hallucinations? https://www.ibm.com/topics/ai-hallucinations
[2] American Psychiatric Association. (2013). Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5).