Rethinking AI Terminology: Is It 'Hallucination' or 'Delusion', or Neither?

Recently, during a conversation with a psychiatrist (Hi, Mom!), I shared some captivating concepts from the AI world, noting how many of them are inspired by our own human psychology. Then I playfully suggested, “Maybe AI will evolve to mirror humans so closely that it could even develop psychological disorders!” This led to my explaining the idea of ‘AI hallucination’. After pondering for a moment, she responded thoughtfully, “But doesn’t that sound more like a delusion than a hallucination?” Her comment sparked my curiosity, leading me to explore the topic in more depth. I’m eager to share my findings and insights with you, and I would love to hear your thoughts on this intriguing subject.


What is AI Hallucination?

AI hallucination refers to a phenomenon where an artificial intelligence system, such as a language model, generates output that is incorrect, nonsensical, or unrelated to the input it received. [1] For instance, if I asked what a “Flufflepuff” is and the model answered, “A Flufflepuff is a small furry animal with big ears,” that would be a hallucination. A proper response would instead clarify that no such animal exists.
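To make the contrast concrete, here is a minimal Python sketch. This is not how language models actually work internally; the toy knowledge base and both function names are invented purely for illustration.

```python
# Toy illustration of the "Flufflepuff" example above. Real language
# models do not consult a dictionary like this; the knowledge base and
# both functions are hypothetical, made up to contrast a grounded
# answer with a hallucinated one.

KNOWN_ANIMALS = {
    "capybara": "the largest living rodent, native to South America",
}

def grounded_answer(query: str) -> str:
    """Answer only from known facts; admit ignorance otherwise."""
    fact = KNOWN_ANIMALS.get(query.lower())
    if fact is None:
        return f"I have no reliable information about a '{query}'; it may not exist."
    return f"A {query} is {fact}."

def hallucinated_answer(query: str) -> str:
    """Fabricate a fluent but unsupported answer: the failure mode above."""
    return f"A {query} is a small furry animal with big ears."

print(grounded_answer("Flufflepuff"))      # admits the gap
print(hallucinated_answer("Flufflepuff"))  # confident, plausible, wrong
```

The point of the sketch is only that a hallucinated answer is fluent and plausible while having no grounding in anything the system actually “knows”.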


Hallucination and Delusion in Psychology

In our daily lives, our sensory organs are triggered by external stimuli, and after a great deal of complex processing of those stimuli, an output is produced. Here is a quick diagram:

A basic diagram of how sensory processing works



Hallucinations, however, are experiences in which someone sees, hears, smells, tastes, or feels something that isn’t actually there. The DSM-5 defines a hallucination as “A perception-like experience with the clarity and impact of a true perception but without the external stimulation of the relevant sensory organ.” [2] Examples include a person who sees a cat that doesn’t exist, or who hears voices telling them to do things.


A basic diagram of what a hallucination looks like



Delusion, in the field of psychology, refers to a belief that is not supported by evidence and is often considered irrational or unrealistic. [2] Imagine someone who believes they are the secret CEO of the biggest company in the world; this belief changes the way they behave and think. Notably, a delusion doesn’t necessarily involve any sensory experience, as it operates primarily within the realm of cognitive processes.
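For readers who think in code, the distinction between the two definitions can be caricatured in a few lines of Python. Everything here is a deliberately oversimplified toy with invented names (`Mind`, `hallucinate`, `adopt_delusion`): a hallucination is perception-like output with no external stimulus, whereas a delusion is a belief that persists even when contradicting evidence arrives.

```python
# A toy caricature of the psychological definitions above, not a model
# of any real cognitive process. All names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Mind:
    beliefs: set = field(default_factory=set)

    def perceive(self, stimulus: str) -> str:
        """Ordinary perception: the output tracks an external stimulus."""
        return stimulus

    def hallucinate(self) -> str:
        """Perception-like output with no external stimulus (sensory level)."""
        return "a cat that is not there"

    def adopt_delusion(self, belief: str) -> None:
        """A belief held without supporting evidence (cognitive level)."""
        self.beliefs.add(belief)

    def update_with_evidence(self, evidence_against: str) -> None:
        """Delusions resist correction: contradicting evidence is ignored."""
        pass  # self.beliefs survives unchanged

mind = Mind()
print(mind.perceive("a real cat"))   # normal: stimulus in, percept out
print(mind.hallucinate())            # percept out, no stimulus in

mind.adopt_delusion("I am the secret CEO of the biggest company")
mind.update_with_evidence("payroll records show otherwise")
print(mind.beliefs)                  # the belief persists despite evidence
```

The caricature also previews a point made below: a delusion’s resistance to correction is part of its definition, which matters when we ask whether AI errors behave the same way.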




So, Which One?

My thoughts on "AI Hallucination":

  • Hallucinations in humans involve sensory experiences. This raises the question: do AI models undergo 'sensory experiences'? Is it accurate to say they 'see' incorrect patterns, akin to human sensory hallucinations?
  • Hallucination implies a mismatch between input and perceived output: a nonexistent house, for instance, is seen as existing. Similarly, when AI produces outputs disconnected from its inputs, this parallels how human hallucinations involve perceiving something absent, which bolsters the use of “hallucination” in this context.
  • AI systems occasionally generate outputs that lack factual basis or input correlation. These instances resemble human hallucinations: sensory experiences without external stimuli.

My thoughts on "AI Delusion":

  • Delusions involve errors in cognitive processing. This leads us to question whether AI systems engage in cognitive processes similar to human belief systems. Is labeling AI behaviors as "delusions" appropriate in this light?
  • Delusions typically display a persistent nature and resistance to correction, traits not commonly seen in current AI models. This difference may challenge the applicability of "delusion" in the AI context.
  • Delusions cover a broader spectrum of errors than hallucinations, extending beyond sensory misperceptions to include judgment and interpretation errors. Given that AI systems not only perceive but also process information and make decisions, "delusion" might be a more encompassing term.



Final Remarks: Navigating the Unknown with Careful Analogies

As humans, we are doing the same thing we have been doing for centuries: creating analogies for the “unknown,” derived from our own experiences. This practice is more than just a habit; it’s a fundamental approach that helps us understand and interact with the world around us. However, it’s crucial to recognize that this method has its limitations.

Limited Scope of Analogies: While they provide a familiar framework for understanding, analogies may fall short in capturing the true nature of what they aim to represent. This limitation is particularly evident when dealing with complex, non-human entities like AI systems.

The Pitfalls of Derived Terminologies: Analogies often give birth to new terminologies. These terms, while rooted in our effort to understand, can inadvertently lead to misconceptions. They might encapsulate only a fragment of the entity's actual experience or functionality, leading us astray.

In summary, as we continue to explore and define the realm of AI, we must be mindful of the analogies we employ. Our challenge is to use these tools wisely, always being open to the possibility that the true nature of AI may be beyond the scope of our current analogies and terms. Therefore, I believe that introducing a new term for this situation could be beneficial. It would help us avoid limiting our understanding and interpretation.


What do you think? I am eager to hear your perspectives on this topic, which was the whole purpose of this LinkedIn article. Feel free to drop a comment below or send me a direct message. Let’s keep the conversation going!


Thank you!


Footnotes

[1] What are AI hallucinations? IBM. (n.d.). https://www.ibm.com/topics/ai-hallucinations

[2] American Psychiatric Association. (2013). Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5).
