Hallucination in Generative AI?
Arpit Agrawal
Seasoned BackEnd Java Engineer | Research-Oriented Tech Enthusiast | Cloud Specialist
What is Hallucination?
Hallucination in generative AI refers to the phenomenon where a model generates information that is not grounded in its input data or real-world knowledge. This can result in outputs that are factually incorrect, misleading, or entirely fabricated.
Hallucination can occur in many types of generative models, including language models, image generation models, and others.
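To make "not grounded in the input data" concrete, here is a minimal sketch of a naive grounding check: it flags generated text whose content words rarely appear in the source it was supposed to summarize. This is an illustration only, not a production factuality detector (real systems use entailment models or retrieval-based verification); all names here are made up for the example.

```python
import re

STOPWORDS = {"the", "a", "an", "is", "are", "was", "were", "in",
             "on", "of", "to", "and", "or", "it", "that", "this"}

def content_words(text: str) -> set[str]:
    """Lowercase word tokens minus common stopwords."""
    return {w for w in re.findall(r"[a-z']+", text.lower())
            if w not in STOPWORDS}

def grounding_score(source: str, generated: str) -> float:
    """Fraction of content words in the generated text that also
    appear in the source. A low score hints that the output may be
    hallucinated, i.e. not supported by the input."""
    src, gen = content_words(source), content_words(generated)
    if not gen:
        return 1.0
    return len(gen & src) / len(gen)

source = "The Eiffel Tower is in Paris and was completed in 1889."
faithful = "The Eiffel Tower is located in Paris."
fabricated = "The Eiffel Tower was moved to London in 1975."

print(grounding_score(source, faithful))    # high overlap with the source
print(grounding_score(source, fabricated))  # lower overlap: "moved", "london" are unsupported
```

The faithful sentence scores higher than the fabricated one because almost all of its content words are supported by the source, while the fabricated claim introduces words the source never mentions.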
Types:
What are the causes of Hallucination?
How to Mitigate Hallucination?
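One widely used mitigation is retrieval-augmented generation (RAG): retrieve relevant documents, put them in the prompt, and instruct the model to answer only from that context (and to say "I don't know" otherwise). Below is a hedged sketch; the keyword-overlap retriever and the prompt wording are toy assumptions for illustration (real systems use embedding search and a call to an actual LLM, which is omitted here).

```python
def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    """Rank documents by naive word overlap with the query and return
    the top-k. A stand-in for real vector/embedding retrieval."""
    query_words = set(query.lower().split())
    ranked = sorted(documents,
                    key=lambda d: len(query_words & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_grounded_prompt(query: str, documents: list[str]) -> str:
    """Assemble a prompt that constrains the model to the retrieved
    context, reducing the chance of fabricated answers."""
    context = "\n".join(retrieve(query, documents))
    return ("Answer using ONLY the context below. If the answer is not "
            "in the context, say \"I don't know.\"\n\n"
            f"Context:\n{context}\n\nQuestion: {query}")

docs = [
    "The Great Wall of China is over 21,000 km long.",
    "Python was created by Guido van Rossum in 1991.",
]
prompt = build_grounded_prompt("Who created Python?", docs)
print(prompt)
```

The prompt that results contains the relevant document plus an explicit refusal instruction; the actual model call would then be made with this prompt instead of the bare question.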