Hallucination
The phenomenon in which a model generates fluent but unsupported or incorrect content.
Hallucination is one of the most critical reliability issues in generative AI systems: a model's output can be linguistically convincing while failing on source grounding, factual accuracy, or computation. This poses direct risk, especially in enterprise, medical, legal, and financial applications.
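One practical angle on source grounding is to check whether each sentence of a generated answer is actually supported by the retrieved source passages. Below is a minimal sketch of such a check using lexical overlap as a crude support signal; the function names, the threshold, and the overlap heuristic are all illustrative assumptions, not a standard API. Production systems typically replace the overlap score with an entailment model or citation verification.

```python
# Naive groundedness check: flag answer sentences whose word overlap
# with every source passage is low. Illustrative sketch only; real
# hallucination detectors use entailment models, not lexical overlap.

import re


def token_set(text: str) -> set[str]:
    """Lowercase word tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))


def flag_unsupported(answer: str, sources: list[str],
                     threshold: float = 0.5) -> list[str]:
    """Return answer sentences whose best overlap ratio against any
    single source falls below the threshold -- candidates for review."""
    source_tokens = [token_set(s) for s in sources]
    flagged = []
    for sentence in re.split(r"(?<=[.!?])\s+", answer.strip()):
        words = token_set(sentence)
        if not words:
            continue
        # Fraction of the sentence's words found in the best-matching source.
        support = max((len(words & st) / len(words) for st in source_tokens),
                      default=0.0)
        if support < threshold:
            flagged.append(sentence)
    return flagged


if __name__ == "__main__":
    sources = ["The company was founded in 2015 in Berlin."]
    answer = ("The company was founded in 2015 in Berlin. "
              "Its revenue tripled in 2023.")
    # Flags the revenue claim, which has no support in the source.
    print(flag_unsupported(answer, sources))
```

The key design choice in any such check is the support signal: lexical overlap is cheap but misses paraphrases, so sentences it flags should be treated as candidates for review rather than confirmed hallucinations.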
