# Hallucination

> Source: https://sukruyusufkaya.com/en/glossary/hallucination
> Updated: 2026-05-13T20:57:44.749Z
> Type: glossary
> Category: uretken-yapay-zeka-ve-llm

**TLDR:** The phenomenon in which a model generates fluent but unsupported or incorrect content.

<p>Hallucination is one of the most critical reliability issues in generative AI systems. The output can be linguistically convincing while failing on source grounding, factuality, or computation. This creates direct risk, especially in enterprise, medical, legal, and financial applications.</p>
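As a rough illustration of the source-grounding failure mode, the sketch below flags answer sentences with little lexical overlap against a retrieved context. The function names and the 0.5 threshold are illustrative assumptions; real hallucination detectors use entailment models or claim verification, not simple word overlap.

```python
# Minimal sketch of a lexical grounding check: flag answer sentences
# whose content words have little overlap with the source context.
# Illustrative heuristic only, not a production hallucination detector.

import re

def content_words(text: str) -> set[str]:
    """Lowercased alphanumeric tokens longer than 3 chars (crude content-word filter)."""
    return {t for t in re.findall(r"[a-z0-9]+", text.lower()) if len(t) > 3}

def ungrounded_sentences(answer: str, context: str, threshold: float = 0.5) -> list[str]:
    """Return answer sentences whose content-word overlap with the context
    falls below `threshold` -- candidates for hallucinated claims."""
    ctx = content_words(context)
    flagged = []
    for sent in re.split(r"(?<=[.!?])\s+", answer.strip()):
        words = content_words(sent)
        if not words:
            continue
        overlap = len(words & ctx) / len(words)
        if overlap < threshold:
            flagged.append(sent)
    return flagged

context = "The contract was signed in 2021 and covers software maintenance only."
answer = ("The contract was signed in 2021 and covers software maintenance. "
          "It also guarantees unlimited hardware replacement worldwide.")
print(ungrounded_sentences(answer, context))
# → ['It also guarantees unlimited hardware replacement worldwide.']
```

The second sentence is fluent and plausible but has no support in the source, which is exactly the pattern the glossary entry describes.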