Technical Glossary: Generative AI and LLM
Citation Grounding
An approach that improves trust by explicitly showing the source passages supporting the generated answer.
By surfacing the supporting passages alongside the answer, citation grounding lets users verify each claim and trace errors back to their sources. It is especially important in enterprise, academic, and regulation-sensitive use cases.
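The idea can be sketched in a few lines: if a generated answer carries inline markers that refer to retrieved source passages, those markers can be resolved back to the passages for display and verification. This is a minimal illustration, not a specific product's implementation; the `[id]` marker convention, source IDs, and passages below are all made up for the example.

```python
import re

# Illustrative retrieved passages, keyed by a source ID.
SOURCES = {
    "doc1": "The Eiffel Tower was completed in 1889.",
    "doc2": "The Eiffel Tower is 330 metres tall.",
}

# A grounded answer embeds inline markers like [doc1] pointing at its sources.
answer = "The Eiffel Tower was finished in 1889 [doc1] and stands 330 m tall [doc2]."

def extract_citations(answer: str, sources: dict) -> list:
    """Return (source_id, passage) pairs cited in the answer, in order of appearance."""
    cited_ids = re.findall(r"\[(\w+)\]", answer)
    return [(cid, sources[cid]) for cid in cited_ids if cid in sources]

# Display each cited passage next to its marker so a reader can check the claim.
for cid, passage in extract_citations(answer, SOURCES):
    print(f"[{cid}] -> {passage}")
```

A real system would attach the markers during generation (for example, by prompting the model to cite the retrieved chunks it used) and would also handle citations that point at nothing, which is exactly the failure mode this pattern helps surface.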