Uncertainty Calibration

A quality-assurance approach that brings a model's expressed confidence into line with its actual correctness.

Uncertainty calibration aims to close the gap between how confident a model appears and how often it is actually correct. Overconfident wrong answers quickly erode user trust, which makes calibration an important safety component in decision-support and knowledge-generation systems.
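
A common way to quantify this gap is the expected calibration error (ECE): predictions are grouped into confidence bins, and the average difference between each bin's stated confidence and its actual accuracy is computed, weighted by bin size. The sketch below illustrates the idea; the function name and the toy data are hypothetical, and NumPy is assumed.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE: the bin-weighted average gap between mean stated
    confidence and observed accuracy, over equal-width bins."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue
        bin_conf = confidences[mask].mean()  # average stated confidence
        bin_acc = correct[mask].mean()       # actual accuracy in this bin
        ece += mask.mean() * abs(bin_conf - bin_acc)
    return ece

# Toy illustration: a model that claims 90% confidence but is
# only 60% correct is poorly calibrated; ECE quantifies that gap.
conf = np.array([0.9, 0.9, 0.9, 0.9, 0.9])
hits = np.array([1, 1, 1, 0, 0])
print(f"ECE = {expected_calibration_error(conf, hits):.2f}")  # ~0.30
```

An ECE of 0 means the model's confidence matches its accuracy in every bin; larger values indicate systematic over- or underconfidence.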