Technical Glossary: Mathematics, Statistics and Optimization
KL Divergence
A directional divergence measure that quantifies how one probability distribution differs from another.
KL Divergence is one of the core tools of information theory for measuring the difference between two probability distributions. For discrete distributions P and Q, it is defined as D_KL(P || Q) = Σ P(x) log(P(x) / Q(x)), the expected log-ratio of the two distributions under P. It is not symmetric: D_KL(P || Q) generally differs from D_KL(Q || P), so the divergence from one distribution to another is not the same in reverse. It plays a major role in variational methods, language models, distribution matching, and information loss analysis, and it is especially useful for evaluating how close a model distribution is to the true one. KL Divergence is one of the most principled answers probabilistic modeling gives to the question: “How wrong are you?”
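A minimal sketch of the definition above for discrete distributions, using NumPy; the function name `kl_divergence` is illustrative, not from any particular library. It applies the standard convention that terms with P(x) = 0 contribute zero, and uses the natural log (so the result is in nats):

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(P || Q) for discrete distributions given as arrays."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Sum only over outcomes where p > 0 (convention: 0 * log(0) = 0).
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.5]
q = [0.9, 0.1]
# The asymmetry is visible directly: swapping the arguments changes the value.
print(kl_divergence(p, q))
print(kl_divergence(q, p))
```

Running this shows two different numbers for the two argument orders, which is exactly the asymmetry described above.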
