Jensen-Shannon Divergence

An information-theoretic divergence measure that compares two probability distributions in a symmetric and numerically stable way.

Jensen-Shannon Divergence (JSD) can be viewed as a symmetrized, smoothed variant of KL Divergence. Instead of comparing P directly to Q, it compares each distribution to their mixture M = ½(P + Q) and averages the two KL terms: JSD(P ∥ Q) = ½ KL(P ∥ M) + ½ KL(Q ∥ M). Because both directions enter the formula, the result is symmetric, always finite, and bounded between 0 and log 2 (in nats), which makes it more stable in practice than KL Divergence. These properties make it useful in distribution comparison, generative modeling, and similarity analysis, offering a good balance between theoretical meaning and practical behavior. In information theory, it provides a softer and more interpretable answer to the question: “How different are these two distributions?”
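
As a concrete illustration, here is a minimal NumPy sketch of the definition above. The helper name js_divergence and the small epsilon guard against log(0) are illustrative choices, not part of any standard API.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions (in nats).

    Illustrative sketch: normalizes inputs and uses a small eps to avoid log(0).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalize so both inputs are valid probability distributions.
    p = p / p.sum()
    q = q / q.sum()
    # Mixture distribution M = (P + Q) / 2.
    m = 0.5 * (p + q)
    # KL(P || M) and KL(Q || M), averaged with equal weights.
    kl_pm = np.sum(p * np.log((p + eps) / (m + eps)))
    kl_qm = np.sum(q * np.log((q + eps) / (m + eps)))
    return 0.5 * kl_pm + 0.5 * kl_qm

p = [0.10, 0.40, 0.50]
q = [0.80, 0.15, 0.05]

# Symmetric: both directions give the same value, bounded by log(2) ≈ 0.693.
print(js_divergence(p, q))  # same value as js_divergence(q, p)
print(js_divergence(p, p))  # 0.0 for identical distributions
```

For reference, SciPy ships scipy.spatial.distance.jensenshannon, which returns the square root of this quantity, i.e. the Jensen-Shannon distance, which is a true metric.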