# Jensen-Shannon Divergence

> Source: https://sukruyusufkaya.com/en/glossary/jensen-shannon-divergence
> Updated: 2026-05-13T21:05:36.518Z
> Type: glossary
> Category: matematik-istatistik-optimizasyon

**TLDR:** A symmetric, bounded information-theoretic measure of how different two probability distributions are, built as a smoothed variant of KL Divergence.

<p>Jensen-Shannon Divergence can be viewed as a balanced, symmetric variant of KL Divergence. Instead of measuring divergence in only one direction, it compares each distribution against their average mixture, so both directions are considered together and the result is always finite and symmetric. This makes it useful in distribution comparison, generative modeling, and similarity analysis, and it strikes a good balance between theoretical meaning and practical behavior. In information theory, it gives a softer, more interpretable answer to the question: “How different are these two distributions?”</p>
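
The standard definition is JSD(P, Q) = ½ KL(P ‖ M) + ½ KL(Q ‖ M), where M = (P + Q)/2 is the mixture of the two distributions. A minimal sketch of this definition in numpy (the function names and the choice of natural-log units are illustrative, not from the source):

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(p || q) in nats; assumes p and q are valid
    probability vectors of the same length."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute nothing to the sum
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jensen_shannon_divergence(p, q):
    """JSD(p, q) = 1/2 KL(p || m) + 1/2 KL(q || m) with m = (p + q) / 2.
    Symmetric in p and q, and bounded by ln(2) in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
```

Because each distribution is compared against the mixture M (which is nonzero wherever either input is), the divergence stays finite even when the supports differ, unlike plain KL Divergence. With the natural logarithm the value ranges from 0 (identical distributions) to ln 2 (disjoint supports).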