Entropy

A fundamental information-theoretic concept that measures uncertainty, disorder, or information content in a probability distribution.

Entropy is one of the most fundamental concepts in information theory: it measures how much uncertainty is present in a system. The more balanced and unpredictable the outcome probabilities are, the higher the entropy; conversely, when one outcome dominates, entropy is low. It is used in areas ranging from decision trees and language models to data compression and uncertainty estimation. Entropy is not just a technical computation; it is the numerical answer to the question: “How much do we not know about this system?”
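
Formally, the Shannon entropy of a discrete random variable X with outcome probabilities p(x) is H(X) = −Σₓ p(x) log₂ p(x), measured in bits when the logarithm is base 2. The sketch below is a minimal Python illustration of this formula (the function name shannon_entropy is ours, not from the original text); it shows how balanced and dominated distributions produce high and low entropy, respectively.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy of a discrete distribution given as probabilities."""
    # Terms with p == 0 contribute nothing (the limit of p*log p as p -> 0 is 0).
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain: entropy = 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin is far more predictable: entropy ≈ 0.47 bits.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0, 0.0]))   # 0.0
```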