Perplexity

A measure, especially common in language models, that summarizes how surprised a probability model is by the data.

Perplexity is a core evaluation metric in language modeling that measures how well a probability model predicts held-out data. It is the exponential of the cross-entropy between the model and the data, so it can be read as an effective branching factor: roughly how many equally likely choices the model is hedging among when predicting the next token. Lower perplexity indicates more confident and more accurate predictions. Because it is grounded in information theory and directly tied to cross-entropy, it is easy to compute and compare across models, though it does not fully capture output quality on its own.
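The relationship described above can be made concrete with a minimal sketch (not any particular library's API): perplexity computed as the exponential of the average negative log-probability the model assigned to each observed token.

```python
import math

def perplexity(token_probs):
    """Perplexity of a sequence, given the probability the model
    assigned to each observed token: exp(cross-entropy), i.e. the
    exponential of the average negative log-probability."""
    n = len(token_probs)
    avg_neg_log_prob = -sum(math.log(p) for p in token_probs) / n
    return math.exp(avg_neg_log_prob)

# A model that is uniformly uncertain over 4 choices at every step
# has perplexity 4 -- the "effective branching factor" reading.
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # → 4.0
```

A more confident model, e.g. one assigning probability 0.9 to each correct token, yields a perplexity near 1.1, illustrating why lower perplexity corresponds to better predictions.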