# Cross-Entropy Loss

> Source: https://sukruyusufkaya.com/en/glossary/capraz-entropi-kaybi
> Updated: 2026-05-13T21:09:02.203Z
> Type: glossary
> Category: matematik-istatistik-optimizasyon

**TLDR:** A core classification loss that measures the mismatch between the true distribution and the model’s predicted probability distribution.

<p>Cross-entropy loss is one of the most widely used loss functions in classification problems. Its core idea is to measure how much confidence the model assigns to the correct class and how much probability mass it spreads across incorrect classes. If the model gives a low probability to the true class, the loss increases. This encourages the system not only to predict the correct label, but also to become confidently correct. It is indispensable in logistic regression, neural networks, and multiclass classification.</p>
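<p>For a single sample with true class <em>y</em> and predicted probabilities <em>p</em>, cross-entropy reduces to the negative log probability of the true class: <em>L</em> = −log(<em>p<sub>y</sub></em>). A minimal sketch (the function name and the example distributions are illustrative, not from a specific library):</p>

```python
import math

def cross_entropy(probs, true_class, eps=1e-12):
    """Cross-entropy loss for one sample: -log(p of the true class).

    probs: predicted probability distribution over the classes (sums to 1).
    true_class: index of the correct class.
    eps guards against log(0) for a zero-probability prediction.
    """
    return -math.log(max(probs[true_class], eps))

# Confident, correct prediction -> small loss
print(round(cross_entropy([0.05, 0.90, 0.05], 1), 4))  # 0.1054
# Low probability on the true class -> large loss
print(round(cross_entropy([0.70, 0.20, 0.10], 1), 4))  # 1.6094
```

<p>The two calls illustrate the behavior described above: the loss is near zero when the model is confidently correct, and grows without bound as the probability assigned to the true class approaches zero.</p>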