# Weight Decay

> Source: https://sukruyusufkaya.com/en/glossary/weight-decay
> Updated: 2026-05-13T20:58:38.716Z
> Type: glossary
> Category: deep-learning
**TLDR:** A regularization technique that penalizes large weight magnitudes to control model complexity and reduce overfitting.

<p>Weight decay steers the model toward simpler, more generalizable solutions by discouraging unnecessarily large parameter values. In practice it is often equated with L2 regularization, and it is widely used to reduce overfitting in high-capacity deep networks by keeping the weights from becoming overly sensitive to the training data. Its effect is best understood together with the optimization algorithm: under plain SGD, adding an L2 penalty to the loss and shrinking the weights directly are equivalent, but under adaptive optimizers such as Adam they differ, which is why AdamW applies weight decay as a separate, decoupled step.</p>
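The interaction with the optimizer can be made concrete with a minimal sketch (illustrative, not from the source; function names and hyperparameter values are assumptions): for a single vanilla SGD step, adding the L2 gradient term and decaying the weights directly produce the same update.

```python
# One SGD step with weight decay, written two equivalent ways.
# lr and wd values below are arbitrary illustrative hyperparameters.

def sgd_step_l2(w, grad, lr=0.1, wd=0.01):
    # L2 regularization: add wd * w to the loss gradient, then step.
    return w - lr * (grad + wd * w)

def sgd_step_decoupled(w, grad, lr=0.1, wd=0.01):
    # Decoupled weight decay: shrink w multiplicatively, then apply
    # the plain loss gradient (this is the AdamW-style formulation).
    return w * (1 - lr * wd) - lr * grad

w, g = 2.0, 0.5
print(sgd_step_l2(w, g))         # same value for vanilla SGD ...
print(sgd_step_decoupled(w, g))  # ... because the two forms coincide
```

For plain SGD the two forms are algebraically identical; with adaptive optimizers the L2 term gets rescaled by the per-parameter learning rates while the decoupled decay does not, so the two formulations diverge.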