
Regularization

A set of techniques used to reduce overfitting and improve a model’s ability to generalize.

Regularization is the general name for a family of techniques that help a model perform well not only on its training data but also on new, unseen data. The goal is to keep the model from becoming overly complex and brittle. Methods such as L1 and L2 penalties, dropout, and early stopping can all be understood within this framework. In a sense, regularization tells the model: “do not memorize every detail; focus on the more meaningful patterns.” That is why it is a core tool for anyone building systems that need strong generalization.
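To make the idea concrete, here is a minimal sketch of one of the methods mentioned above, an L2 penalty (ridge regression), fit by gradient descent. The data, names, and hyperparameters are all hypothetical and chosen only for illustration: the penalty term `lam * w` added to the gradient continuously pulls the weights toward zero, trading a little training-set fit for smaller, more stable weights.

```python
import numpy as np

# Hypothetical toy data: 100 samples, 5 features, only two of which matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
true_w = np.array([1.0, 0.0, 0.0, 2.0, 0.0])
y = X @ true_w + rng.normal(scale=0.1, size=100)

def fit(X, y, lam=0.0, lr=0.1, steps=500):
    """Linear regression by gradient descent; lam controls the L2 penalty."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        # Gradient of mean squared error, plus the L2 term lam * w,
        # which shrinks every weight toward zero at each step.
        grad = X.T @ (X @ w - y) / len(y) + lam * w
        w -= lr * grad
    return w

w_plain = fit(X, y, lam=0.0)  # no regularization
w_ridge = fit(X, y, lam=1.0)  # L2-regularized: smaller weight norm
print(np.linalg.norm(w_plain), np.linalg.norm(w_ridge))
```

The regularized solution has a smaller weight norm than the unregularized one; with noisier data or more irrelevant features, that shrinkage is what curbs overfitting.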