
Overparameterization

The condition in which a model has far more trainable parameters than there are training examples, giving it enough capacity to fit the training data exactly.

From a classical statistical perspective, overparameterization looks like a recipe for overfitting: a model with more parameters than data points can memorize its training set outright. Yet it sits at the heart of modern deep learning. Very large networks offer not only memorization capacity but also better optimization pathways (their loss landscapes tend to contain many good minima that gradient descent can reach) and more flexible representation spaces. In practice, larger networks often generalize better rather than worse, so the relationship between model size and generalization is more complex than classical theory predicts. Explaining why overparameterized networks generalize well remains one of the central open questions in deep learning theory.
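To make the definition concrete, here is a minimal sketch (assuming PyTorch; the layer widths and the 100-example dataset are illustrative choices, not taken from this entry) that counts the trainable parameters of a tiny two-layer network against the size of its training set:

import torch.nn as nn

# Hypothetical toy setting: 100 training examples with 10 features each.
n_samples, n_features = 100, 10

# A small MLP whose parameter count already dwarfs the dataset.
model = nn.Sequential(
    nn.Linear(n_features, 1024),  # 10*1024 weights + 1024 biases = 11264
    nn.ReLU(),
    nn.Linear(1024, 1),           # 1024 weights + 1 bias = 1025
)

n_params = sum(p.numel() for p in model.parameters())
print(n_params)              # 12289 trainable parameters
print(n_params / n_samples)  # ~123 parameters per training example

Even this modest network is overparameterized by two orders of magnitude, yet networks like it routinely fit such datasets perfectly and still generalize.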