
Dropout

A regularization technique that reduces overfitting by temporarily disabling some neurons during training.

Dropout encourages the network to learn more generalizable representations by preventing excessive co-adaptation of specific neurons. During training, each neuron's output is zeroed with some probability p, so every forward pass effectively trains a different randomly sampled subnetwork; at inference, dropout is disabled and activations are left intact (with the common "inverted dropout" formulation, surviving activations are scaled by 1/(1-p) during training so no adjustment is needed at test time). This yields a strong regularization effect, especially in large fully connected layers. The dropout rate and its placement should still be chosen carefully, since modern architectures with heavy normalization or limited capacity may benefit from lower rates or none at all.
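As a minimal sketch of the mechanism described above, the following NumPy function implements inverted dropout (the function name and signature are illustrative, not from any particular library):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each activation with
    probability p and scale survivors by 1/(1-p) so the expected value
    of each activation is unchanged; at inference, pass x through."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)
```

Because survivors are rescaled by 1/(1-p), the layer's expected output matches its inference-time output, which is why no extra scaling step is needed when dropout is switched off.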