# Batch Normalization

> Source: https://sukruyusufkaya.com/en/glossary/batch-normalization
> Updated: 2026-05-13T19:58:31.035Z
> Type: glossary
> Category: derin-ogrenme
**TLDR:** A technique that normalizes intermediate activations at the mini-batch level to accelerate training and provide partial regularization.

<p>Batch normalization reduces distribution shift in the intermediate layers of deep networks and makes optimization more stable. It normalizes each feature's activations to zero mean and unit variance across the mini-batch, then applies a learnable scale and shift, which often allows training with higher learning rates. Because the statistics are estimated from a noisy mini-batch, it also introduces a mild regularization effect. It has played an important role in many architectures, from CNNs to pre-Transformer models.</p>
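
The normalization step described above can be sketched in NumPy. This is a minimal training-time forward pass only (no running statistics for inference, no backward pass); the function name and shapes are illustrative, not from the source:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the mini-batch, then scale and shift.

    x: array of shape (batch, features)
    gamma, beta: learnable per-feature scale and shift, shape (features,)
    """
    mean = x.mean(axis=0)          # per-feature mean over the batch
    var = x.var(axis=0)            # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta    # restore representational capacity

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 4))  # activations far from 0/1
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0))  # each feature now has mean ~0
print(y.std(axis=0))   # and standard deviation ~1
```

With `gamma=1` and `beta=0` the output is purely standardized; during training these parameters are learned, so the network can undo the normalization where that helps.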