# ELU

> Source: https://sukruyusufkaya.com/en/glossary/elu
> Updated: 2026-05-13T21:07:01.055Z
> Type: glossary
> Category: deep-learning

**TLDR:** An activation function that replaces ReLU's hard zero with a smooth exponential curve in the negative region, encouraging a more balanced, closer-to-zero-mean activation distribution.

ELU (Exponential Linear Unit) replaces the hard zeroing that ReLU applies to negative inputs with a smooth exponential curve that saturates at −α. Because negative inputs still produce bounded negative outputs, the activation distribution tends to be more balanced and mean activations sit closer to zero, which can reduce bias shift and ease learning. The exponential makes ELU slightly more expensive to compute than ReLU, but in some architectures it can offer more stable optimization.
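
The standard definition, with α > 0 (commonly 1):

$$
\mathrm{ELU}(x) =
\begin{cases}
x, & x > 0 \\
\alpha \, (e^{x} - 1), & x \le 0
\end{cases}
$$

As a minimal NumPy sketch (the function name and the `alpha` default are illustrative, not tied to any framework's API):

```python
import numpy as np

def elu(x, alpha=1.0):
    # Identity for positive inputs; for negative inputs, a smooth
    # exponential that saturates at -alpha as x goes to -infinity.
    x = np.asarray(x, dtype=float)
    # expm1 computes exp(x) - 1 accurately near zero; clamping the
    # argument at 0 keeps the unused branch of `where` from overflowing.
    return np.where(x > 0, x, alpha * np.expm1(np.minimum(x, 0)))

print(elu([-3.0, -0.5, 0.0, 2.0]))
# -> approximately [-0.95, -0.39, 0.0, 2.0]
```

Note how the negative inputs are compressed toward −α rather than clipped to exactly zero, which is what keeps the mean of the activations closer to zero than with ReLU.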