# Leaky ReLU

> Source: https://sukruyusufkaya.com/en/glossary/leaky-relu
> Updated: 2026-05-13T21:01:01.204Z
> Type: glossary
> Category: derin-ogrenme
**TLDR:** An activation function that leaves a small nonzero slope in the negative region to alleviate the dying ReLU problem.

Leaky ReLU was developed to reduce the loss of learning capacity caused by standard ReLU zeroing out everything in the negative region. Instead of outputting zero for negative inputs, it outputs the input scaled by a small slope, so affected neurons still receive a nonzero gradient. This can prevent neurons from becoming permanently inactive (the "dying ReLU" problem) and may yield more stable training, especially under fragile optimization conditions or in smaller-data scenarios.
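
A minimal NumPy sketch of the idea follows. The function is `f(x) = x` for `x >= 0` and `f(x) = alpha * x` otherwise; the slope `alpha = 0.01` used here is a common default, not a value specified by this entry:

```python
import numpy as np

def leaky_relu(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    """Identity for x >= 0; small slope alpha for x < 0."""
    return np.where(x >= 0, x, alpha * x)

def leaky_relu_grad(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    """Derivative: 1 for x > 0, alpha for x < 0.
    The negative-side gradient is alpha (nonzero), so gradients
    keep flowing where plain ReLU would return exactly 0."""
    return np.where(x > 0, 1.0, alpha)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(leaky_relu(x))       # [-0.02  -0.005  0.     0.5   2.  ]
print(leaky_relu_grad(x))  # [ 0.01   0.01   0.01   1.    1. ]
```

In practice, frameworks ship this built in; for example, PyTorch provides `torch.nn.LeakyReLU`, whose `negative_slope` parameter plays the role of `alpha` above.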