
Leaky ReLU

An activation function that keeps a small nonzero slope for negative inputs to alleviate the dying ReLU problem.
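Formally, the function can be written as follows, where α is a small positive constant (a common choice is 0.01, though the exact value is a hyperparameter rather than part of the definition):

f(x) = \begin{cases} x, & x \ge 0 \\ \alpha x, & x < 0 \end{cases}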

Leaky ReLU was developed to counter the loss of learning signal caused by standard ReLU, which outputs zero, and therefore passes zero gradient, for all negative inputs. By preserving a small slope in the negative region, it keeps gradients nonzero there and can prevent neurons from becoming permanently inactive. It may also provide more stable training behavior, especially under fragile optimization conditions or in smaller-data scenarios.
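A minimal sketch in NumPy, assuming the common default slope alpha = 0.01 (the function names here are illustrative, not taken from any particular library):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Identity for non-negative inputs; scaled by the small
    # slope alpha for negative inputs, so the output is never
    # flattened to zero in the negative region.
    return np.where(x >= 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    # Gradient is 1 for non-negative inputs and alpha (not 0)
    # for negative inputs -- the property that mitigates the
    # dying ReLU problem. At x = 0 the gradient is taken as 1
    # by convention.
    return np.where(x >= 0, 1.0, alpha)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x))       # [-0.02  -0.005  0.     1.5  ]
print(leaky_relu_grad(x))  # [ 0.01   0.01   1.     1.  ]
```

The gradient helper makes the contrast with standard ReLU explicit: where ReLU's gradient would be exactly zero for negative inputs, Leaky ReLU's is alpha, so a neuron that drifts into the negative region can still receive updates and recover.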