# ReLU Activation

> Source: https://sukruyusufkaya.com/en/glossary/relu-activation
> Updated: 2026-05-13T21:00:54.895Z
> Type: glossary
> Category: derin-ogrenme

**TLDR:** The most common modern activation function: it zeros out negative inputs and passes positive inputs through unchanged.
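
In symbols, with the common convention that the derivative at $x = 0$ is taken to be 0:

$$
\mathrm{ReLU}(x) = \max(0, x),
\qquad
\frac{d}{dx}\,\mathrm{ReLU}(x) =
\begin{cases}
1 & x > 0 \\
0 & x \le 0
\end{cases}
$$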

ReLU is widely regarded as one of the most effective practical discoveries in modern deep learning. It is cheap to compute, its derivative is trivial (0 or 1), and, unlike sigmoid or tanh, it does not saturate for positive inputs, which eases gradient flow in deep networks. It has become the standard choice in CNNs and fully connected architectures. However, the so-called dying ReLU problem, in which a neuron's pre-activation stays negative for every input so its gradient is always zero and it stops learning, must be monitored carefully.
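
A minimal NumPy sketch of the forward pass, its gradient, and the Leaky ReLU variant commonly used to mitigate dying units (the function names here are illustrative, not tied to any particular library):

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x): negatives are zeroed, positives pass through.
    return np.maximum(0.0, x)

def relu_grad(x):
    # The derivative is 1 for x > 0 and 0 otherwise (using the convention
    # that the derivative at x = 0 is 0), so the backward pass is just
    # an elementwise mask.
    return (x > 0).astype(x.dtype)

def leaky_relu(x, alpha=0.01):
    # A common mitigation for dying ReLU: keep a small slope (alpha)
    # in the negative region so the gradient there is never exactly zero.
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))        # [0.   0.   0.   0.5  2.  ]
print(relu_grad(x))   # [0.   0.   0.   1.   1.  ]
print(leaky_relu(x))  # [-0.02  -0.005  0.    0.5   2.   ]
```

Because the mask `(x > 0)` is all the backward pass needs, ReLU is markedly cheaper than sigmoid or tanh, whose derivatives involve exponentials.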