SELU Activation
An activation function designed to support self-normalizing network behavior.
SELU (Scaled Exponential Linear Unit) aims to keep activations at approximately zero mean and unit variance across layers, provided the network uses appropriate weight initialization (e.g., LeCun normal) and a compatible architecture. This self-normalizing property is meant to reduce distribution drift and unstable training dynamics in very deep networks without relying on explicit normalization layers. Although it has not become a universal standard, SELU represents an important line of research in self-normalizing network design. It shows that activation functions are not only about introducing nonlinearity, but also about shaping training dynamics.
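The function itself is a scaled variant of the exponential linear unit. A minimal sketch in plain Python, using the constants derived in the original self-normalizing networks paper (Klambauer et al., 2017):

```python
import math

# Constants from Klambauer et al. (2017), chosen analytically so that
# activations converge toward zero mean and unit variance.
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x: float) -> float:
    """Scaled Exponential Linear Unit: scale * x for x > 0,
    scale * alpha * (exp(x) - 1) otherwise."""
    if x > 0:
        return SCALE * x
    return SCALE * ALPHA * (math.exp(x) - 1.0)
```

Note that the negative branch saturates at `-SCALE * ALPHA` (about -1.758), which bounds how far activations can drift below zero.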
