Technical Glossary › Deep Learning

Neural Tangent Kernel

A theoretical framework that connects the training dynamics of very wide neural networks with kernel methods.

The Neural Tangent Kernel (NTK) is a theoretical framework that helps explain why very wide neural networks can behave in surprisingly predictable ways during training. The kernel is defined by the inner product of the network's parameter gradients evaluated at pairs of inputs. In the infinite-width limit, this kernel stays essentially constant throughout training, so the learning dynamics of the network resemble those of a specific kernel machine. This idea makes deep learning tractable not only empirically but also mathematically, and it plays a central role in research on width, optimization, and generalization.
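To make the definition concrete, the empirical NTK of a finite network can be computed directly: take the gradient of the output with respect to all parameters at each input, and form the inner product. The sketch below, a simplified illustration rather than a full NTK-parameterized setup, does this analytically for a hypothetical one-hidden-layer network f(x) = v·tanh(Wx):

```python
import numpy as np

rng = np.random.default_rng(0)

d, h = 3, 512  # input dimension, hidden width (wider => kernel more stable)
# Illustrative 1/sqrt(fan-in) scaling folded into the initialization.
W = rng.normal(size=(h, d)) / np.sqrt(d)
v = rng.normal(size=h) / np.sqrt(h)

def param_grad(x):
    """Gradient of the scalar output f(x) = v @ tanh(W @ x) w.r.t. all parameters."""
    a = np.tanh(W @ x)
    dv = a                            # df/dv
    dW = np.outer(v * (1 - a**2), x)  # df/dW via the chain rule
    return np.concatenate([dv, dW.ravel()])

def ntk(x1, x2):
    """Empirical NTK: inner product of parameter gradients at two inputs."""
    return param_grad(x1) @ param_grad(x2)

x1, x2 = rng.normal(size=d), rng.normal(size=d)
k12, k11 = ntk(x1, x2), ntk(x1, x1)
```

The resulting kernel is symmetric and positive on the diagonal, as any valid kernel must be; the theoretical result is that as the width h grows, this random kernel concentrates around a deterministic limit and stops changing during training.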