Gradient Descent

A fundamental optimization method that updates parameters in the opposite direction of the gradient to minimize a loss function.

Gradient descent is one of the most fundamental optimization methods in machine learning and deep learning. Its goal is to update model parameters step by step so that the loss function decreases. To do this, it evaluates the gradient, which describes the local slope of the loss surface, and steps in the opposite direction, the direction of steepest descent. Although the logic is simple, practical performance depends strongly on the learning rate, local minima, the geometry of the loss surface, and noise in the data. Even so, gradient descent remains the backbone of modern model training.
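
To make the update rule concrete, the basic step is θ ← θ − η ∇L(θ), where η is the learning rate. Below is a minimal Python sketch of that rule applied to a simple one-dimensional loss; the choice of loss function, starting point, learning rate, and number of steps are illustrative assumptions, not part of the text above.

```python
# Minimal gradient descent on L(theta) = (theta - 3)^2,
# whose gradient is dL/dtheta = 2 * (theta - 3).
# The minimum of this loss is at theta = 3.

def grad(theta):
    return 2.0 * (theta - 3.0)

theta = 0.0          # initial parameter value (arbitrary starting point)
learning_rate = 0.1  # step size, eta

for step in range(50):
    # Move against the gradient: theta <- theta - eta * dL/dtheta
    theta -= learning_rate * grad(theta)

print(theta)  # converges toward 3.0
```

With this quadratic loss, each step shrinks the distance to the minimum by a constant factor, so the iterate converges toward θ = 3; with too large a learning rate the same loop would overshoot and diverge, which illustrates the learning-rate sensitivity mentioned above.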