BFGS

A quasi-Newton optimization method that approximates second-order information to achieve efficient convergence.

BFGS is a quasi-Newton optimization algorithm that preserves much of the strength of Newton's method while avoiding the cost of computing the Hessian. Instead of evaluating exact second-order information, it maintains an approximation to the inverse Hessian, updated at each step with a rank-two correction built from the difference of successive gradients. This accelerates convergence well beyond plain gradient descent and makes BFGS highly effective for smooth, medium-scale problems; it performs well in statistical estimation, engineering optimization, and some classical machine learning problems. BFGS is a successful example of the principle: if exact second-order information is too expensive, use a strong approximation.
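
A minimal sketch of the idea in Python, assuming a generic smooth objective. It applies the standard BFGS rank-two update of the inverse-Hessian approximation H, driven by the step s = x_{k+1} - x_k and the gradient change y = ∇f(x_{k+1}) - ∇f(x_k). The Rosenbrock objective and the backtracking line-search constants are illustrative choices, not part of the original text.

```python
import numpy as np

def rosenbrock(x):
    # Classic smooth test objective (illustrative choice).
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def bfgs(f, grad, x0, tol=1e-8, max_iter=500):
    x = x0.astype(float)
    n = x.size
    I = np.eye(n)
    H = np.eye(n)               # inverse-Hessian approximation, start at identity
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g              # quasi-Newton search direction
        # Backtracking line search (Armijo condition); constants are illustrative.
        t, c, beta = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + t * p) > fx + c * t * (g @ p):
            t *= beta
        x_new = x + t * p
        g_new = grad(x_new)
        s = x_new - x           # step taken
        y = g_new - g           # change in gradient
        sy = y @ s
        if sy > 1e-12:          # curvature condition keeps H positive definite
            rho = 1.0 / sy
            # Standard BFGS rank-two update of the inverse Hessian.
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

x_star = bfgs(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
print(x_star)  # converges to the minimizer near [1, 1]
```

In practice one rarely hand-rolls this loop: SciPy, for example, exposes the same algorithm via scipy.optimize.minimize(f, x0, method="BFGS").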