
Parameters and Hyperparameters

The core difference between internal values learned from data and external settings that shape the training process.

Parameters are the numerical values a model learns from data during training, such as the weights and bias terms in a neural network. Hyperparameters, by contrast, are settings chosen externally before training begins, such as the learning rate, batch size, number of layers, or regularization strength. The distinction matters because parameters determine the model’s internal behavior, while hyperparameters govern how that behavior is learned. Put simply, parameters capture what the model learns; hyperparameters control how it learns it. Building a strong AI system therefore requires not only good data but also disciplined hyperparameter design and tuning.
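The distinction can be made concrete with a minimal sketch: fitting a line y = 2x + 1 by gradient descent in plain Python. The learning rate and epoch count here are illustrative hyperparameters chosen up front; the weight `w` and bias `b` are the parameters the training loop learns from the data.

```python
# Hyperparameters: chosen externally, before training begins.
LEARNING_RATE = 0.05
EPOCHS = 500

# Training data sampled from the target function y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]

# Parameters: initialized arbitrarily, then learned from the data.
w, b = 0.0, 0.0

for _ in range(EPOCHS):
    # Gradients of the mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    # The learning rate (a hyperparameter) scales each parameter update.
    w -= LEARNING_RATE * grad_w
    b -= LEARNING_RATE * grad_b

print(round(w, 2), round(b, 2))  # converges toward w ≈ 2.0, b ≈ 1.0
```

Changing `LEARNING_RATE` or `EPOCHS` changes how (and whether) training converges, but the data alone determines the final values of `w` and `b`.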