Technical Glossary: Data Science and Data Management
Normalization
The process of bringing numerical variables to a defined scale to make them more suitable for modeling and comparison.
Normalization places numerical variables with different ranges onto a common scale so that model learning is more balanced. Distance-based algorithms, neural networks, and other optimization-driven models are sensitive to scale differences: a feature with a large magnitude can dominate others purely because of its scale, not its importance. For this reason, techniques such as min-max normalization are widely used. Still, the decision to normalize should always take into account the data distribution and the type of model being used.
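As a minimal sketch of the min-max technique mentioned above (the helper name and sample values are illustrative, not from the glossary), each value is shifted by the feature's minimum and divided by its range, mapping the feature onto [0, 1]:

```python
def min_max_normalize(values):
    """Rescale a list of numbers to the [0, 1] range via min-max normalization."""
    lo, hi = min(values), max(values)
    if hi == lo:
        # Constant feature: no spread to rescale, so avoid division by zero.
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

# Features with very different magnitudes end up on the same scale,
# so neither dominates a distance computation purely by magnitude.
ages = [18, 30, 45, 60]          # range of tens
incomes = [20000, 35000, 90000]  # range of tens of thousands
print(min_max_normalize(ages))
print(min_max_normalize(incomes))
```

After this transformation both features lie in [0, 1], so a distance-based model weighs them comparably; note, however, that min-max scaling is sensitive to outliers, since a single extreme value stretches the denominator.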
