Technical Glossary: Mathematics, Statistics and Optimization
Mutual Information
A concept that measures how much knowing one variable reduces uncertainty about another.
Mutual information quantifies how much information two variables share. If knowing one variable makes it easier to predict the other, they have high mutual information. Unlike correlation, it is not limited to linear relationships: it can capture any form of statistical dependency, and it is zero exactly when the two variables are independent. This makes it useful for feature selection, for computing information gain in decision trees, and for dependence analysis between distributions. In AI, it is especially valuable for assessing how much useful information a feature truly carries about a target.
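As an illustration of the idea, here is a minimal sketch that estimates mutual information (in bits) from paired categorical samples by comparing the observed joint distribution with the product of the marginals. The function name and example data are for illustration only.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Estimate mutual information (in bits) between two paired sample sequences.

    Computes sum over (x, y) of p(x, y) * log2(p(x, y) / (p(x) * p(y))),
    using empirical frequencies as the probability estimates.
    """
    n = len(xs)
    px = Counter(xs)            # marginal counts of x
    py = Counter(ys)            # marginal counts of y
    pxy = Counter(zip(xs, ys))  # joint counts of (x, y) pairs
    mi = 0.0
    for (x, y), count in pxy.items():
        p_xy = count / n
        p_x = px[x] / n
        p_y = py[y] / n
        mi += p_xy * math.log2(p_xy / (p_x * p_y))
    return mi

# y is fully determined by x, so MI equals the entropy of x (1 bit here)
xs = [0, 0, 1, 1] * 25
ys = [1 - x for x in xs]
print(mutual_information(xs, ys))  # -> 1.0

# a pairing where the empirical joint factorizes: MI is 0
zs = [0, 1, 0, 1] * 25
print(mutual_information(xs, zs))  # -> 0.0
```

The deterministic pairing yields the maximum possible value (the entropy of either variable), while the factorized pairing yields zero, showing how mutual information ranges from "no shared information" to "one variable fully determines the other."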
