Eigenvalue and Eigenvector

Vectors that preserve their direction under a linear transformation, along with the associated scaling factors.

An eigenvector is a nonzero vector whose direction is preserved by the linear transformation a matrix defines: the transformation only scales it, and the eigenvalue is that scaling factor, so that Av = λv. Although these concepts may seem abstract, they are highly practical in data science and machine learning. In methods such as PCA, the eigendecomposition of the covariance matrix identifies the directions that capture the most variance in the data. These ideas are useful for understanding system behavior, data orientation, and the effects of transformations.
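The defining relation Av = λv can be checked numerically. Below is a minimal sketch using NumPy's `numpy.linalg.eig`; the matrix `A` is a made-up example, not data from the text:

```python
import numpy as np

# A small symmetric example matrix (hypothetical, chosen for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining relation A v = lambda v for each eigenpair.
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    lam = eigenvalues[i]
    assert np.allclose(A @ v, lam * v)

print(eigenvalues)
```

In a PCA setting, the same call would be applied to the covariance matrix of the data: the eigenvector with the largest eigenvalue points along the direction of greatest variance.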