Eigenvalues and Eigenvectors
What You’ll Learn
Eigenvalues and eigenvectors reveal the “hidden structure” of a linear transformation. They describe the special directions in which the transformation simply scales, without rotating or distorting.
The Concept
Let $A$ be a square matrix. A non-zero vector $\mathbf{v}$ is an eigenvector of $A$ if:

$A\mathbf{v} = \lambda\mathbf{v}$

where $\lambda$ (lambda) is a scalar called the eigenvalue.
In plain terms: when you apply $A$ to an eigenvector, the result is just a scaled version of that same vector. The direction stays the same (or flips if $\lambda$ is negative). Only the length changes.
Finding Eigenvalues
Solve the characteristic equation:

$\det(A - \lambda I) = 0$

This gives you the possible $\lambda$ values. Then for each $\lambda$, solve $(A - \lambda I)\mathbf{v} = \mathbf{0}$ to find the eigenvectors.
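In practice, a numerical library handles both steps at once. A minimal NumPy sketch (the diagonal matrix here is illustrative, matching the 2× and 0.5× scaling described in the diagram below):

```python
import numpy as np

# Illustrative matrix: scales x by 2 and y by 0.5.
A = np.array([[2.0, 0.0],
              [0.0, 0.5]])

# np.linalg.eig returns (eigenvalues, eigenvectors); column i of the
# eigenvector matrix pairs with eigenvalue i.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)

# Verify the defining property A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```

Each column of `eigenvectors` satisfies $A\mathbf{v} = \lambda\mathbf{v}$ for its paired eigenvalue, which the loop checks directly.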
What Eigenvalues Tell You
Section titled “What Eigenvalues Tell You”- : the vector is unchanged
- : the vector is stretched to double length
- : the vector is flipped (reversed direction)
- : the vector is squashed to zero (the transformation collapses that direction)
In the diagram: the matrix $A = \begin{pmatrix} 2 & 0 \\ 0 & 0.5 \end{pmatrix}$ scales the x-direction by 2 (blue eigenvector doubles in length) and the y-direction by 0.5 (green eigenvector shrinks to half). The dashed arrows show the original vectors, the solid arrows show the result after applying $A$. Every other vector would get rotated and distorted, but eigenvectors just scale cleanly.
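You can see the same contrast numerically: an eigenvector just scales, while other vectors change direction. A short sketch using the diagram's 2× / 0.5× scaling matrix:

```python
import numpy as np

# The scaling described in the diagram: x-direction by 2, y-direction by 0.5.
A = np.array([[2.0, 0.0],
              [0.0, 0.5]])

e1 = np.array([1.0, 0.0])   # an eigenvector (along x)
w  = np.array([1.0, 1.0])   # not an eigenvector

Ae1 = A @ e1                # [2, 0]: same direction, just doubled
Aw  = A @ w                 # [2, 0.5]: the direction has changed
print(Ae1, Aw)
```

`Ae1` stays on the x-axis (pure scaling), while `Aw` is no longer parallel to `w`, so `w` fails the eigenvector test.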
Worked Examples
Example 1: Diagonal Matrix
Take $A = \begin{pmatrix} a & 0 \\ 0 & b \end{pmatrix}$.
For $\mathbf{e}_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$: $A\mathbf{e}_1 = a\,\mathbf{e}_1$, so $\lambda_1 = a$.
For $\mathbf{e}_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$: $A\mathbf{e}_2 = b\,\mathbf{e}_2$, so $\lambda_2 = b$.
Diagonal matrices make eigenvalues obvious: they’re just the diagonal entries.
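A quick check of this fact in NumPy (the entries 3 and 5 are illustrative, not from the text):

```python
import numpy as np

# Eigenvalues of a diagonal matrix are exactly its diagonal entries,
# and the standard basis vectors are its eigenvectors.
D = np.diag([3.0, 5.0])
vals, vecs = np.linalg.eig(D)
print(vals)
```

The solver returns the diagonal entries themselves, as the worked example predicts.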
Example 2: Finding Eigenvalues of a 2x2
For a general $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$, the characteristic equation is:

$\det(A - \lambda I) = \lambda^2 - (a + d)\lambda + (ad - bc) = 0$

Eigenvalues: $\lambda = \dfrac{(a + d) \pm \sqrt{(a + d)^2 - 4(ad - bc)}}{2}$, i.e. the roots of a quadratic in the trace and determinant of $A$.
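The trace-and-determinant formula can be checked against the built-in eigensolver. A sketch with a hypothetical matrix (the values below are assumptions for illustration):

```python
import numpy as np

# Hypothetical 2x2 matrix for illustration.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Characteristic polynomial: lambda^2 - trace*lambda + det = 0
trace = np.trace(A)                     # 4 + 3 = 7
det = np.linalg.det(A)                  # 4*3 - 1*2 = 10
roots = np.roots([1.0, -trace, det])    # roots of lambda^2 - 7*lambda + 10
print(sorted(roots))

# The quadratic's roots agree with the eigensolver.
assert np.allclose(sorted(roots), sorted(np.linalg.eig(A)[0]))
```

Here the quadratic factors as $(\lambda - 5)(\lambda - 2)$, so both methods give eigenvalues 2 and 5.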
Example 3: Geometric Meaning
If $\lambda > 1$: the eigenvector direction gets stretched.
If $0 < \lambda < 1$: it gets compressed.
If $\lambda < 0$: it gets flipped and scaled.
The three panels show what different eigenvalues do to a vector: $\lambda = 2$ doubles its length, $\lambda = 0.5$ halves it, and $\lambda = -1$ flips it to point the opposite direction.
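The three panel behaviors can be reproduced in a few lines (the matrices below are illustrative, each built to have the stated eigenvalue along x):

```python
import numpy as np

v = np.array([1.0, 0.0])          # an eigenvector along x in every case below
for lam in (2.0, 0.5, -1.0):
    A = np.diag([lam, 1.0])       # a matrix with eigenvalue lam in the x-direction
    print(lam, A @ v)             # stretched, compressed, or flipped copy of v
```

Each result is simply `lam * v`: doubled for $\lambda = 2$, halved for $\lambda = 0.5$, and reversed for $\lambda = -1$.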
Real-World Application
Eigenvalues and eigenvectors are used everywhere:
- Machine learning: PCA (Principal Component Analysis) finds the eigenvectors of the covariance matrix to identify the most important directions in data
- Google’s PageRank: the dominant eigenvector of the web’s link matrix determines page importance
- Game development: stability analysis in physics simulations, animation compression, vibration modes
- Quantum mechanics: observable quantities are eigenvalues of operators
Example: In character animation, eigen-decomposition compresses large motion capture datasets while preserving the most important movements. The eigenvectors with the largest eigenvalues capture the dominant motion patterns.
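The PCA idea mentioned above can be sketched in a few lines of NumPy. This is a minimal illustration on synthetic data, not the text's implementation; the random seed and the 5×/1× stretch are assumptions:

```python
import numpy as np

# Synthetic 2-D data, deliberately stretched along the x-axis.
rng = np.random.default_rng(0)
data = rng.normal(size=(500, 2)) * np.array([5.0, 1.0])

# PCA core step: eigendecompose the covariance matrix.
cov = np.cov(data.T)                  # 2x2 covariance matrix
vals, vecs = np.linalg.eigh(cov)      # eigh: solver for symmetric matrices
principal = vecs[:, np.argmax(vals)]  # eigenvector with the largest eigenvalue
print(principal)                      # roughly +/-[1, 0]: the stretch direction
```

The eigenvector paired with the largest eigenvalue recovers the direction of greatest variance, which is exactly the "most important direction" PCA keeps.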