Tutorial
Understanding Eigenvalues and Eigenvectors: A Practical Guide
Published 15 March 2026 · 10 min read
Eigenvalues and eigenvectors are among the most important concepts in linear algebra. They reveal the fundamental behaviour of linear transformations, showing which directions are preserved when a matrix acts on a vector. Far from being purely abstract, eigenvalues and eigenvectors power real-world applications from Google's PageRank algorithm to principal component analysis in data science.
This guide explains what eigenvalues and eigenvectors are, walks through the computation process for 2x2 and 3x3 matrices, and explores their practical applications. Every step is shown with clear mathematical notation so you can follow along and build genuine understanding.
What Are Eigenvalues and Eigenvectors?
When a matrix A multiplies a vector v, the result is usually a new vector pointing in a different direction. However, for certain special vectors, the matrix simply scales them without changing their direction. These special vectors are called eigenvectors, and the scaling factors are called eigenvalues.
The Eigenvalue Equation
The defining equation is:
Av = λv
where:
- A is an n × n square matrix
- v is a non-zero eigenvector
- λ (lambda) is the corresponding eigenvalue
In words: multiplying the matrix A by the eigenvector v produces the same result as multiplying v by the scalar λ. The vector's direction is unchanged; only its magnitude is scaled.
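You can verify the eigenvalue equation numerically. The matrix, eigenvector, and eigenvalue below are an illustrative choice, not one of this article's worked examples:

```python
import numpy as np

# Illustrative example: for this matrix, [1, 1] is an eigenvector
# with eigenvalue 3 -- multiplying by A just scales it by 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])   # an eigenvector of A
lam = 3.0                  # the corresponding eigenvalue

print(A @ v)        # [3. 3.]
print(lam * v)      # [3. 3.]
print(np.allclose(A @ v, lam * v))  # True
```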
How to Find Eigenvalues
To find eigenvalues, we rearrange the eigenvalue equation and solve the resulting polynomial:
- Start with Av = λv and rearrange: (A − λI)v = 0
- For non-trivial solutions, the matrix A − λI must be singular, meaning its determinant is zero: det(A − λI) = 0
- This produces the characteristic equation, a polynomial in λ. Solve it to find the eigenvalues.
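For a 2x2 matrix, det(A − λI) = 0 expands to the quadratic λ² − tr(A)λ + det(A) = 0, so the eigenvalues are its two roots. A quick sketch of these steps, using an illustrative matrix:

```python
import numpy as np

# Illustrative 2x2 matrix (not one of the article's examples)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Characteristic polynomial: lambda^2 - tr(A)*lambda + det(A)
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]  # lambda^2 - 4*lambda + 3

eigenvalues = np.roots(coeffs)  # roots of the characteristic equation: 1 and 3
print(sorted(eigenvalues))
```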
Example 1: 2x2 Matrix
Find the eigenvalues and eigenvectors of a 2x2 matrix A.
Step 1: Find the eigenvalues
Set up the characteristic equation det(A − λI) = 0 and expand it into a quadratic in λ.
Solving the quadratic gives the two eigenvalues, λ₁ and λ₂.
Step 2: Find the eigenvectors
For λ = λ₁, solve (A − λ₁I)v = 0:
The first row gives a linear relation between the two components of v; any non-zero vector satisfying it is an eigenvector for λ₁ (eigenvectors are determined only up to scale).
For λ = λ₂, solve (A − λ₂I)v = 0:
Again the first row fixes the ratio of the components, giving an eigenvector for λ₂.
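To check a hand computation like this one, NumPy's eig returns eigenvalues and unit eigenvectors together. The matrix below is an illustrative stand-in, not the article's example:

```python
import numpy as np

# Illustrative 2x2 matrix; its eigenvalues are 5 and 2
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each COLUMN of `eigenvectors` is a unit eigenvector matching the
# eigenvalue at the same index; verify A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(lam, np.allclose(A @ v, lam * v))
```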
Example 2: 3x3 Matrix
Find the eigenvalues of a diagonal matrix D = diag(d₁, d₂, d₃).
For diagonal matrices, the eigenvalues are simply the diagonal entries, because the characteristic equation factors immediately:
(d₁ − λ)(d₂ − λ)(d₃ − λ) = 0
The eigenvalues are d₁, d₂, and d₃, and the corresponding eigenvectors are the standard basis vectors e₁, e₂, e₃.
Now consider a non-diagonal 3x3 matrix that is upper triangular (all entries below the main diagonal are zero).
Since the matrix is upper triangular, the eigenvalues are still the diagonal entries: the characteristic polynomial is the product of the factors (aᵢᵢ − λ).
A diagonal entry that appears twice gives an eigenvalue with algebraic multiplicity 2; an entry that appears once gives an eigenvalue with algebraic multiplicity 1.
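The triangular-matrix shortcut is easy to confirm numerically. The matrix below is an illustrative choice with a repeated diagonal entry:

```python
import numpy as np

# Illustrative upper triangular 3x3 matrix with a repeated diagonal entry
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 3.0],
              [0.0, 0.0, 5.0]])

eigenvalues = np.linalg.eig(A)[0]
# 2 appears twice (algebraic multiplicity 2) and 5 once (multiplicity 1),
# exactly the diagonal entries of A.
print(sorted(eigenvalues))
```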
Key Properties
- Trace: The sum of the eigenvalues equals the trace (sum of diagonal elements) of the matrix: λ₁ + λ₂ + … + λₙ = tr(A).
- Determinant: The product of the eigenvalues equals the determinant: λ₁ λ₂ … λₙ = det(A).
- Invertibility: A matrix is invertible if and only if none of its eigenvalues are zero.
- Symmetric matrices: Real symmetric matrices always have real eigenvalues, and eigenvectors belonging to distinct eigenvalues are orthogonal.
- Powers: If λ is an eigenvalue of A with eigenvector v, then λᵏ is an eigenvalue of Aᵏ with the same eigenvector.
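The trace, determinant, and power properties can all be checked in a few lines (the matrix is an illustrative choice):

```python
import numpy as np

# Illustrative matrix; its eigenvalues are 5 and 2
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lams = np.linalg.eigvals(A)

# Trace property: sum of eigenvalues equals tr(A)
print(np.isclose(lams.sum(), np.trace(A)))            # True

# Determinant property: product of eigenvalues equals det(A)
print(np.isclose(lams.prod(), np.linalg.det(A)))      # True

# Powers property: A^2 has eigenvalues lambda^2
print(np.allclose(sorted(np.linalg.eigvals(A @ A)),
                  sorted(lams ** 2)))                 # True
```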
Real-World Applications
Principal Component Analysis (PCA)
PCA is the most widely used dimensionality reduction technique in data science. It works by computing the eigenvalues and eigenvectors of the data's covariance matrix. The eigenvectors point in the directions of maximum variance, and the eigenvalues tell you how much variance each direction captures. By keeping only the top eigenvectors (those with the largest eigenvalues), you can reduce a high-dimensional dataset to fewer dimensions while preserving the most important patterns.
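A minimal PCA sketch from first principles, on a synthetic dataset: centre the data, eigendecompose the covariance matrix (eigh is used since covariance matrices are symmetric), and project onto the top eigenvector. The data and dimensions here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2D data, stretched along one axis so one direction
# carries most of the variance
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0],
                                          [0.0, 0.5]])
X -= X.mean(axis=0)                    # centre the data

cov = np.cov(X, rowvar=False)          # 2x2 covariance matrix
eigenvalues, eigenvectors = np.linalg.eigh(cov)  # symmetric -> use eigh

order = np.argsort(eigenvalues)[::-1]  # sort by variance captured, descending
components = eigenvectors[:, order]

# Keep only the top principal component (largest eigenvalue):
# the dataset is reduced from 2 dimensions to 1.
reduced = X @ components[:, :1]
print(eigenvalues[order])              # variance captured by each direction
```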
Google's PageRank Algorithm
PageRank models the web as a giant matrix where each entry represents a link between pages. The dominant eigenvector of this matrix (the one corresponding to the largest eigenvalue) gives the relative importance of each page. Pages pointed to by many important pages get higher eigenvector components, and therefore higher rankings.
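A toy version of the idea: power iteration on a hypothetical three-page link matrix. Repeated multiplication converges to the dominant eigenvector, which is the ranking. (The real PageRank also adds a damping factor, omitted here.)

```python
import numpy as np

# Hypothetical 3-page link matrix (column-stochastic): entry [i, j] is the
# probability of following a link from page j to page i.
M = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0]])

# Power iteration: start from a uniform guess and repeatedly apply M;
# the vector converges to the dominant eigenvector of M.
rank = np.ones(3) / 3
for _ in range(100):
    rank = M @ rank
    rank /= rank.sum()   # renormalise so the scores sum to 1

print(rank)              # relative importance of each page
```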
Vibration Analysis (Engineering)
In structural engineering, eigenvalues of the stiffness and mass matrices determine the natural frequencies at which a structure vibrates. Engineers use these to ensure that bridges, buildings, and aircraft do not resonate at frequencies that could cause catastrophic failure.
Quantum Mechanics
In quantum mechanics, observable quantities (like energy, momentum, and spin) are represented by operators. The eigenvalues of these operators are the possible measurement outcomes, and the eigenvectors represent the corresponding quantum states.
Common Mistakes
Forgetting to subtract lambda from every diagonal entry
When forming A − λI, subtract λ from every diagonal element. Missing one will give an incorrect characteristic equation.
Sign errors in the determinant
For 3x3 matrices, the determinant involves six terms with alternating signs. Double-check the cofactor expansion carefully.
Treating the zero vector as an eigenvector
By definition, eigenvectors must be non-zero. The equation Av = λv holds for v = 0 with any λ whatsoever, so the zero vector is excluded.
Confusing algebraic and geometric multiplicity
The algebraic multiplicity is how many times an eigenvalue appears as a root of the characteristic polynomial. The geometric multiplicity is the dimension of the eigenspace. They can differ for non-diagonalisable matrices.
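A classic case where the two multiplicities differ is a 2x2 shear matrix, used here as an illustrative check:

```python
import numpy as np

# Shear matrix: eigenvalue 2 is a double root of the characteristic
# polynomial (algebraic multiplicity 2), but there is only one independent
# eigenvector (geometric multiplicity 1), so A is not diagonalisable.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)            # both equal 2 -> algebraic multiplicity 2

# Geometric multiplicity = dimension of the null space of (A - 2I)
rank = np.linalg.matrix_rank(A - 2.0 * np.eye(2))
print(2 - rank)               # 1 -> geometric multiplicity
```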
Try It Yourself
Ready to compute eigenvalues and eigenvectors for your own matrices? Our eigenvalue calculator handles 2x2 and 3x3 matrices with step-by-step solutions.
Open Eigenvalue Calculator
Related Articles
How to Solve Systems of Equations: 3 Methods with Examples
Master three methods for solving systems of linear equations: substitution, elimination, and matrices. Step-by-step worked examples for each approach.
A Guide to Matrix Operations: Addition, Multiplication, and Inverses
Master matrix operations including addition, scalar multiplication, matrix multiplication, determinants, and inverses with step-by-step examples.
How to Solve Quadratic Equations: 3 Methods with Examples
Learn how to solve quadratic equations using the quadratic formula, factoring, and completing the square. Step-by-step worked examples with clear explanations.