Eigen-decomposition is typically applied to square matrices to find directions (eigenvectors) that remain unchanged (up to scaling) under the transformation represented by the matrix. For a square matrix $A$, the eigenvalue equation is $A v = \lambda v$. If $A$ is symmetric, its eigen-decomposition takes the form $A = Q \Lambda Q^T$, where $Q$ contains the orthogonal eigenvectors and $\Lambda$ is a diagonal matrix of eigenvalues.
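As a quick illustration, the minimal sketch below (the small symmetric matrix here is an arbitrary example, not from the text above) computes $Q$ and $\Lambda$ with NumPy's eigh routine and confirms that $Q \Lambda Q^T$ reconstructs the matrix:

```python
import numpy as np

# A small symmetric matrix (example values chosen arbitrarily)
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is designed for symmetric matrices; eigenvectors are the columns of Q
eigenvalues, Q = np.linalg.eigh(S)
Lambda = np.diag(eigenvalues)

# Reconstruct S as Q @ Lambda @ Q.T
print(np.allclose(S, Q @ Lambda @ Q.T))  # True
```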
Singular Value Decomposition (SVD), $A = U \Sigma V^T$, provides a related but more general way to decompose any $m \times n$ matrix $A$, whether square or rectangular. The connection between SVD and eigen-decomposition becomes clear when we consider the matrices $A^T A$ and $A A^T$. These matrices are always square and symmetric (or, more precisely, positive semi-definite), which means their eigen-decomposition is well-behaved.
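A small sketch can make this concrete (the rectangular matrix below is the same example used in the verification script later in this section). It checks that $A^T A$ is symmetric and that its eigenvalues are non-negative, i.e. that it is positive semi-definite:

```python
import numpy as np

# A rectangular example matrix
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

ATA = A.T @ A  # 3x3: square and symmetric by construction
print(np.allclose(ATA, ATA.T))                     # symmetric: True
print(np.all(np.linalg.eigvalsh(ATA) >= -1e-10))   # non-negative eigenvalues
                                                   # (up to rounding): True
```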
Let's see how the components of SVD ($U$, $\Sigma$, $V$) relate to the eigen-decomposition of these specific matrices.
Consider the matrix $A^T A$. Let's substitute the SVD of $A$ into this expression:

$$A^T A = (U \Sigma V^T)^T (U \Sigma V^T)$$

Now, expand the transpose and multiply:

$$A^T A = V \Sigma^T U^T U \Sigma V^T$$

Since $U$ is an orthogonal matrix, $U^T U = I$ (the identity matrix). The expression simplifies:

$$A^T A = V (\Sigma^T \Sigma) V^T$$
Look closely at this result: $A^T A = V (\Sigma^T \Sigma) V^T$. This is exactly the form of an eigen-decomposition $Q \Lambda Q^T$ for the symmetric matrix $A^T A$, with $V$ playing the role of $Q$ and $\Sigma^T \Sigma$ playing the role of $\Lambda$.
What are the diagonal entries of $\Sigma^T \Sigma$? If $\Sigma$ has the singular values $\sigma_1, \sigma_2, \dots, \sigma_r$ on its diagonal (where $r$ is the rank of $A$), then $\Sigma^T \Sigma$ is an $n \times n$ diagonal matrix with $\sigma_1^2, \sigma_2^2, \dots, \sigma_r^2$ followed by zeros on its diagonal.
So, the right singular vectors of $A$ (the columns of $V$) are the eigenvectors of $A^T A$. The squared singular values of $A$ ($\sigma_i^2$) are the eigenvalues of $A^T A$.
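We can check this identity numerically. The sketch below (using the same example matrix as the verification script later in this section) builds the full rectangular $\Sigma$ from the singular values and confirms that $V (\Sigma^T \Sigma) V^T$ equals $A^T A$:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])  # 2x3

U, s, Vh = np.linalg.svd(A)      # U: 2x2, s: (2,), Vh: 3x3
V = Vh.T

# Build the full 2x3 Sigma matrix from the singular values
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# Sigma.T @ Sigma is 3x3 diagonal: sigma_i^2 followed by zeros
print(np.round(Sigma.T @ Sigma, 4))
print(np.allclose(A.T @ A, V @ (Sigma.T @ Sigma) @ V.T))  # True
```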
Now let's perform a similar analysis for the matrix $A A^T$:

$$A A^T = (U \Sigma V^T)(U \Sigma V^T)^T = U \Sigma V^T V \Sigma^T U^T$$
Since $V$ is also an orthogonal matrix, $V^T V = I$. The expression simplifies:

$$A A^T = U (\Sigma \Sigma^T) U^T$$
Again, this result is exactly the form of an eigen-decomposition for the symmetric matrix $A A^T$.
The diagonal entries of $\Sigma \Sigma^T$ are also the squared singular values $\sigma_1^2, \sigma_2^2, \dots, \sigma_r^2$, followed by zeros (this time it's an $m \times m$ matrix).
So, the left singular vectors of $A$ (the columns of $U$) are the eigenvectors of $A A^T$. The squared singular values of $A$ ($\sigma_i^2$) are also the eigenvalues of $A A^T$.
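The mirrored identity can be checked the same way (same example matrix as above):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])  # 2x3

U, s, Vh = np.linalg.svd(A)

# Build the full 2x3 Sigma, so Sigma @ Sigma.T is 2x2 diagonal
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

print(np.allclose(A @ A.T, U @ (Sigma @ Sigma.T) @ U.T))  # True
```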
| SVD Component ($A = U \Sigma V^T$) | Relationship to Eigen-decomposition |
|---|---|
| Singular Values ($\sigma_i$) | Square roots of the non-zero eigenvalues of $A^T A$ (and $A A^T$) |
| Right Singular Vectors ($V$) | Eigenvectors of $A^T A$ |
| Left Singular Vectors ($U$) | Eigenvectors of $A A^T$ |
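The first row of the table can be verified directly: taking square roots of the non-zero eigenvalues of $A^T A$ recovers the singular values. A minimal sketch, again with the same example matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

s = np.linalg.svd(A, compute_uv=False)       # singular values, descending
eigvals = np.linalg.eigvalsh(A.T @ A)[::-1]  # eigenvalues, now descending

# Keep the non-zero eigenvalues (within tolerance) and take square roots
nonzero = eigvals[eigvals > 1e-10]
print(np.allclose(s, np.sqrt(nonzero)))  # True
```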
This relationship is significant for several reasons:

- It explains why singular values are always real and non-negative: they are square roots of the eigenvalues of the positive semi-definite matrix $A^T A$.
- It extends the benefits of symmetric eigen-decomposition (real eigenvalues, orthogonal eigenvectors) to any matrix, square or rectangular.
- It provides a practical way to compute or verify an SVD using well-understood symmetric eigen-solvers.
Let's verify this relationship using Python's NumPy library. We'll create a sample matrix $A$, compute its SVD, and then compute the eigen-decompositions of $A^T A$ and $A A^T$ to see if the components match as expected.
```python
import numpy as np
# Set print options for better readability
np.set_printoptions(suppress=True, precision=4)
# Create a sample non-square matrix A
A = np.array([[1, 2, 3],
              [4, 5, 6]])  # A is 2x3
print("Matrix A:\n", A)
# 1. Compute SVD of A
# U will be 2x2, s will contain 2 singular values, Vh (V.T) will be 3x3
U, s, Vh = np.linalg.svd(A)
V = Vh.T # Transpose Vh to get V
Sigma_squared = s**2
print("\nSVD Results:")
print("U (Left Singular Vectors):\n", U)
print("Singular Values (s):", s)
print("Sigma^2 (Squared Singular Values):", Sigma_squared)
print("V (Right Singular Vectors):\n", V)
# 2. Compute Eigen-decomposition of A.T @ A (A^T A)
ATA = A.T @ A # ATA is 3x3
print("\nMatrix A^T A:\n", ATA)
# Use eigh for symmetric matrices (returns sorted eigenvalues/vectors)
eigenvalues_ATA, eigenvectors_ATA = np.linalg.eigh(ATA)
# Eigenvalues from eigh are sorted smallest to largest, reverse them
eigenvalues_ATA = eigenvalues_ATA[::-1]
eigenvectors_ATA = eigenvectors_ATA[:, ::-1]
print("\nEigen-decomposition of A^T A:")
print("Eigenvalues (should match Sigma^2):", eigenvalues_ATA)
print("Eigenvectors (should match columns of V):\n", eigenvectors_ATA)
# 3. Compute Eigen-decomposition of A @ A.T (A A^T)
AAT = A @ A.T # AAT is 2x2
print("\nMatrix A A^T:\n", AAT)
eigenvalues_AAT, eigenvectors_AAT = np.linalg.eigh(AAT)
# Reverse sort order for consistency
eigenvalues_AAT = eigenvalues_AAT[::-1]
eigenvectors_AAT = eigenvectors_AAT[:, ::-1]
print("\nEigen-decomposition of A A^T:")
print("Eigenvalues (should match Sigma^2):", eigenvalues_AAT)
print("Eigenvectors (should match columns of U):\n", eigenvectors_AAT)
# Verification Checks (allowing for small numerical differences)
print("\nVerification:")
# Check eigenvalues
print("Sigma^2 matches eigenvalues of A^T A (approx):", np.allclose(Sigma_squared, eigenvalues_ATA[:len(s)]))
print("Sigma^2 matches eigenvalues of A A^T (approx):", np.allclose(Sigma_squared, eigenvalues_AAT[:len(s)]))
# Check eigenvectors (Note: Eigenvectors are unique only up to sign)
# Check if columns of V match eigenvectors_ATA (or their negative)
print("Columns of V match eigenvectors of A^T A (approx):",
np.allclose(np.abs(V[:, :len(s)]), np.abs(eigenvectors_ATA[:, :len(s)])))
# Check if columns of U match eigenvectors_AAT (or their negative)
print("Columns of U match eigenvectors of A A^T (approx):",
np.allclose(np.abs(U), np.abs(eigenvectors_AAT)))
Running this code demonstrates the connection: the squared singular values from the SVD match the eigenvalues of $A^T A$ and $A A^T$, and the columns of $V$ and $U$ match the corresponding eigenvectors (up to sign).
Remember that eigenvectors (and singular vectors) are unique only up to a sign (if $v$ is an eigenvector, then so is $-v$). So, when comparing the columns of $V$ and the eigenvectors of $A^T A$, you might find that some corresponding columns are negatives of each other. This is perfectly normal, and comparing absolute values with np.allclose handles it.
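If you want an exact comparison rather than an absolute-value one, one option is to flip the signs explicitly before comparing. This is a small sketch of that idea, repeating the relevant computations so it runs standalone:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

U, s, Vh = np.linalg.svd(A)
V = Vh.T

eigvals, eigvecs = np.linalg.eigh(A.T @ A)
eigvecs = eigvecs[:, ::-1]  # descending eigenvalue order to match the SVD

# Align each eigenvector's sign with the corresponding column of V:
# if the dot product of matching columns is negative, flip the eigenvector.
k = len(s)
signs = np.sign(np.sum(V[:, :k] * eigvecs[:, :k], axis=0))
aligned = eigvecs[:, :k] * signs

print(np.allclose(V[:, :k], aligned))  # True
```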
This deep connection between SVD and eigen-decomposition solidifies SVD as a fundamental tool. It uses the well-understood properties of symmetric matrix eigen-decomposition to provide a powerful decomposition for any matrix, with direct applications in understanding data structure, dimensionality reduction, and more.