While solving the characteristic equation det(A−λI)=0 for the eigenvalues λ and then the system (A−λI)x=0 for the eigenvectors x works on paper, performing these calculations manually becomes impractical for matrices larger than 2x2 or 3x3. In machine learning contexts, we often deal with high-dimensional data, resulting in large matrices. Fortunately, numerical libraries like NumPy provide efficient and robust functions for these computations.

The primary tool in NumPy for eigenvalue and eigenvector calculation is the numpy.linalg.eig function. It takes a square matrix as input and returns its eigenvalues and corresponding right eigenvectors.
numpy.linalg.eig
Let's see how to use np.linalg.eig. It takes a single argument, the square matrix A for which we want to find eigenvalues and eigenvectors, and returns a tuple of two NumPy arrays:
- eigenvalues: a 1D NumPy array containing the eigenvalues (λ) of the matrix.
- eigenvectors: a 2D NumPy array where each column is the normalized eigenvector corresponding to the eigenvalue at the same index in the eigenvalues array.

Consider the following 2x2 matrix A:
$$A = \begin{bmatrix} 4 & 2 \\ 1 & 3 \end{bmatrix}$$

We can find its eigenvalues and eigenvectors using NumPy like this:
import numpy as np
# Define the matrix A
A = np.array([[4, 2],
              [1, 3]])
# Calculate eigenvalues and eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)
print("Matrix A:")
print(A)
print("\nEigenvalues (λ):")
print(eigenvalues)
print("\nEigenvectors (columns of the matrix below):")
print(eigenvectors)
Running this code will output:
Matrix A:
[[4 2]
[1 3]]
Eigenvalues (λ):
[5. 2.]
Eigenvectors (columns of the matrix below):
[[ 0.89442719 -0.70710678]
 [ 0.4472136   0.70710678]]
So, the eigenvalues are λ1=5 and λ2=2. The corresponding eigenvectors are the columns of the eigenvectors array: x1 ≈ [0.894, 0.447] for λ1 and x2 ≈ [-0.707, 0.707] for λ2.
Notice that NumPy returns normalized eigenvectors, meaning their L2 norm (length) is 1.
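You can confirm this normalization directly; the following minimal sketch reuses the eigenvectors array computed above:

# Each column (eigenvector) should have unit L2 norm
print(np.linalg.norm(eigenvectors, axis=0))  # expect [1. 1.]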
We can verify these results by checking if the fundamental relationship Ax=λx holds for each pair. Let's check for the first eigenvalue and eigenvector pair (λ1=5, x1).
# Extract the first eigenvalue and eigenvector
lambda1 = eigenvalues[0]
x1 = eigenvectors[:, 0] # First column
# Calculate Ax
Ax1 = A @ x1 # Using the matrix multiplication operator @
# Calculate λx
lambda1_x1 = lambda1 * x1
print("\nVerifying for λ =", lambda1)
print("A * x1:")
print(Ax1)
print("λ1 * x1:")
print(lambda1_x1)
# Check if they are close (due to potential floating-point errors)
print("Are Ax and λx close?", np.allclose(Ax1, lambda1_x1))
The output should confirm that Ax1 and λ1x1 are indeed very close, validating the computation:
Verifying for λ = 5.0
A * x1:
[4.47213595 2.23606798]
λ1 * x1:
[4.47213595 2.23606798]
Are Ax and λx close? True
You can perform a similar check for the second pair (λ2=2, x2).
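Here is one way that check might look, reusing the arrays computed above (a minimal sketch):

# Extract the second eigenvalue and eigenvector
lambda2 = eigenvalues[1]
x2 = eigenvectors[:, 1]  # Second column

# A @ x2 should match lambda2 * x2 up to floating-point error
print("Are Ax and λx close?", np.allclose(A @ x2, lambda2 * x2))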
Recall the eigen-decomposition A = PDP⁻¹, where P is the matrix whose columns are the eigenvectors and D is the diagonal matrix with the eigenvalues on its diagonal. The eigenvectors array returned by np.linalg.eig is precisely the matrix P, and the eigenvalues array gives us the diagonal entries of D.
We can verify the decomposition property, which is often more convenient to check in the equivalent form AP = PD because it avoids computing P⁻¹ explicitly.
# Construct the diagonal matrix D
D = np.diag(eigenvalues)
# Get the matrix P of eigenvectors
P = eigenvectors
# Calculate AP
AP = A @ P
# Calculate PD
PD = P @ D
print("\nVerifying Eigen-decomposition (AP = PD)")
print("AP:")
print(AP)
print("\nPD:")
print(PD)
print("\nAre AP and PD close?", np.allclose(AP, PD))
This verification should also return True, confirming that the outputs from np.linalg.eig are consistent with the eigen-decomposition formula.
Verifying Eigen-decomposition (AP = PD)
AP:
[[ 4.47213595 -1.41421356]
 [ 2.23606798  1.41421356]]
PD:
[[ 4.47213595 -1.41421356]
 [ 2.23606798  1.41421356]]
Are AP and PD close? True
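If you prefer to check the factorization in its original form A = PDP⁻¹, you can compute the inverse of P explicitly. A minimal sketch, assuming P is invertible (it is here, since the eigenvalues are distinct):

# Reconstruct A from its eigen-decomposition: A = P D P^(-1)
P_inv = np.linalg.inv(P)
A_reconstructed = P @ D @ P_inv

print("Does P D P^-1 reproduce A?", np.allclose(A_reconstructed, A))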
Two practical notes before moving on:

- If your matrix is symmetric, consider numpy.linalg.eigh instead. This function is optimized for symmetric (or Hermitian) matrices and guarantees real eigenvalues and orthogonal eigenvectors.
- A real matrix that is not symmetric can have complex eigenvalues and eigenvectors. numpy.linalg.eig handles this correctly, returning complex arrays if necessary.

Using NumPy's np.linalg.eig function abstracts away the complexities of solving the characteristic polynomial and the resulting linear systems, providing a powerful tool for applying eigenvalue concepts in practical machine learning tasks like Principal Component Analysis (PCA), which relies heavily on the eigen-decomposition of covariance matrices.
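As a small preview of that use case, the sketch below eigen-decomposes the covariance matrix of a tiny synthetic dataset. The data values and the names X and cov are made up for illustration; because a covariance matrix is symmetric, np.linalg.eigh is a natural fit:

# Tiny synthetic dataset: 5 samples, 2 features (illustrative values only)
X = np.array([[2.5, 2.4],
              [0.5, 0.7],
              [2.2, 2.9],
              [1.9, 2.2],
              [3.1, 3.0]])

# Covariance matrix of the features (symmetric by construction)
cov = np.cov(X, rowvar=False)

# eigh returns eigenvalues in ascending order and orthonormal eigenvectors
eigvals, eigvecs = np.linalg.eigh(cov)

# The eigenvector paired with the largest eigenvalue is the first principal direction
print("First principal direction:", eigvecs[:, -1])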