So far, we have treated matrices as arrays of numbers used in operations like multiplication. We now shift perspective to see matrices as operators that perform linear transformations. When a matrix multiplies a vector, it can stretch, shrink, or rotate it, mapping it to a new position in space.
This raises a question: for a given transformation, are there any vectors whose direction is left unchanged? The answer is found in the study of eigenvalues and eigenvectors. An eigenvector is a nonzero vector that a transformation only scales, never pointing it in a new direction. The corresponding eigenvalue is the scalar factor of that scaling. This relationship is compactly expressed by the equation:
Av = λv
Here, A is the transformation matrix, v is an eigenvector, and λ (lambda) is its corresponding eigenvalue.
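As a preview of the hands-on section later in this chapter, the defining relationship Av = λv can be checked numerically. The sketch below (a minimal example, with the matrix A chosen here purely for illustration) uses NumPy's `np.linalg.eig` to compute eigenvalue/eigenvector pairs and then verifies that multiplying each eigenvector by A is the same as scaling it by its eigenvalue:

```python
import numpy as np

# An example 2x2 transformation matrix (chosen for illustration)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]   # i-th eigenvector (a column)
    lam = eigenvalues[i]     # matching eigenvalue

    # Check the defining equation: A v = λ v
    assert np.allclose(A @ v, lam * v)

print(eigenvalues)
```

The assertion passing for every pair is exactly the statement Av = λv: the matrix acts on each eigenvector as nothing more than multiplication by a scalar.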
Throughout this chapter, we will cover the following topics:
5.1 Matrices as Linear Transformations
5.2 Defining Eigenvalues and Eigenvectors
5.3 Geometric Interpretation
5.4 The Characteristic Equation
5.5 Hands-On Practical: Finding Eigenvalues with NumPy