Certain subspaces are left essentially unchanged by a linear transformation. For a linear transformation $T: V \to V$ which maps a vector space $V$ onto itself, some subspaces of $V$, such as $W = \operatorname{span}(\vec{v})$, can be invariant, where $T(\vec{w}) \in W$ for all $\vec{w} \in W$.
The vectors $\vec{v} \in V$ which only scale under the linear transformation $T$ are called eigenvectors.
Definition
Eigenvectors & Eigenvalues (Linear Transformation) ^formula1
Let $T: V \to V$ be a linear transformation mapping the vector space $V$ (defined over the field $F$) onto itself. Then an eigenvector of $T$ is given by:
$$T(\vec{v}) = \lambda\vec{v}, \qquad \vec{v} \neq \vec{0}$$
$\vec{v}$ is an eigenvector, with its corresponding eigenvalue being $\lambda \in F$.
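For a quick illustration, take (say) the scaling transformation $T(x, y) = (2x, 3y)$ on $\mathbb{R}^2$:
$$T\begin{pmatrix}1\\0\end{pmatrix} = 2\begin{pmatrix}1\\0\end{pmatrix}, \qquad T\begin{pmatrix}0\\1\end{pmatrix} = 3\begin{pmatrix}0\\1\end{pmatrix}$$
so $(1, 0)^T$ and $(0, 1)^T$ are eigenvectors with eigenvalues $2$ and $3$ respectively.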
Since every linear transformation (on a finite-dimensional vector space) can be represented as a matrix $A$, we can solve:
$$A\vec{v} = \lambda\vec{v}$$
This leads to a matrix definition of eigenvectors:
Eigenvectors & Eigenvalues (Matrices) ^formula2
Let $A$ be an $n \times n$ matrix with entries in $\mathbb{R}$ (or $\mathbb{C}$). A scalar, $\lambda$, is an eigenvalue if there exists a non-zero column matrix, $\vec{v}$, such that:
$$A\vec{v} = \lambda\vec{v}$$
Then $\vec{v}$ is an eigenvector of the matrix $A$.
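As a numerical sanity check, a minimal NumPy sketch (with an arbitrarily chosen $2 \times 2$ matrix) can verify the defining relation $A\vec{v} = \lambda\vec{v}$ directly:

```python
import numpy as np

# Arbitrary illustrative matrix (not from the notes above)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding (normalised) eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    # Check the defining relation A v = lambda v
    assert np.allclose(A @ v, lam * v)
    print(f"lambda = {lam:.4f}, v = {v}")
```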
Obtaining eigenvectors
Obtaining the Eigenvectors from a Matrix
Rearranging $A\vec{v} = \lambda\vec{v}$ gives $(A - \lambda I)\vec{v} = \vec{0}$, so the eigenvalues are the roots of the characteristic equation $\det(A - \lambda I) = 0$, and each eigenvector is a non-zero solution of $(A - \lambda I)\vec{v} = \vec{0}$ for its eigenvalue $\lambda$.
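As a minimal worked example, using the illustrative matrix $A = \begin{pmatrix}4 & 1\\2 & 3\end{pmatrix}$ from the NumPy check above:
$$\det(A - \lambda I) = \begin{vmatrix}4-\lambda & 1\\ 2 & 3-\lambda\end{vmatrix} = (4-\lambda)(3-\lambda) - 2 = \lambda^2 - 7\lambda + 10 = (\lambda - 5)(\lambda - 2)$$
so the eigenvalues are $\lambda = 5$ and $\lambda = 2$. Solving $(A - 5I)\vec{v} = \vec{0}$ gives the eigenvector $(1, 1)^T$, and $(A - 2I)\vec{v} = \vec{0}$ gives $(1, -2)^T$, each up to a non-zero scalar multiple.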
Linear Independence
Linear Independence of Eigenvectors
If a collection of eigenvectors corresponds to distinct eigenvalues, then these eigenvectors are linearly independent. I.e. if a set of distinct eigenvectors $\{\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n\}$ corresponds to the set of distinct eigenvalues $\{\lambda_1, \lambda_2, \dots, \lambda_n\}$ such that:
$$A\vec{v}_i = \lambda_i\vec{v}_i, \qquad \lambda_i \neq \lambda_j \text{ for } i \neq j$$
Then the set of eigenvectors $\{\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n\}$ is linearly independent.
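A minimal NumPy sketch of this fact (again with an arbitrary illustrative matrix): when the eigenvalues are distinct, the matrix whose columns are the eigenvectors has full rank, which is equivalent to the eigenvectors being linearly independent.

```python
import numpy as np

# Illustrative matrix with distinct eigenvalues (5 and 2)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# The eigenvalues are pairwise distinct...
assert np.unique(np.round(eigenvalues, 8)).size == eigenvalues.size
# ...so the eigenvector columns are linearly independent,
# i.e. the matrix of eigenvectors has full rank
assert np.linalg.matrix_rank(eigenvectors) == A.shape[0]
print("Eigenvalues:", eigenvalues)
print("Rank of eigenvector matrix:", np.linalg.matrix_rank(eigenvectors))
```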