San José State University
applet-magic.com
Thayer Watkins
Silicon Valley & Tornado Alley, U.S.A.
The Linear Independence of the Eigenspaces of a Matrix |
Let M be an n×n matrix with complex elements. A complex number λ and a vector X, other than the zero vector 0, such that

MX = λX

are called, respectively, an eigenvalue and an eigenvector of M. The set of all eigenvectors associated with λ is called the eigenspace associated with that eigenvalue. Let the eigenspace of λ be denoted Λλ. Let Vλ be the null space of the matrix (M−λI), where I is the n×n identity matrix. Vλ is a vector subspace of the n-dimensional vector space. Vλ is just the eigenspace of λ with the zero vector 0 adjoined; i.e.,

Vλ = Λλ ∪ {0}
The dimension of Vλ (the geometric multiplicity of λ) is at most the multiplicity of λ as a root of the n-th degree characteristic equation

det(M−λI) = 0
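These definitions can be checked numerically. The following is a minimal sketch, assuming NumPy and a made-up 3×3 example matrix; the helper null_space is a hypothetical routine that computes a basis of the null space of M−λI from the singular value decomposition.

```python
import numpy as np

def null_space(A, tol=1e-10):
    # Basis for the null space of A: the right singular vectors whose
    # singular values are numerically zero.
    _, s, vh = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return vh[rank:].conj().T

# Hypothetical example: eigenvalue 2 with multiplicity 2, and eigenvalue 5.
M = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

lam = 2.0
V = null_space(M - lam * np.eye(3))          # a basis of V_lambda
print("dim V_lambda:", V.shape[1])           # geometric multiplicity of lambda
print("M X = lambda X holds:", np.allclose(M @ V, lam * V))
```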
Let {λj; j=1, …, m} be the distinct eigenvalues of M with {Λλj; j=1, …, m} as their associated eigenspaces. Then {Λλj; j=1, …, m} are linearly independent; that is, any collection of vectors consisting of one eigenvector from each of the eigenspaces is linearly independent.
Proof:
Let Xj be any element of Λλj for j=1, …, m. Suppose {Xj; j=1, …, m} are linearly dependent. This would mean that there exist coefficients {cj; j=1, …, m}, not all zero, such that

Σj cjXj = 0
where the summation is from j=1 to m. Some of the coefficients, but not all, might be zero. Let q be the smallest number of linearly dependent elements of the set {Xj; j=1, …, m} and let the vectors be numbered so that these linearly dependent vectors are numbered 1 through q. Then

Σj cjXj = 0

where the summation now runs from j=1 to q and, by the minimality of q, none of the coefficients cj is zero. (Note that q ≥ 2, since a single nonzero multiple of a nonzero vector cannot equal 0.)
Now multiply this equation through by M. Since MXj = λjXj for each j, the result is

Σj cjλjXj = 0
Now multiply the prior equation by any one of the λ's, say λ1, to obtain

Σj cjλ1Xj = 0
This equation may be subtracted from the previous equation to obtain

Σj cj(λj−λ1)Xj = 0
where now the summation runs from j=2 to q, since the term for j=1 cancels out.
Since the eigenvalues are distinct and none of the cj is zero, none of the coefficients cj(λj−λ1) is zero. Thus {Xj; j=2, …, q} is a linearly dependent set of only q−1 vectors, contradicting the choice of q as the smallest number of linearly dependent vectors. Therefore there is no linearly dependent set among vectors chosen from the distinct eigenspaces, and the eigenspaces are linearly independent.
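As a rough numerical check of the result (a sketch under an assumed example matrix, not part of the original argument), one can pick one eigenvector from each distinct eigenspace and confirm that the chosen vectors are linearly independent by checking the rank of the matrix having them as columns.

```python
import numpy as np

# Hypothetical example matrix with distinct eigenvalues 1, 2 and 3.
M = np.array([[1.0, 2.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(M)   # columns are eigenvectors
# Choose one eigenvector X_j per distinct eigenvalue lambda_j.
distinct = []
chosen = []
for lam, x in zip(eigenvalues, eigenvectors.T):
    if not any(np.isclose(lam, mu) for mu in distinct):
        distinct.append(lam)
        chosen.append(x)

X = np.column_stack(chosen)
# Linear independence <=> the matrix [X_1 ... X_m] has full column rank.
print("number of distinct eigenvalues:", len(distinct))
print("rank of [X_1 ... X_m]:", np.linalg.matrix_rank(X))
```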