Page 1 of 1 [ 2 posts ]
Exercise [13.31]

Joined: 22 Apr 2010, 15:52
Posts: 43
Location: Olpe, Germany
Exercise [13.31]
PREREQUISITE:
Each eigenvalue of $T$ having multiplicity $r > 1$ (if there are any) spans an eigenspace of dimension $d = r$.

ASSERTION:
Then it is possible to find a basis of eigenvectors.

PROOF:

Because of the prerequisite, it is possible to find a set of $n$ eigenvectors. What remains to be shown is that these $n$ eigenvectors constitute a basis. According to Exercise [13.28], this is tantamount to the linear independence of the $n$ eigenvectors.
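In numbers, Exercise [13.28]'s criterion can be checked directly: $n$ vectors form a basis of an $n$-dimensional space exactly when the matrix built from them has full rank. A quick numpy sketch (the concrete matrix is my own toy example):

```python
import numpy as np

# A symmetric 3x3 map T, chosen only for illustration.
T = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
eigvals, eigvecs = np.linalg.eigh(T)  # columns of eigvecs are eigenvectors

# Exercise [13.28]'s criterion: n vectors are a basis of an n-dimensional
# space exactly when they are linearly independent, i.e. when the matrix
# having them as columns has full rank.
rank = np.linalg.matrix_rank(eigvecs)
print(rank == T.shape[0])  # full rank -> the eigenvectors form a basis
```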

To show this, a particular numbering of the eigenvectors is first introduced, in which each of the $m$ ($m \le n$) distinct eigenvalues $\lambda_j$ is associated with the eigenvectors $e_i$, $i \in I_j$. The index sets $I_j$ shall cover the whole index range from 1 to $n$.

The linear independence of the eigenvectors is now proven indirectly by assuming the opposite:

ASSUMPTION: There are coefficients $a_1, \dots, a_n$ which do not all vanish such that:

$0 = \sum_{i=1}^{n} a_i e_i = \sum_{j=1}^{m} v_j$ ............ (1)

The last line uses the definition $v_j := \sum_{i \in I_j} a_i e_i$. Note that the non-vanishing $v_j$ are eigenvectors with $T v_j = \lambda_j v_j$.

Applying $T$ to eq.(1) yields:

$0 = \sum_{j=1}^{m} \lambda_j v_j$ ............ (2)

It can be assumed without loss of generality that all eigenvalues besides possibly $\lambda_1$ are nonzero (renumber the eigenvalues if another one should be zero). Eq.(2) can then be resolved for $v_m$:

$v_m = -\sum_{j=1}^{m-1} \frac{\lambda_j}{\lambda_m} v_j$ ............ (3)

With eq.(3), it is possible to eliminate vector $v_m$ from eq.(1), yielding:

$0 = \sum_{j=1}^{m-1} \left(1 - \frac{\lambda_j}{\lambda_m}\right) v_j = \sum_{j=1}^{m-1} c_j v_j$ ............ (1')

The last line uses the definition $c_j := 1 - \lambda_j/\lambda_m$. Note that all $c_j$ are nonzero (because all eigenvalues are different from each other) and that the non-vanishing $v_j$ are eigenvectors with $T v_j = \lambda_j v_j$.

The above reasoning starting at eq.(1) can now be repeated with eq.(1') to eliminate vector $v_{m-1}$, and so on, yielding the following sequence of equations:

$0 = \sum_{j=1}^{m-2} c_j^{(2)} v_j$ ............ (1'')
...
$0 = c_1^{(m-2)} v_1 + c_2^{(m-2)} v_2$ ............ (1''')
$0 = c_1^{(m-1)} v_1$ ............ (1'''')

As all coefficients $c_j^{(k)}$ are nonzero, it follows successively
that $v_1 = 0$ (from eq.(1'''')),
then that $v_2 = 0$ (the above and eq.(1''')),
etc. until $v_m = 0$.
For short, all $v_j = 0$.
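The coefficient bookkeeping behind this elimination chain can be mimicked numerically. A small sketch (grouping notation and sample eigenvalues are mine): start with coefficient 1 on each grouped eigenvector term of eq.(1); eliminating the last group via eqs.(2)/(3) multiplies the remaining $j$-th coefficient by $1 - \lambda_j/\lambda_m$, and for pairwise distinct eigenvalues no factor ever vanishes:

```python
import numpy as np

# Pairwise distinct eigenvalues; the first one is allowed to be zero,
# the rest must be nonzero (sample values of mine).
lams = np.array([0.0, 1.0, 2.5, -3.0])

# Coefficients of the grouped eigenvector terms in eq.(1): all equal to 1.
coeffs = np.ones(len(lams))

# One elimination step: solve the T-image of the current equation for the
# last group and substitute back, which multiplies the coefficient of
# group j by (1 - lambda_j / lambda_m).
while len(coeffs) > 1:
    lam_m = lams[len(coeffs) - 1]
    coeffs = coeffs[:-1] * (1.0 - lams[:len(coeffs) - 1] / lam_m)

# Distinct eigenvalues -> no factor vanished, so the surviving coefficient
# is nonzero and forces the first grouped term to be the zero vector.
print(coeffs)
```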

For all eigenvalues $\lambda_j$ of multiplicity 1 we have $v_j = a_i e_i$ (cf. definition of $v_j$ with $I_j = \{i\}$). The above finding hence implies $a_i = 0$ for these eigenvalues.

Nonzero coefficients $a_i$ can hence only appear in $v_j$'s belonging to eigenvalues of multiplicity $r > 1$. However, if $v_j = \sum_{i \in I_j} a_i e_i$ were such a vector, this and the above finding that $v_j = 0$ are in contradiction to the prerequisite that all corresponding eigenvectors $e_i$, $i \in I_j$, are linearly independent.

Thus we have the desired contradiction which proves that the ASSUMPTION was wrong.
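As a sanity check of the ASSERTION itself, one can let numpy hunt for an eigenvector basis of a map that has a multiple eigenvalue with a full-dimensional eigenspace (the concrete matrices are my own toy example):

```python
import numpy as np

# Eigenvalue 2 with multiplicity r = 2, eigenvalue 5 with multiplicity 1.
D = np.diag([2.0, 2.0, 5.0])
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # an invertible change of basis
A = P @ D @ np.linalg.inv(P)      # the prerequisite holds: the
                                  # eigenvalue-2 eigenspace has d = r = 2

eigvals, eigvecs = np.linalg.eig(A)

# n = 3 linearly independent eigenvectors -> a basis (Exercise [13.28]).
rank = np.linalg.matrix_rank(eigvecs)
print(sorted(np.round(eigvals.real, 6)), rank)
```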

18 May 2010, 15:51

Joined: 12 Jul 2010, 07:44
Posts: 154
Re: Exercise [13.31]
Here's an alternative proof:

Construct any set of n eigenvectors of T such that one eigenvector is chosen for each of the n eigenvalues (counting multiple eigenvalues more than once), ensuring that the subset of eigenvectors chosen for each multiple eigenvalue is linearly independent. This is always possible given the assumption about T stated in the text.

Now suppose $\sum_{i=1}^{n} a_i v_i = 0$ ............ (1)

If there are any $a_i$ which are non-zero in this equation, we can re-write it including only the non-zero terms (and re-numbering), to get:

$\sum_{i=1}^{k} a_i v_i = 0$

where no $a_i$ is zero, and where $1 \le k \le n$.

Then pre-multiplying both sides by $T$ $n$ times (for any $n \in \{0, 1, 2, \dots\}$) yields:

$\sum_{i=1}^{k} a_i \lambda_i^n v_i = 0$

Without loss of generality, assume the numbering of the $\lambda_i$'s and $v_i$'s is chosen so that $|\lambda_1| \ge |\lambda_i|$ for all $i > 1$. (Note that $\lambda_1 \ne 0$: if it were, every $\lambda_i$ would vanish, and eq.(1) would already contradict the linear independence of the eigenvectors chosen for the multiple eigenvalue 0.) Then dividing by $\lambda_1^n$ yields:

$a_1 v_1 + \sum_{i=2}^{k} a_i \left(\frac{\lambda_i}{\lambda_1}\right)^n v_i = 0$

Now as this is true for all $n$ in $\{0, 1, 2, \dots\}$, it is true in the limit as $n$ approaches infinity.
(If there's a problem with the validity of this step, please let me know!)

But in this limit, all terms $j$ for which $|\lambda_j/\lambda_1| < 1$ vanish, leaving only the terms $k$ for which $|\lambda_k/\lambda_1| = 1$ ($k > 1$).
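The decay of the subdominant terms in this limiting step can be watched numerically; a sketch with made-up eigenvalues and coefficients (all values mine):

```python
import numpy as np

lam1, lam2 = 3.0, 1.5            # |lambda_2| < |lambda_1|, sample values
a1, a2 = 2.0, -4.0               # nonzero coefficients, as in the proof
v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0])        # two linearly independent eigenvectors

# After applying T n times and dividing by lambda_1**n, the sum reads
#   a1*v1 + a2*(lam2/lam1)**n * v2,
# and the second term dies off geometrically:
for n in (0, 10, 50):
    print(n, a1 * v1 + a2 * (lam2 / lam1) ** n * v2)

# At n = 50 the ratio term is ~2**-50, so only a1*v1 survives:
s50 = a1 * v1 + a2 * (lam2 / lam1) ** 50 * v2
assert np.allclose(s50, a1 * v1)
```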

There are then three cases to consider.

Case 1: There are no such $\lambda_k$'s. Then we are left with $a_1 v_1 = 0$. Since $a_1 \ne 0$, this is a contradiction because eigenvectors cannot be zero.

Case 2: There are only $\lambda_k$'s equal to $\lambda_1$. Then the $(\lambda_k/\lambda_1)^n$ are all equal to 1, and the resulting equation is:

$a_1 v_1 + \sum_{k} a_k v_k = 0$

where all the $v_k$'s are eigenvectors of the (multiple) eigenvalue $\lambda_1$. This is a contradiction because we assumed we picked our eigenvectors so that the subset of eigenvectors chosen for any multiple eigenvalue would be linearly independent.

Case 3: There is at least one term $r$ for which $\lambda_r = -\lambda_1$. For these terms, $(\lambda_r/\lambda_1)^n$ is 1 when $n$ is even and $-1$ when $n$ is odd. Considering the limit for only odd $n$, and also for only even $n$, we have:

$a_1 v_1 + \sum_{k} a_k v_k + \sum_{r} a_r v_r = 0$ (even $n$)
$a_1 v_1 + \sum_{k} a_k v_k - \sum_{r} a_r v_r = 0$ (odd $n$)

But clearly these two equations can only both hold true when the term $\sum_{r} a_r v_r$ equates to zero, and so after crossing it out, this case reduces to either case 1 or case 2; or else we again have a contradiction.
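Case 3's odd/even trick can also be seen in a toy computation (vectors and coefficients of my own choosing): subtracting the odd-$n$ limit equation from the even-$n$ one isolates the $-\lambda_1$ block.

```python
import numpy as np

a1, ak, ar = 1.0, 0.5, -0.75           # nonzero coefficients, sample values
v1 = np.array([1.0, 0.0, 0.0])         # eigenvector of  lambda_1
vk = np.array([0.0, 1.0, 0.0])         # eigenvector of  lambda_k = +lambda_1
vr = np.array([0.0, 0.0, 1.0])         # eigenvector of  lambda_r = -lambda_1

def limit_sum(n):
    # Surviving terms of the limit: (lambda_k/lambda_1)**n = 1 while
    # (lambda_r/lambda_1)**n = (-1)**n.
    return a1 * v1 + ak * vk + ar * (-1) ** n * vr

even, odd = limit_sum(0), limit_sum(1)

# Half the difference of the even-n and odd-n equations kills the
# +lambda_1 terms and leaves exactly the -lambda_1 block:
diff = (even - odd) / 2.0
print(diff)
```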

Thus in each case there is a contradiction.

Therefore, there can be no $a_i$'s which are non-zero in equation (1).
Therefore, our chosen $v_i$ are linearly independent, and are thus a basis for the $n$-dimensional vector space $V$ on which $T$ operates.

QED

31 Jul 2010, 17:54