Spectral Theorem

by ayushkhaitan3437

This post is on the Spectral Theorem. This is something that I should have been clear on a long time ago, but for reasons unknown to me, I was not. I hope to be able to rectify that now. The proof was discussed today in class. I am only recording my thoughts on it.

The spectral theorem states that a self-adjoint operator on an n-dimensional complex inner product space has n mutually orthogonal eigenvectors, and all of its eigenvalues are real.

Let V be the n-dimensional complex vector space under consideration. A self-adjoint operator T is one that satisfies \langle Ta,b\rangle=\langle a,Tb\rangle for all a,b\in V. If the inner product is the conventional one in this setting, namely the sesquilinear product \langle a,b\rangle=\sum_i a_i\overline{b_i}, then the matrix of T (with respect to the standard basis) has to be Hermitian. For concreteness, we're going to assume this inner product for the rest of the proof.
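As a quick sanity check on this definition, here is a small numpy sketch (the Hermitian matrix and the vectors a, b below are made up purely for illustration); it verifies that \langle Ta,b\rangle=\langle a,Tb\rangle, with the inner product taken conjugate-linear in the second slot, as above.

```python
import numpy as np

rng = np.random.default_rng(0)

def inner(x, y):
    # sesquilinear product, conjugate-linear in the second argument: <x, y> = sum_i x_i * conj(y_i)
    return np.vdot(y, x)

n = 4
# a random Hermitian matrix: T = (M + M^H) / 2
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
T = (M + M.conj().T) / 2

a = rng.standard_normal(n) + 1j * rng.standard_normal(n)
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# self-adjointness: <Ta, b> should equal <a, Tb>
print(np.isclose(inner(T @ a, b), inner(a, T @ b)))  # True
```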

As we're working over \Bbb{C}, the characteristic polynomial of T has at least one root, so T has at least one eigenvalue and consequently an eigenvector. Let Tv=\lambda v, where \lambda is the eigenvalue and v\neq 0 is the eigenvector. On one hand, \langle Tv,v\rangle=\langle \lambda v,v\rangle=\lambda\langle v,v\rangle; on the other hand, by self-adjointness, \langle Tv,v\rangle=\langle v,Tv\rangle=\langle v,\lambda v\rangle=\overline{\lambda}\langle v,v\rangle. We know that \langle v,v\rangle\in\Bbb{R} (it is in fact greater than 0, since v\neq 0). Hence \lambda=\overline{\lambda}, which shows that \lambda is real.
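Numerically, the same fact is easy to observe. In the sketch below (a made-up random Hermitian matrix, not anything from the proof itself), the eigenvalues are computed with the general, non-Hermitian solver, and their imaginary parts come out at the level of floating-point noise.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 5
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
T = (M + M.conj().T) / 2              # Hermitian

# use the general eigenvalue solver, which does not assume Hermitian input
eigvals = np.linalg.eigvals(T)
print(np.max(np.abs(eigvals.imag)))   # ~1e-15: the eigenvalues are real up to round-off
```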

How do we construct the basis of orthogonal eigenvectors, though? We start with one eigenvector v. Now consider the orthogonal complement of v, and call it A. We claim that T(A)\subset A. This is because for a\in A, \langle Ta,v\rangle=\langle a,Tv\rangle=\langle a,\lambda v\rangle=\overline{\lambda}\langle a,v\rangle=\lambda\langle a,v\rangle=0 (remember that \lambda=\overline{\lambda}). Hence, if we write T in terms of a new basis consisting of v together with a basis of A, then the first row and first column will be all 0's, except for the top-left position, which will contain \lambda.
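This block structure can be seen concretely. The sketch below (again with a made-up Hermitian matrix) takes one unit eigenvector v, completes it to an orthonormal basis via QR, and writes T in that basis; the first row and column vanish except for the eigenvalue in the top-left entry.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 4
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
T = (M + M.conj().T) / 2                       # Hermitian

lam, vecs = np.linalg.eigh(T)
v = vecs[:, 0]                                 # one unit eigenvector of T

# complete v to an orthonormal basis: QR of [v | random vectors]
B = np.column_stack([v, rng.standard_normal((n, n - 1)) + 1j * rng.standard_normal((n, n - 1))])
Q, _ = np.linalg.qr(B)                         # Q[:, 1:] spans the orthogonal complement of v

T_new = Q.conj().T @ T @ Q                     # T written in the new basis
print(np.round(T_new, 6))                      # first row/column ~0 except the (0,0) entry, which is lam[0]
```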

Now the action of T on the orthogonal complement A is captured by the (n-1)\times(n-1) matrix obtained from T by deleting the first row and first column, and this smaller matrix is again Hermitian. Its characteristic polynomial again has at least one root over \Bbb{C}, which ensures that we have an eigenvalue (and hence an eigenvector) to work with. Since each new eigenvector lies in the orthogonal complement of all the previous ones, this iterative process, which ends after n iterations, generates n mutually orthogonal eigenvectors.
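To see the whole induction at once, here is a minimal recursive sketch in numpy (the function name spectral_decompose and the random test matrix are my own, for illustration only): at each step it picks one eigenpair, restricts T to the orthogonal complement of that eigenvector, and recurses, producing n real eigenvalues and n orthonormal eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(3)

def spectral_decompose(T):
    """Peel off one eigenpair of a Hermitian T at a time, mirroring the proof's
    induction on the orthogonal complement (a sketch, not a production routine)."""
    n = T.shape[0]
    w, vecs = np.linalg.eig(T)                  # an eigenpair exists over C
    lam = w[0].real                             # real, since T is Hermitian
    v = vecs[:, 0] / np.linalg.norm(vecs[:, 0])
    if n == 1:
        return np.array([lam]), v.reshape(1, 1)

    # orthonormal basis whose first column is (a unit multiple of) v;
    # the remaining columns span the orthogonal complement of v
    Q, _ = np.linalg.qr(np.column_stack(
        [v, rng.standard_normal((n, n - 1)) + 1j * rng.standard_normal((n, n - 1))]))
    A = Q[:, 1:]
    T_small = A.conj().T @ T @ A                # T restricted to the complement, still Hermitian

    lams, vecs_small = spectral_decompose(T_small)
    # lift the eigenvectors of the restriction back to the original space
    return np.concatenate(([lam], lams)), np.column_stack([v, A @ vecs_small])

# quick check on a random 4x4 Hermitian matrix
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
T = (M + M.conj().T) / 2
lams, V = spectral_decompose(T)
print(np.allclose(T @ V, V * lams))             # each column is an eigenvector
print(np.allclose(V.conj().T @ V, np.eye(4)))   # the eigenvectors are orthonormal
```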
