Section 5.2: Theorem 5.5
Theorem 5.5: Let \(T\) be a linear operator on a vector space \(V\), and let \(\lambda_1, ..., \lambda_k\) be distinct eigenvalues of \(T\). For each \(i\), let \(S_i\) be a finite linearly independent set of eigenvectors of \(T\) corresponding to \(\lambda_i\). Then \(S_1 \cup ... \cup S_k\) is linearly independent.
Proof:
By induction on \(k\).
Base Case: If \(k = 1\), then \(S_1\) is linearly independent and we are done.
Inductive Case: Suppose the statement holds for \(k - 1\) distinct eigenvalues, and suppose now we have \(k\) distinct eigenvalues \(\lambda_1,...,\lambda_k\). For each \(i=1,2,...,k\), let \(S_i = \{v_{i1}, v_{i2}, ..., v_{in_i}\}\) be a linearly independent set of eigenvectors corresponding to \(\lambda_i\) (the sets may have different sizes \(n_i\)). We will show that \(S = S_1 \cup ... \cup S_k\) is linearly independent.
Consider any scalars \(\{a_{ij}\}\), where \(i = 1,2,...,k\) and \(j = 1,2,...,n_i\), such that
\[\sum_{i=1}^{k} \sum_{j=1}^{n_i} a_{ij} v_{ij} = 0.\]
We want to show that the only solution to the above equation is the trivial one. Apply \(T - \lambda_k I\) to both sides. Since \((T - \lambda_k I)v_{ij} = (\lambda_i - \lambda_k)v_{ij}\) for each eigenvector \(v_{ij}\), the terms with \(i = k\) vanish and we see that
\[\sum_{i=1}^{k-1} \sum_{j=1}^{n_i} a_{ij}(\lambda_i - \lambda_k) v_{ij} = 0.\]
But now we have reduced the problem to \(k-1\) eigenvalues and their corresponding eigenvectors, and \(S_1 \cup...\cup S_{k-1}\) is linearly independent by the induction hypothesis. This means that the scalars \(a_{ij}(\lambda_i-\lambda_k)\) in the sum above must all be 0. But \(\lambda_1, ..., \lambda_k\) are distinct, so \(\lambda_i - \lambda_k \neq 0\) for every \(i=1,...,k-1\). Therefore \(a_{ij} = 0\) for all \(i=1,...,k-1\) and \(j=1,...,n_i\). Substituting back into the original equation, we are left with
\[\sum_{j=1}^{n_k} a_{kj} v_{kj} = 0.\]
We also know that \(S_k\) is linearly independent, so the \(a_{kj}\) are all 0 as well. Hence every \(a_{ij}\) is 0, and \(S\) is linearly independent, as we wanted to show. \(\blacksquare\)
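To see the mechanism on the smallest interesting case, here is a worked instance with \(k = 2\) (my own illustration, not from the text). Take
\[T = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}, \qquad \lambda_1 = 1,\ v_{11} = e_1, \qquad \lambda_2 = 2,\ v_{21} = e_2.\]
If \(a_{11}e_1 + a_{21}e_2 = 0\), applying \(T - 2I\) kills the \(\lambda_2\) term and leaves
\[(1 - 2)a_{11}e_1 = -a_{11}e_1 = 0,\]
so \(a_{11} = 0\); the original equation then reduces to \(a_{21}e_2 = 0\), forcing \(a_{21} = 0\), exactly as in the proof.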
I really don’t like this proof!
Corollary: If \(T\) has \(n\) distinct eigenvalues, where \(n = \dim(V)\), then \(T\) is diagonalizable.
Proof:
Suppose that \(T\) has distinct eigenvalues \(\lambda_1, ..., \lambda_n\). For each \(i\), choose an eigenvector \(v_i\) corresponding to \(\lambda_i\). By theorem 5.5, the set \(\{v_1,...,v_n\}\) is linearly independent, and since \(\dim(V)=n\), it must be a basis for \(V\). Since we have a basis of eigenvectors, theorem 5.1 tells us that \(T\) is diagonalizable. \(\blacksquare\)
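For a concrete instance of the corollary (again my own example), take
\[A = \begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix}\]
acting on \(\mathbb{R}^2\). Its eigenvalues 1 and 2 are distinct and \(\dim(\mathbb{R}^2) = 2\), so the corollary applies: the eigenvectors \((1,0)^T\) and \((1,1)^T\) form a basis, and with \(Q = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}\) we get
\[Q^{-1} A Q = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}.\]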