Theorem 5.5
Let \(T\) be a linear operator on a vector space, and let \(\lambda_1,...,\lambda_k\) be distinct eigenvalues of \(T\). For each \(i=1,2,...,k\), let \(S_i\) be a finite set of eigenvectors of \(T\) corresponding to \(\lambda_i\). If each \(S_i\) is linearly independent, then \(S_1 \cup ... \cup S_k\) is linearly independent.
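
Before the proof, here is a quick numerical sanity check of the statement (my own illustration, not from the text), using a hypothetical \(3 \times 3\) matrix with three distinct eigenvalues:

```python
import numpy as np

# A hypothetical 3x3 matrix with distinct eigenvalues 1, 2, 3
# (upper triangular, so the eigenvalues sit on the diagonal).
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)  # distinct: 1, 2, 3

# The columns of eigvecs are eigenvectors for distinct eigenvalues;
# the theorem predicts they are linearly independent, i.e. the
# matrix of eigenvectors has full rank.
print(np.linalg.matrix_rank(eigvecs))  # 3
```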


Proof:

By induction on \(k\).

Base Case: If \(k = 1\), then \(S_1\) is linearly independent and we are done.

Inductive Case: Suppose the theorem holds for \(k - 1\) distinct eigenvalues, and suppose now we have \(k\) distinct eigenvalues \(\lambda_1,...,\lambda_k\). For each \(i=1,2,...,k\), let \(S_i = \{v_{i1}, v_{i2}, ..., v_{in_i}\}\) be a linearly independent set of eigenvectors corresponding to \(\lambda_i\). We will show that \(S = S_1 \cup ... \cup S_k\) is linearly independent.

Consider any scalars \(\{a_{ij}\}\) where \(i = 1,2,...,k\) and \(j = 1,2,...,n_i\) such that

$$ \begin{align} \sum_{i=1}^k \sum_{j=1}^{n_i} a_{ij}v_{ij} = 0 \end{align} $$

We want to show that the only solution to the above equation is the trivial one, i.e., that every \(a_{ij} = 0\). Apply \(T - \lambda_k I\) to both sides of the equation to see that

$$ \begin{align*} (T-\lambda_k I)\left(\sum_{i=1}^k \sum_{j=1}^{n_i} a_{ij}v_{ij}\right) &= (T - \lambda_k I)\,0 \\ \sum_{i=1}^k \sum_{j=1}^{n_i} (T-\lambda_k I)(a_{ij}v_{ij}) &= 0 \quad \text{ (by the linearity of $T - \lambda_k I$)}\\ \sum_{i=1}^k \sum_{j=1}^{n_i} \left( a_{ij} T(v_{ij}) - \lambda_k a_{ij} v_{ij} \right) &= 0 \\ \sum_{i=1}^k \sum_{j=1}^{n_i} \left( \lambda_i a_{ij}v_{ij} - \lambda_k a_{ij} v_{ij} \right) &= 0 \quad \text{ ($v_{ij}$ is an eigenvector for $\lambda_i$)}\\ \sum_{i=1}^k \sum_{j=1}^{n_i} a_{ij} (\lambda_i - \lambda_k) v_{ij} &= 0\\ \sum_{j=1}^{n_k} a_{kj} (\lambda_k - \lambda_k) v_{kj} + \sum_{i=1}^{k-1} \sum_{j=1}^{n_i} a_{ij} (\lambda_i - \lambda_k) v_{ij} &= 0\\ \sum_{i=1}^{k-1} \sum_{j=1}^{n_i} a_{ij} (\lambda_i - \lambda_k) v_{ij} &= 0 \end{align*} $$

But now we have reduced the problem to the \(k-1\) eigenvalues \(\lambda_1,...,\lambda_{k-1}\) and their corresponding eigenvectors, and \(S_1 \cup...\cup S_{k-1}\) is linearly independent by the induction hypothesis. This means that every scalar \(a_{ij}(\lambda_i-\lambda_k)\) in the above sum must be 0. But we're also given that \(\lambda_1,...,\lambda_k\) are distinct, so \(\lambda_i - \lambda_k \neq 0\) for every \(i=1,...,k-1\). Therefore \(a_{ij} = 0\) for all \(i=1,...,k-1\) and \(j=1,...,n_i\). Returning to the original equation, we have

$$ \begin{align} \sum_{i=1}^k \sum_{j=1}^{n_i} a_{ij}v_{ij} &= 0 \end{align} $$

Since the scalars \(a_{ij}\) are 0 for all \(i = 1,...,{k-1}\), the equation reduces to \(\sum_{j=1}^{n_k} a_{kj}v_{kj} = 0\). We also know that \(S_k\) is linearly independent, so the \(a_{kj}\) are all 0. Hence every \(a_{ij} = 0\), and \(S\) is linearly independent, as we wanted to show. \(\blacksquare\)
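
As a sanity check on the key step (applying \(T - \lambda_k I\) annihilates the \(\lambda_k\)-components and merely rescales the rest), here is a small numerical sketch, assuming a simple diagonal operator; this is my own illustration, not part of the proof:

```python
import numpy as np

# Hypothetical diagonal operator T with distinct eigenvalues 1, 2, 3;
# the standard basis vectors e1, e2, e3 are its eigenvectors.
T = np.diag([1.0, 2.0, 3.0])
lam_k = 3.0  # plays the role of lambda_k

# A linear combination of eigenvectors: v = e1 + e2 + e3.
v = np.array([1.0, 1.0, 1.0])

# T - lambda_k * I kills the lambda_k component and scales the e_i
# component by (lambda_i - lambda_k), nonzero since the eigenvalues
# are distinct.
w = (T - lam_k * np.eye(3)) @ v
print(w)  # [-2. -1.  0.]
```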

I really don’t like this proof!



Theorem 5.5 Corollary
Let \(T\) be a linear operator on an \(n\)-dimensional vector space \(V\). If \(T\) has \(n\) distinct eigenvalues, then \(T\) is diagonalizable.


Proof:

Suppose that \(T\) has distinct eigenvalues \(\lambda_1, ..., \lambda_n\). For each \(i\), choose an eigenvector \(v_i\) corresponding to \(\lambda_i\). By Theorem 5.5, the set \(\{v_1,...,v_n\}\) must be linearly independent, and since \(\dim(V)=n\), this set of \(n\) linearly independent vectors must be a basis for \(V\). Since we have a basis of eigenvectors, by Theorem 5.1, \(T\) must be diagonalizable. \(\blacksquare\)
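
Here is a hedged NumPy sketch of the corollary in action (my own example; the matrix below is hypothetical): a \(3 \times 3\) matrix with three distinct eigenvalues is diagonalized by its basis of eigenvectors.

```python
import numpy as np

# Hypothetical 3x3 matrix with 3 distinct eigenvalues (2, 3, 5 on the
# diagonal); by the corollary it must be diagonalizable.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

eigvals, P = np.linalg.eig(A)  # columns of P are eigenvectors

# The eigenvalues are distinct, so the eigenvectors form a basis:
# P is invertible and P^{-1} A P is diagonal.
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))  # diag(2, 3, 5) up to floating-point error
```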


