Theorem 6.5
Let \(V\) be a non-zero finite-dimensional inner product space. Then \(V\) has an orthonormal basis \(\beta\). Furthermore, if \(\beta = \{v_1, v_2, ..., v_n\}\) and \(x \in V\), then $$ \begin{align*} x = \sum_{i=1}^n \langle x, v_i \rangle v_i \end{align*} $$
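For example, with the standard inner product on \(\mathbb{R}^2\), the set \(\beta = \{v_1, v_2\}\) with \(v_1 = \tfrac{1}{\sqrt{2}}(1,1)\) and \(v_2 = \tfrac{1}{\sqrt{2}}(1,-1)\) is an orthonormal basis, and for \(x = (3,1)\) the formula gives $$ \begin{align*} \langle x, v_1 \rangle v_1 + \langle x, v_2 \rangle v_2 = \tfrac{4}{\sqrt{2}}\, v_1 + \tfrac{2}{\sqrt{2}}\, v_2 = (2,2) + (1,-1) = (3,1) = x. \end{align*} $$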


Proof:

Let \(\beta_0\) be a basis for \(V\); one exists because \(V\) is non-zero and finite-dimensional. Applying the Gram–Schmidt process to \(\beta_0\) yields an orthogonal set \(\beta'\) of non-zero vectors with \(\text{span}(\beta')=\text{span}(\beta_0) = V\). Normalizing each vector of \(\beta'\) then gives an orthonormal set \(\beta\) that generates \(V\). By Corollary 2 of Theorem 6.3 (if \(S\) is an orthogonal set of non-zero vectors, then \(S\) is linearly independent), \(\beta\) is linearly independent. Therefore \(\beta\) is an orthonormal basis for \(V\). Finally, since every \(x \in V\) lies in \(\text{span}(\beta)\), the expansion formula follows from Corollary 1 of Theorem 6.3.\(\ \blacksquare\)
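
The construction in the proof is easy to carry out numerically. Below is a minimal sketch (using NumPy, which is an assumption of the sketch rather than anything in the text) that applies Gram–Schmidt with normalization to an arbitrary basis of \(\mathbb{R}^3\) and then checks the expansion formula of Theorem 6.5.

```python
import numpy as np

def gram_schmidt(basis):
    """Orthogonalize a list of linearly independent vectors (Gram-Schmidt),
    normalizing each one, so the result is an orthonormal set with the same span."""
    orthonormal = []
    for v in basis:
        w = v.astype(float)
        # Subtract the projection of v onto each previously computed vector.
        for u in orthonormal:
            w = w - np.dot(w, u) * u
        orthonormal.append(w / np.linalg.norm(w))
    return orthonormal

# An arbitrary (non-orthogonal) basis beta_0 of R^3.
beta0 = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 1.0])]
beta = gram_schmidt(beta0)

# Verify the expansion x = sum_i <x, v_i> v_i from Theorem 6.5.
x = np.array([3.0, -1.0, 2.0])
reconstruction = sum(np.dot(x, v) * v for v in beta)
print(np.allclose(x, reconstruction))  # True
```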

Theorem 6.5 (Corollary)
Let \(V\) be a finite-dimensional inner product space with an orthonormal basis \(\beta = \{v_1,v_2,...,v_n\}\). Let \(T\) be a linear operator on \(V\) and let \(A = [T]_{\beta}\). Then for any \(i\) and \(j\), \(A_{ij}= \langle T(v_j), v_i \rangle\).


Proof: From Theorem 6.5, we have

$$ \begin{align*} T(v_j) = \sum_{i=1}^n \langle T(v_j), v_i \rangle v_i \end{align*} $$

Since the \(j\)-th column of \(A = [T]_{\beta}\) is the coordinate vector of \(T(v_j)\) relative to \(\beta\), the coefficient of \(v_i\) in this expansion is exactly \(A_{ij}\). Therefore, \(A_{ij} = \langle T(v_j), v_i \rangle\). \(\ \blacksquare\)
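
As a quick numerical check of the corollary (a sketch only; NumPy and the particular operator \(T\) below are chosen purely for illustration), the entries \(\langle T(v_j), v_i \rangle\) computed against an orthonormal basis of \(\mathbb{R}^2\) agree with \([T]_{\beta}\) obtained by the usual change of coordinates.

```python
import numpy as np

# T is the operator x -> M x on R^2 (M is its matrix in the standard basis).
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
T = lambda x: M @ x

# An orthonormal basis of R^2 (the standard basis rotated by 45 degrees).
v1 = np.array([1.0, 1.0]) / np.sqrt(2)
v2 = np.array([1.0, -1.0]) / np.sqrt(2)
beta = [v1, v2]

# Entries of A = [T]_beta via the corollary: A_ij = <T(v_j), v_i>.
A = np.array([[np.dot(T(vj), vi) for vj in beta] for vi in beta])

# Cross-check against the change-of-coordinates formula [T]_beta = P^{-1} M P,
# where the columns of P are the vectors of beta.
P = np.column_stack(beta)
print(np.allclose(A, np.linalg.inv(P) @ M @ P))  # True
```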
