Lecture 15: Invertible Linear Transformations and Isomorphisms
Identity matrix and Kronecker delta
\(I_n\) is the \(n \times n\) identity matrix. Its entries are given by the rule
\[(I_n)_{ij} = \delta_{ij} = \begin{cases} 1 & i = j \\ 0 & i \neq j \end{cases}\]
where \(\delta_{ij}\) is the Kronecker delta. For any matrix \(A \in M_{m \times n}\), \(AI_n = A\) and \(I_mA = A\).
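As a quick numerical sketch (the matrix `A` below is just an illustrative example), we can build \(I_n\) entry-by-entry from the Kronecker delta rule and check that it acts as the identity under multiplication:

```python
import numpy as np

# Build I_n entry-by-entry from the Kronecker delta rule:
# (I_n)_{ij} = 1 if i == j, else 0.
def identity(n):
    return np.array([[1 if i == j else 0 for j in range(n)]
                     for i in range(n)], dtype=float)

I3 = identity(3)
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])          # an example 2x3 matrix

print(np.array_equal(A @ I3, A))          # A I_3 = A
print(np.array_equal(identity(2) @ A, A)) # I_2 A = A
```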
Exercise: Suppose we have a finite basis \(\beta\) for \(V\) with \(n\) vectors. The identity map is \(I_V: V \rightarrow V\), \(I_V(v) = v\) for all \(v \in V\). If we compute its matrix representative, we will see that \([I_V]_{\beta}^{\beta} = I_n\).
Inverse Linear Transformation
Next we define the inverse of a linear transformation. A linear map \(T: V \rightarrow W\) is invertible if there exists a map \(T^{-1}: W \rightarrow V\) such that \(T^{-1} \circ T = I_V\) and \(T \circ T^{-1} = I_W\).
Example 1
has an inverse
We can check that this is true by composing these transformations and checking that we get the identity map.
Example 2
Suppose we have the following vector spaces: \(P\), the space of all polynomials, and \(\hat{P}\), the set of all polynomials without the constant term \(a_0\). We claim that \(\hat{P}\) is a vector space; the easiest way to verify this is to show that \(\hat{P}\) is a subspace of \(P\). Now consider the linear map \(T: P \rightarrow \hat{P}\) that multiplies a polynomial by \(x\):
\[T(p(x)) = x\,p(x).\]
This map has the inverse linear transformation \(T^{-1}: \hat{P} \rightarrow P\) given by \(T^{-1}(q(x)) = q(x)/x\), which is a polynomial since \(q\) has no constant term.
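A minimal sketch of this pair of maps, representing a polynomial as its coefficient list \([a_0, a_1, ..., a_n]\) (the names `T` and `T_inv` are just illustrative):

```python
# Polynomials as coefficient lists [a_0, a_1, ..., a_n].
# Multiplying by x shifts every coefficient up one slot;
# the inverse drops the (zero) constant term.
def T(p):
    return [0] + list(p)    # x * (a_0 + a_1 x + ...) = a_0 x + a_1 x^2 + ...

def T_inv(q):
    assert q[0] == 0, "q must have no constant term (q is in P_hat)"
    return list(q[1:])

p = [3, 1, 4]               # 3 + x + 4x^2
print(T(p))                 # [0, 3, 1, 4], i.e. 3x + x^2 + 4x^3
print(T_inv(T(p)) == p)     # True: the composition is the identity on P
```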
Example 3
The differentiation map \(T: P \rightarrow P\), \(T(p(x)) = p'(x)\),
has no inverse! (It is onto but not 1-1.) It is not one-to-one since the derivative of any constant function is \(0\).
This leads us to the following theorem:
- \(T\) has an inverse if and only if \(T\) is 1-1 (\(N(T)=\{\bar{0}_V\}\)) and onto (\(R(T) = W\)).
- If \(T\) is invertible, then its inverse is unique (denoted \(T^{-1}\)).
- \(T^{-1}\) is linear. (Theorem 2.17 in the book)
The third property is not obvious and requires a proof.
Proof: Let \(T: V \rightarrow W\) be a linear map with inverse \(T^{-1}: W \rightarrow V\). We want to show that \(T^{-1}\) is linear. To do this, we need to show that for all \(w_1, w_2 \in W\) and scalars \(a, b\),
\[T^{-1}(aw_1 + bw_2) = aT^{-1}(w_1) + bT^{-1}(w_2).\]
Because \(T\) has an inverse, \(T\) is onto. This means that \(w_1 = T(v_1)\) and \(w_2 = T(v_2)\) for some \(v_1, v_2 \in V\). Moreover, since \(T\) is 1-1, these \(v_1, v_2\) are unique, so \(T^{-1}(w_1) = v_1\) and \(T^{-1}(w_2) = v_2\). Now,
\[T^{-1}(aw_1 + bw_2) = T^{-1}(aT(v_1) + bT(v_2)) = T^{-1}(T(av_1 + bv_2)) = av_1 + bv_2 = aT^{-1}(w_1) + bT^{-1}(w_2).\]
Therefore, \(T^{-1}\) is linear. \(\blacksquare\)
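This identity can be sanity-checked numerically for a map on \(\mathbf{R}^2\): take an invertible matrix and verify that its inverse respects linear combinations (the matrix and scalars below are arbitrary illustrative choices).

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])         # an invertible matrix (det = 1)
A_inv = np.linalg.inv(A)

w1 = np.array([1.0, -3.0])
w2 = np.array([4.0,  2.0])
a, b = 3.0, -2.0

# T^{-1}(a w1 + b w2) == a T^{-1}(w1) + b T^{-1}(w2)
lhs = A_inv @ (a * w1 + b * w2)
rhs = a * (A_inv @ w1) + b * (A_inv @ w2)
print(np.allclose(lhs, rhs))       # True
```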
Corollary: If \(T: V \rightarrow W\) is linear and invertible and \(V\) is finite dimensional, then \(\dim(W) = \dim(V)\).
Proof (in the case where \(V\) and \(W\) are finite dimensional spaces):
Let \(T: V \rightarrow W\) be a linear and invertible map and suppose that \(\dim(V)=n\). Choose a basis \(\beta = \{v_1, ..., v_n\}\) for \(V\). Consider the set of images,
\[T(\beta) = \{T(v_1), ..., T(v_n)\}.\]
We need to show that this set is a basis for \(W\): that it spans \(W\), i.e. \(\mathrm{Span}(T(\beta)) = W\), and that it is a linearly independent set. To show that it spans \(W\), we need, for any \(w \in W\), scalars \(a_1,...,a_n\) such that
\[w = a_1T(v_1) + ... + a_nT(v_n).\]
But we know that \(w = T(v)\) for some \(v \in V\) since \(T\) is onto. We also know that \(\beta\) is a basis for \(V\), so we can write \(v\) as a linear combination of the vectors in \(\beta\) for some scalars \(a_1,...,a_n\):
\[v = a_1v_1 + ... + a_nv_n.\]
And now, because \(T\) is linear,
\[w = T(v) = T(a_1v_1 + ... + a_nv_n) = a_1T(v_1) + ... + a_nT(v_n),\]
which is what we wanted to show. To show that the vectors in \(T(\beta)\) are linearly independent, we need to show that the only solution to
\[a_1T(v_1) + ... + a_nT(v_n) = \bar{0}_W\]
is the trivial solution, which means that \(a_1=0,...,a_n=0\). We can use the linearity of \(T\) to rewrite the equation as
\[T(a_1v_1 + ... + a_nv_n) = \bar{0}_W.\]
But \(T\) is 1-1 and \(T(\bar{0}_V) = \bar{0}_W\), so this implies that \(a_1v_1 + ... + a_nv_n = \bar{0}_V\). Since \(\beta\) is linearly independent, we must have \(a_1=0,...,a_n=0\), as required. \(\blacksquare\)
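The argument can be illustrated concretely on \(\mathbf{R}^3\): for an invertible matrix \(T\), the images of the standard basis vectors are the columns of \(T\), and a rank check confirms they are linearly independent (the matrix below is an arbitrary invertible example).

```python
import numpy as np

# An invertible map on R^3; its columns are T(e_1), T(e_2), T(e_3).
T = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])    # det(T) = 7, so T is invertible

# The images of the standard basis form a basis of R^3 exactly when
# they are linearly independent, i.e. the stacked columns have rank 3.
images = [T[:, j] for j in range(3)]
print(np.linalg.matrix_rank(np.column_stack(images)))  # 3
```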
(Study Notes:) This is really important. \(T\) being invertible and \(V\) having dimension \(n\) means that \(W\) has dimension \(n\). Also, \(T\) is one-to-one and onto. Later we'll learn that if both \(V\) and \(W\) have the same dimension and \(T\) is one-to-one, then this is sufficient to conclude that \(T\) is invertible (next lecture).
I also went through the proof in the book for this corollary. See This
Isomorphism
Two vector spaces \(V\) and \(W\) are isomorphic if there exists an invertible linear map (an isomorphism) \(T: V \rightarrow W\).
Example 4
\(\mathbf{R}^3\) and \(P_2\) are isomorphic. To see this, we need an invertible linear map from one to the other. The maps
\[T: \mathbf{R}^3 \rightarrow P_2, \quad T(a,b,c) = a + bx + cx^2\]
and
\[S: P_2 \rightarrow \mathbf{R}^3, \quad S(a_0 + a_1x + a_2x^2) = (a_0, a_1, a_2)\]
are both isomorphisms.
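A small sketch of this identification, storing an element of \(P_2\) as its coefficient tuple (the helper names are illustrative): the two directions are mutually inverse re-labelings.

```python
# Identify P_2 with coefficient tuples: a + b x + c x^2  <->  (a, b, c).
def T(v):                  # R^3 -> P_2
    return tuple(v)        # the entries become the coefficients

def T_inv(p):              # P_2 -> R^3
    return tuple(p)        # read the coefficients back off

def poly_eval(p, x):       # evaluate a + b x + c x^2 at a point
    a, b, c = p
    return a + b * x + c * x * x

v = (1.0, -2.0, 5.0)
p = T(v)
print(poly_eval(p, 2.0))         # 1 - 4 + 20 = 17.0
print(T_inv(p) == v)             # True: T and T_inv are mutual inverses
```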
Example 5
The multiplication-by-\(x\) map \(T: P \rightarrow \hat{P}\), \(T(p(x)) = xp(x)\), from Example 2 is an isomorphism, and so \(P\) and \(\hat{P}\) are isomorphic.
Criterion for Isomorphic Finite Dimensional Vector Spaces
Theorem: Let \(V\) and \(W\) be finite dimensional vector spaces. Then \(V\) and \(W\) are isomorphic if and only if \(\dim(V) = \dim(W)\).
Proof:
\(\Rightarrow\): If \(W\) is isomorphic to \(V\), then there exists an isomorphism \(T: V \rightarrow W\). \(T\) is onto and one-to-one because it is invertible. Therefore, \(\dim(W) = \dim(V)\) by the corollary we stated earlier.
\(\Leftarrow\): Suppose that \(\dim V = \dim W\). We want to show that they are isomorphic. This means that there exists some invertible map from one to the other. So let \(\beta = \{v_1,...,v_n\}\) be a basis for \(V\) and \(\alpha = \{w_1, ...,w_n\}\) be a basis for \(W\).
Define the map \(T: W \rightarrow V\) by \([T]_{\alpha}^{\beta} = I_n\). This \(T\) works. Why? This \(T\) satisfies
\[T(w_i) = v_i \quad \text{for } i = 1, ..., n.\]
More generally,
\[T(a_1w_1 + ... + a_nw_n) = a_1v_1 + ... + a_nv_n,\]
so \(T\) is linear, one-to-one, and onto, and hence an isomorphism. \(\blacksquare\)
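A numerical sketch of this construction on \(\mathbf{R}^2\) (the two bases below are arbitrary illustrative choices): the map sends the \(\alpha\)-coordinates of a vector to the same combination of the \(\beta\)-basis vectors, so each \(w_i\) goes to \(v_i\).

```python
import numpy as np

# Illustrative bases, stored as columns: alpha for W = R^2, beta for V = R^2.
alpha = np.array([[1.0, 1.0],
                  [0.0, 1.0]])     # w_1 = (1, 0), w_2 = (1, 1)
beta  = np.array([[2.0, 0.0],
                  [1.0, 1.0]])     # v_1 = (2, 1), v_2 = (0, 1)

def T(w):
    # alpha-coordinates of w, i.e. scalars a_i with w = sum a_i w_i ...
    coords = np.linalg.solve(alpha, w)
    # ... get sent to the same combination of the v_i: T(w) = sum a_i v_i.
    return beta @ coords

# T sends each basis vector w_i to v_i, so [T]_alpha^beta = I_2.
print(np.allclose(T(alpha[:, 0]), beta[:, 0]))   # True
print(np.allclose(T(alpha[:, 1]), beta[:, 1]))   # True
```

Since \(T\) carries a basis to a basis, it is invertible, which is exactly why this construction produces an isomorphism.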
References
- Math416 by Ely Kerman