Note: I was not convinced by the usual argument for (a), since \(S\) could be infinite, so the selection process has to be shown to terminate. The book’s official solution doesn’t seem to handle this case?

1.6 Exercise 20
Let \(V\) be a vector space having dimension \(n\), and let \(S\) be a subset of \(V\) that generates \(V\).
  1. Prove that there is a subset of \(S\) that is a basis for \(V\). (Be careful not to assume that \(S\) is finite.)
  2. Prove \(S\) contains at least \(n\) vectors.


Proof:

  1. The construction is similar to the proof of Theorem 1.9. We are given that \(S\) generates \(V\). If \(n = 0\), then \(V = \{0\}\), and \(\emptyset\) is a subset of \(S\) that is a basis for \(V\). Otherwise \(n \geq 1\), so \(V \neq \{0\}\), and since \(S\) generates \(V\) it contains at least one non-zero vector \(u\). Let \(\beta = \{u\}\), which is linearly independent. Now repeat the following step: if some vector of \(S\) can be added to \(\beta\) with \(\beta\) remaining linearly independent, add it; otherwise stop. This process stops even if \(S\) is infinite: since \(V\) has dimension \(n\), no linearly independent subset of \(V\) contains more than \(n\) vectors (apply the Replacement Theorem with a basis of \(V\) as the finite generating set), so \(\beta\) can be enlarged at most \(n - 1\) times. When the process stops, \(\beta \subseteq S\) is linearly independent and no vector of \(S\) can be added to it without losing independence. It remains to show that \(\beta\) generates \(V\), which we do by showing \(S \subseteq span(\beta)\). Let \(v \in S\). If \(v \in \beta\), we're done. Otherwise \(v \not\in \beta\), and by construction \(\beta \cup \{v\}\) is linearly dependent; since \(\beta\) is linearly independent, \(v\) can be written as a linear combination of the elements of \(\beta\), so \(v \in span(\beta)\). Hence \(S \subseteq span(\beta)\), and therefore \(V = span(S) \subseteq span(\beta)\), so \(\beta\) is a linearly independent generating set, i.e. a basis for \(V\). (See the worked example after this list.) \(\blacksquare\)
  2. If \(S\) is infinite, then it certainly contains more than \(n\) vectors. If \(S\) is finite, then \(S\) is a finite generating set for \(V\), so it contains at least \(n\) vectors by Corollary 2 to Theorem 1.10. \(\blacksquare\)
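
To see the selection process from (a) in action on an infinite generating set, here is a small worked example of my own (not from the book). Take \(V = \mathbb{R}^2\), so \(n = 2\), and let
\[
S = \{(1,0), (2,0), (3,0), \dots\} \cup \{(0,1)\},
\]
which is infinite and generates \(\mathbb{R}^2\). Running the process:
\[
\begin{aligned}
\beta &= \{(1,0)\} && \text{start with a non-zero vector of } S,\\
\beta &= \{(1,0)\} && \text{every } (k,0) \text{ is a multiple of } (1,0), \text{ so none can be added},\\
\beta &= \{(1,0),(0,1)\} && \text{adding } (0,1) \text{ keeps } \beta \text{ linearly independent}.
\end{aligned}
\]
Now \(|\beta| = 2 = \dim \mathbb{R}^2\), so no further vector of \(S\) can be added and the process stops, even though \(S\) is infinite. Every \((k,0) \in S\) lies in \(span(\beta)\), so \(S \subseteq span(\beta)\) and \(\beta\) is a basis for \(\mathbb{R}^2\).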


The book provided the solution here. Here is the thing, though: they’re using Theorem 1.10 (the Replacement Theorem). Doesn’t the Replacement Theorem assume that the generating set (here \(S\)) is finite?
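
For reference, here is the statement of Theorem 1.10 as I remember it (paraphrased from memory, so double-check the exact wording in the book); note that the finiteness hypothesis is on the generating set \(G\) in the statement:

Theorem 1.10 (Replacement Theorem). Let \(V\) be a vector space that is generated by a set \(G\) containing exactly \(n\) vectors, and let \(L\) be a linearly independent subset of \(V\) containing exactly \(m\) vectors. Then \(m \leq n\), and there exists a subset \(H\) of \(G\) containing exactly \(n - m\) vectors such that \(L \cup H\) generates \(V\).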
