In the previous lecture, we saw that for a subset $S \subseteq V$ (where $V$ is a vector space), the span of $S$ is a subspace of $V$ and so is a vector space. We also answered the question of whether a particular vector $v$ belongs to $\operatorname{span}(S)$ by asking whether we can write $v$ as a linear combination of the elements of $S$. When $S$ contains a finite collection of elements $u_1, u_2, \dots, u_n$, we did this by writing $v$ as a linear combination of the elements of $S$ as follows,

$$v = a_1 u_1 + a_2 u_2 + \cdots + a_n u_n.$$
Writing $v$ as a linear combination above requires solving a system of linear equations in the variables $a_1, a_2, \dots, a_n$. If the system has a solution, then we know $v$ can be written as a linear combination of the elements of $S$ and it belongs to the span of $S$. If the system doesn't have a solution, then $v$ can't be written as a linear combination of the elements of $S$ and so it doesn't belong to the span of $S$.
But this question can be reversed, meaning we can phrase it in terms of the span associated with a matrix. So suppose we're given a linear system of equations with coefficient matrix $A$,

$$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}.$$
We can view this matrix as a collection of $n$ column vectors in $\mathbb{R}^m$ instead. The span of these vectors is a subspace of $\mathbb{R}^m$. We call this subspace the Column Space of $A$.
Definition
Given a vector $x = (x_1, x_2, \dots, x_n) \in \mathbb{R}^n$ and a matrix $A$ with columns $v_1, v_2, \dots, v_n$, we define the following operation,

$$Ax = x_1 v_1 + x_2 v_2 + \cdots + x_n v_n.$$

This product is in the column space of $A$ because it is a linear combination of the columns of $A$.
Note that this product is only defined when the vector $x$ has as many entries as $A$ has columns.
Example 1
Consider a matrix with three column vectors. The product of such a matrix with a vector in $\mathbb{R}^3$ is a linear combination of the three columns of the matrix.
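As a concrete illustration (with entries chosen here purely for illustration), take a $2 \times 3$ matrix and a vector in $\mathbb{R}^3$:

$$\begin{pmatrix} 1 & 0 & 2 \\ 3 & 1 & 1 \end{pmatrix} \begin{pmatrix} 2 \\ -1 \\ 1 \end{pmatrix} = 2\begin{pmatrix} 1 \\ 3 \end{pmatrix} - 1\begin{pmatrix} 0 \\ 1 \end{pmatrix} + 1\begin{pmatrix} 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 4 \\ 6 \end{pmatrix},$$

so the product is the linear combination of the columns with coefficients $2$, $-1$, and $1$, and it lies in the column space of the matrix.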
An Observation
Given $b \in \mathbb{R}^m$ and viewing the vector $x$ as a variable, the equation

$$Ax = b$$

is equivalent to the linear system with augmented matrix $[A \mid b]$.
Test
Let's verify the above observation. Let $A$ be a matrix with columns $v_1, v_2, \dots, v_n$ and let $b \in \mathbb{R}^m$. Then,

$$Ax = x_1 v_1 + x_2 v_2 + \cdots + x_n v_n = b.$$

Written out entry by entry, this is a linear system of equations with augmented matrix $[A \mid b]$. The system is consistent if and only if $b$ is in the span of the columns of $A$. This again means that answering the question of whether $b$ is in the span of the columns of $A$ is the same as answering whether the system has a solution.
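As a quick computational sketch of this criterion (assuming the vectors live in $\mathbb{R}^m$ and using NumPy; the matrix and vector below are hypothetical, chosen only for illustration), we can test whether $b$ lies in the column space of $A$ by comparing the rank of $A$ with the rank of the augmented matrix $[A \mid b]$:

```python
import numpy as np

# Hypothetical matrix A (two columns in R^3) and vector b.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([2.0, 3.0, 5.0])

# Ax = b is consistent exactly when appending b as an extra column
# does not increase the rank of the matrix.
augmented = np.column_stack([A, b])
in_column_space = np.linalg.matrix_rank(A) == np.linalg.matrix_rank(augmented)

print(in_column_space)  # True here, since b = 2*(first column) + 3*(second column)
```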
Linear Dependence and Linear Independence
Suppose $W$ is a subspace of $V$. We know that spans are subspaces, but is $W$ the span of some elements? Or, what is the smallest number $n$ such that $W$ can be written as

$$W = \operatorname{span}(u_1, u_2, \dots, u_n)?$$

The answer to this question could be that there is no such finite $n$ (this can happen, for example, for subspaces of the vector space of all continuous functions).
Definition
A subset $\{u_1, u_2, \dots, u_n\} \subseteq V$ is linearly dependent if there are scalars $a_1, a_2, \dots, a_n$, not all zero, such that

$$a_1 u_1 + a_2 u_2 + \cdots + a_n u_n = 0.$$

Note here that "not linearly dependent" is equivalent to "linearly independent". In particular, the set is linearly independent if and only if

$$a_1 u_1 + a_2 u_2 + \cdots + a_n u_n = 0$$

is true only when $a_1 = a_2 = \cdots = a_n = 0$.
Example 0
Consider $\{0\}$, the set containing only the zero vector. This set is linearly dependent because we can choose a non-zero scalar $a_1$ (for instance $a_1 = 1$) such that $a_1 \cdot 0 = 0$.
Example 1
Given a vector $v$ such that $v \neq 0$, the set $\{v\}$ is linearly independent.

Proof: We need to prove that $a v = 0$ only if $a = 0$. So suppose that $a v = 0$. We have two cases. Case one is when $a = 0$ and we're done. Case two is when $a \neq 0$. If $a \neq 0$, then

$$v = \frac{1}{a}(a v) = \frac{1}{a} \cdot 0 = 0.$$

But we know that $v \neq 0$, so this case cannot happen. Therefore $a$ must be zero and we're done.
Example 2
Given two vectors $u_1$ and $u_2$, prove that $\{u_1, u_2\}$ is linearly dependent if and only if one vector is a scalar multiple of the other.

Proof: For the forward direction, suppose that $\{u_1, u_2\}$ is linearly dependent. This means that there are scalars $a_1, a_2$, not both zero, such that

$$a_1 u_1 + a_2 u_2 = 0.$$

Without loss of generality, suppose that $a_1 \neq 0$. Then,

$$u_1 = -\frac{a_2}{a_1} u_2,$$

so $u_1$ is a scalar multiple of $u_2$.

For the backward direction, suppose that one vector can be written as a scalar multiple of the other. Without loss of generality, suppose it is $u_1$, so $u_1 = c\, u_2$ where $c$ is a scalar. We can rewrite this as follows,

$$1 \cdot u_1 + (-c)\, u_2 = 0.$$

From this we see that there are scalars $a_1 = 1$ and $a_2 = -c$, not both zero, such that $a_1 u_1 + a_2 u_2 = 0$, which means that $\{u_1, u_2\}$ is linearly dependent as we wanted to show.
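For instance (with vectors chosen only for illustration), in $\mathbb{R}^2$ the set $\{(1, 2), (2, 4)\}$ is linearly dependent since $(2, 4) = 2\,(1, 2)$, or equivalently $2\,(1,2) - 1\,(2,4) = (0,0)$, while $\{(1, 0), (0, 1)\}$ is linearly independent since neither vector is a scalar multiple of the other.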
Example 3
Determine whether a given set of three vectors $\{v_1, v_2, v_3\}$ in $\mathbb{R}^m$ is linearly dependent.

The set of vectors is linearly independent exactly when

$$a_1 v_1 + a_2 v_2 + a_3 v_3 = 0$$

implies that $a_1 = a_2 = a_3 = 0$. So, writing this vector equation out entry by entry, it is equivalent to solving the homogeneous linear system with augmented matrix $[\,v_1 \; v_2 \; v_3 \mid 0\,]$.

The set is linearly independent if and only if this system has only the trivial solution $a_1 = a_2 = a_3 = 0$. The zero solution will always be a solution; that is why it is called the trivial solution. So the question now is: how do we tell from the REF (row echelon form) of the matrix whether we have only the trivial solution or also a non-zero solution?

If every column of the REF (besides the last, augmented column) has a leading entry, then we have a unique solution and the set is linearly independent. If some column (besides the last) has no leading entry, then we have infinitely many solutions (so not just the zero vector) and the set is linearly dependent. So for independence we want to make sure that all columns other than the augmented one have leading entries.

If, for example, after row reduction the third column of the REF has no leading entry, then there are infinitely many solutions besides the zero solution, and the set is linearly dependent.
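A minimal computational sketch of this check (using SymPy's rref; the three vectors are hypothetical, chosen so that the third is the sum of the first two):

```python
from sympy import Matrix

# Hypothetical vectors in R^3; v3 = v1 + v2, so the set should be dependent.
v1 = [1, 0, 2]
v2 = [0, 1, 1]
v3 = [1, 1, 3]

# Put the vectors in as the columns of a matrix and row reduce.
A = Matrix.hstack(Matrix(v1), Matrix(v2), Matrix(v3))
rref_matrix, pivot_columns = A.rref()

# The set is linearly independent exactly when every column has a
# leading entry (pivot) in the row echelon form.
independent = len(pivot_columns) == A.cols
print(independent)  # False: the third column has no leading entry
```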
Example 4
Consider a set $\{f, g\}$ of two functions in the vector space of continuous functions. Is this set linearly dependent or independent?

From the definition (and Example 2), if $g$ were a non-zero scalar multiple of $f$, in other words $g = c\, f$ for some non-zero scalar $c$, then the set would be linearly dependent. But two functions are equal only when they take the same value at every point.

So if we can choose a point at which $g$ and $c\, f$ disagree for every possible scalar $c$, then the two functions are not equal, no scalar multiple relation holds, and the set is linearly independent.
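For instance, take the pair $\{\sin x, \cos x\}$ (chosen here just as an illustration). If $\cos x = c \sin x$ for all $x$, then the point $x = 0$ gives $1 = c \cdot 0 = 0$, a contradiction; and if $\sin x = c \cos x$ for all $x$, then the point $x = \pi/2$ gives $1 = c \cdot 0 = 0$ again. So neither function is a scalar multiple of the other, and the set $\{\sin x, \cos x\}$ is linearly independent.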
Theorem
If $\{u_1, u_2, \dots, u_n\}$ is linearly dependent, then there exists some $u_i$ that can be expressed as a linear combination of the others.
Proof:
Suppose the set $\{u_1, u_2, \dots, u_n\}$ is linearly dependent. This means that for some scalars $a_1, a_2, \dots, a_n$, not all zero, we must have

$$a_1 u_1 + a_2 u_2 + \cdots + a_n u_n = 0.$$

Since not all of the scalars are zero, there is some $a_i \neq 0$. Multiplying the equation by $\frac{1}{a_i}$ and solving for $u_i$ gives

$$u_i = -\frac{a_1}{a_i} u_1 - \cdots - \frac{a_{i-1}}{a_i} u_{i-1} - \frac{a_{i+1}}{a_i} u_{i+1} - \cdots - \frac{a_n}{a_i} u_n.$$

Therefore, $u_i$ is a linear combination of the other vectors, as we wanted to show.
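As a small illustration (with coefficients chosen only for illustration), if $2 u_1 - u_2 + 0 \cdot u_3 = 0$, then $a_1 = 2 \neq 0$, and dividing by $2$ gives $u_1 = \tfrac{1}{2} u_2 + 0 \cdot u_3$, so $u_1$ is a linear combination of the other two vectors.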
Theorem (Refinement Theorem)
Suppose $S = \{u_1, u_2, \dots, u_n\}$ is linearly dependent. Then there is a subset $S' \subseteq S$ which is linearly independent and satisfies

$$\operatorname{span}(S') = \operatorname{span}(S).$$
Proof:
Suppose the set $\{u_1, u_2, \dots, u_n\}$ is linearly dependent. By the previous theorem, there exists some $u_i$ which is a linear combination of the other vectors in the set. So,

$$u_i = b_1 u_1 + \cdots + b_{i-1} u_{i-1} + b_{i+1} u_{i+1} + \cdots + b_n u_n.$$

Now, delete $u_i$ from this set. We claim that

$$\operatorname{span}(u_1, \dots, \hat{u_i}, \dots, u_n) = \operatorname{span}(u_1, \dots, u_n)$$

(the hat symbol means that the vector has been deleted). To show this, we will show that each of the two sets is a subset of the other, which will imply that the two sets are equal.

Now, given a vector $v$ in $\operatorname{span}(u_1, \dots, u_n)$, we want to show that $v \in \operatorname{span}(u_1, \dots, \hat{u_i}, \dots, u_n)$. Write $v = c_1 u_1 + \cdots + c_n u_n$. But we established that $u_i$ is a linear combination of the other vectors, so substitute that expression in for $u_i$ as follows,

$$v = c_1 u_1 + \cdots + c_i\,(b_1 u_1 + \cdots + b_{i-1} u_{i-1} + b_{i+1} u_{i+1} + \cdots + b_n u_n) + \cdots + c_n u_n,$$

which is a linear combination of $u_1, \dots, \hat{u_i}, \dots, u_n$ alone. From this we see that a vector in $\operatorname{span}(u_1, \dots, u_n)$ is also in $\operatorname{span}(u_1, \dots, \hat{u_i}, \dots, u_n)$. The other direction is trivial: any vector in $\operatorname{span}(u_1, \dots, \hat{u_i}, \dots, u_n)$ is already a linear combination of $u_1, \dots, u_n$, so it is in $\operatorname{span}(u_1, \dots, u_n)$. Therefore, the two spans are equal.

But now, if the remaining set is linearly independent, we stop. Otherwise we find another vector to throw out. This process will stop since we started with a finite number of vectors.
Remark: To find $S'$, we can also work forwards. Start with $u_1$ and ask/settle the question (Q1): can $u_2$ be written as a linear combination of $u_1$? If the answer is yes, discard $u_2$; if the answer is no, keep it. Then ask (Q2): can $u_3$ be written as a linear combination of the vectors kept so far? Continuing through $u_n$ in this way produces a linearly independent subset $S'$ with the same span.
Conclusion: we can refine a finite subset to obtain a linearly independent subset with the same span.
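A small sketch of this refinement procedure (assuming the vectors live in $\mathbb{R}^m$ and using NumPy ranks to settle each "is this a linear combination of the vectors kept so far?" question; the input vectors below are hypothetical):

```python
import numpy as np

def refine(vectors, tol=1e-10):
    """Greedily keep the vectors that are not linear combinations of the
    vectors kept so far; the kept subset has the same span as the input."""
    kept = []
    for v in vectors:
        # v is a linear combination of the kept vectors exactly when
        # adding it as a column does not increase the rank.
        if np.linalg.matrix_rank(np.column_stack(kept + [v]), tol=tol) > len(kept):
            kept.append(v)
    return kept

# Hypothetical vectors in R^3; the third is the sum of the first two.
vectors = [np.array([1.0, 0.0, 2.0]),
           np.array([0.0, 1.0, 1.0]),
           np.array([1.0, 1.0, 3.0])]
print(len(refine(vectors)))  # 2: the dependent third vector is thrown out
```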
The rest of this lecture covered the definition of what a basis is and some other small results. I decided to move these to lecture 9 since lecture 9 covered bases in depth.