Section 2.4 Linear independence
In the previous section, we studied our question concerning the existence of solutions to a linear system by inquiring about the span of a set of vectors. In particular, the span of a set of vectors \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is the set of vectors \(\bvec\) for which a solution to the linear system \(\left[\begin{array}{rrrr}
\vvec_1\amp\vvec_2\amp\ldots\amp\vvec_n
\end{array}\right]
\xvec
=
\bvec\) exists.
In this section, our focus turns to the uniqueness of solutions of a linear system, the second of our two fundamental questions asked in Question 1.4.2. This will lead us to the concept of linear independence.
Subsection 2.4.1 Linear dependence
In the previous section, we looked at some examples of the span of sets of vectors in \(\real^3\text{.}\) We saw one example in which the span of three vectors formed a plane. In another, the span of three vectors formed \(\real^3\text{.}\) We would like to understand the difference in these two examples.
Preview Activity 2.4.1
Let's start this activity by studying the span of two different sets of vectors.

Consider the following vectors in \(\real^3\text{:}\)
\begin{equation*}
\vvec_1=\threevec{0}{-1}{2},
\vvec_2=\threevec{3}{1}{-1},
\vvec_3=\threevec{2}{0}{1}\text{.}
\end{equation*}
Describe the span of these vectors, \(\span{\vvec_1,\vvec_2,\vvec_3}\text{.}\)

We will now consider a set of vectors that looks very much like the first set:
\begin{equation*}
\wvec_1=\threevec{0}{-1}{2},
\wvec_2=\threevec{3}{1}{-1},
\wvec_3=\threevec{3}{0}{1}\text{.}
\end{equation*}
Describe the span of these vectors, \(\span{\wvec_1,\wvec_2,\wvec_3}\text{.}\)

Show that the vector \(\wvec_3\) is a linear combination of \(\wvec_1\) and \(\wvec_2\) by finding weights such that
\begin{equation*}
\wvec_3 = a\wvec_1 + b\wvec_2\text{.}
\end{equation*}

Explain why any linear combination of \(\wvec_1\text{,}\) \(\wvec_2\text{,}\) and \(\wvec_3\text{,}\)
\begin{equation*}
c_1\wvec_1 + c_2\wvec_2 + c_3\wvec_3
\end{equation*}
can be written as a linear combination of \(\wvec_1\) and \(\wvec_2\text{.}\)

Explain why
\begin{equation*}
\span{\wvec_1,\wvec_2,\wvec_3} = \span{\wvec_1,\wvec_2}\text{.}
\end{equation*}
The preview activity presents us with two similar examples that demonstrate quite different behaviors. In the first example, the vectors \(\vvec_1\text{,}\) \(\vvec_2\text{,}\) and \(\vvec_3\) span \(\real^3\text{,}\) which we recognize because the matrix \(\left[\begin{array}{rrr}\vvec_1\amp\vvec_2\amp\vvec_3
\end{array}\right]\) has a pivot position in every row so that Proposition 2.3.5 applies.
However, the second example is very different. In this case, the matrix \(\left[\begin{array}{rrr}
\wvec_1\amp\wvec_2\amp\wvec_3 \end{array}\right]\) has only two pivot positions:
\begin{equation*}
\left[\begin{array}{rrr}
\wvec_1 \amp \wvec_2 \amp \wvec_3
\end{array}\right]
=
\left[\begin{array}{rrr}
0 \amp 3 \amp 3 \\
-1 \amp 1 \amp 0 \\
2 \amp -1 \amp 1
\end{array}\right]
\sim
\left[\begin{array}{rrr}
1 \amp 0 \amp 1 \\
0 \amp 1 \amp 1 \\
0 \amp 0 \amp 0
\end{array}\right]\text{.}
\end{equation*}
Let's look at this matrix and change our perspective slightly by considering it to be an augmented matrix.
\begin{equation*}
\left[\begin{array}{rr|r}
\wvec_1 \amp \wvec_2 \amp \wvec_3
\end{array}\right]
=
\left[\begin{array}{rr|r}
0 \amp 3 \amp 3 \\
-1 \amp 1 \amp 0 \\
2 \amp -1 \amp 1
\end{array}\right]
\sim
\left[\begin{array}{rr|r}
1 \amp 0 \amp 1 \\
0 \amp 1 \amp 1 \\
0 \amp 0 \amp 0
\end{array}\right]
\end{equation*}
By doing so, we seek to express \(\wvec_3\) as a linear combination of \(\wvec_1\) and \(\wvec_2\text{.}\) In fact, the reduced row echelon form shows us that
\begin{equation*}
\wvec_3 = \wvec_1 + \wvec_2\text{.}
\end{equation*}
Consequently, we can rewrite any linear combination of \(\wvec_1\text{,}\) \(\wvec_2\text{,}\) and \(\wvec_3\) so that
\begin{equation*}
\begin{aligned}
c_1\wvec_1 + c_2\wvec_2 + c_3\wvec_3 \amp
{}={}
c_1\wvec_1 + c_2\wvec_2 + c_3(\wvec_1+\wvec_2) \\
\amp {}={}
(c_1+c_3)\wvec_1 + (c_2+c_3)\wvec_2 \\
\end{aligned}\text{.}
\end{equation*}
In other words, any linear combination of \(\wvec_1\text{,}\) \(\wvec_2\text{,}\) and \(\wvec_3\) may be written as a linear combination using only the vectors \(\wvec_1\) and \(\wvec_2\text{.}\) Since the span of a set of vectors is simply the set of their linear combinations, this shows that
\begin{equation*}
\span{\wvec_1,\wvec_2,\wvec_3} = \span{\wvec_1,\wvec_2}\text{.}
\end{equation*}
In other words, adding the vector \(\wvec_3\) to the set of vectors \(\wvec_1\) and \(\wvec_2\) does not change the span.
Before exploring this type of behavior more generally, let's think about this from a geometric point of view. In the first example, we begin with two vectors \(\vvec_1\) and \(\vvec_2\) and add a third vector \(\vvec_3\text{.}\)
Because the vector \(\vvec_3\) is not a linear combination of \(\vvec_1\) and \(\vvec_2\text{,}\) it provides a direction to move that, when creating linear combinations, is independent of \(\vvec_1\) and \(\vvec_2\text{.}\) The three vectors therefore span \(\real^3\text{.}\)
In the second example, however, the third vector \(\wvec_3\) is a linear combination of \(\wvec_1\) and \(\wvec_2\) so it already lies in the plane formed by these two vectors.
Since we can already move in this direction with just the first two vectors \(\wvec_1\) and \(\wvec_2\text{,}\) adding \(\wvec_3\) to the set does not enlarge the span. It remains a plane.
With these examples in mind, we will make the following definition.
Definition 2.4.1
A set of vectors \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is called linearly dependent if one of the vectors is a linear combination of the others. Otherwise, the set of vectors is called linearly independent.
For the sake of completeness, we say that a set of vectors containing only one vector is linearly independent if that vector is not the zero vector, \(\zerovec\text{.}\)
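Anticipating the pivot criterion developed in the next subsection, this definition also suggests a direct computational test: a set of vectors is linearly independent exactly when the matrix whose columns they form has rank equal to the number of vectors. A minimal sketch in Python using SymPy (both the library choice and the function name are ours, not part of the text):

```python
from sympy import Matrix

def is_linearly_independent(vectors):
    """Return True if the given list of vectors is linearly independent.

    The columns of a matrix are independent exactly when the rank of
    the matrix equals the number of columns.
    """
    A = Matrix.hstack(*[Matrix(v) for v in vectors])
    return A.rank() == len(vectors)

# A linearly independent set in R^3 ...
print(is_linearly_independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # → True
# ... and a dependent one: the third vector is the sum of the first two.
print(is_linearly_independent([[1, 0, 0], [0, 1, 0], [1, 1, 0]]))  # → False
```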
Subsection 2.4.2 How to recognize linear dependence
Activity 2.4.2
We would like to develop a means of detecting when a set of vectors is linearly dependent. These questions will point the way.

Suppose we have five vectors in \(\real^4\) that form the columns of a matrix having reduced row echelon form
\begin{equation*}
\left[\begin{array}{rrrrr}
\vvec_1 \amp \vvec_2 \amp \vvec_3 \amp
\vvec_4 \amp \vvec_5
\end{array}\right]
\sim
\left[\begin{array}{rrrrr}
1 \amp 0 \amp 1 \amp 0 \amp 2 \\
0 \amp 1 \amp 2 \amp 0 \amp 3 \\
0 \amp 0 \amp 0 \amp 1 \amp 1 \\
0 \amp 0 \amp 0 \amp 0 \amp 0 \\
\end{array}\right]\text{.}
\end{equation*}
Is it possible to write one of the vectors \(\vvec_1,\vvec_2,\ldots,\vvec_5\) as a linear combination of the others? If so, show explicitly how one vector appears as a linear combination of some of the other vectors. Is this set of vectors linearly dependent or independent?

Suppose we have another set of three vectors in \(\real^4\) that form the columns of a matrix having reduced row echelon form
\begin{equation*}
\left[\begin{array}{rrr}
\wvec_1 \amp \wvec_2 \amp \wvec_3 \\
\end{array}\right]
\sim
\left[\begin{array}{rrr}
1 \amp 0 \amp 0 \\
0 \amp 1 \amp 0 \\
0 \amp 0 \amp 1 \\
0 \amp 0 \amp 0 \\
\end{array}\right]\text{.}
\end{equation*}
Is it possible to write one of these vectors \(\wvec_1\text{,}\) \(\wvec_2\text{,}\) \(\wvec_3\) as a linear combination of the others? If so, show explicitly how one vector appears as a linear combination of some of the other vectors. Is this set of vectors linearly dependent or independent?
By looking at the pivot positions, how can you determine whether the columns of a matrix are linearly dependent or independent?
If one vector in a set is the zero vector \(\zerovec\text{,}\) can the set of vectors be linearly independent?
Suppose a set of vectors in \(\real^{10}\) has twelve vectors. Is it possible for this set to be linearly independent?
By now, it shouldn't be too surprising that the pivot positions play an important role in determining whether the columns of a matrix are linearly dependent. Let's discuss the previous activity to make this clear.

Let's consider the first example from the activity in which we have vectors in \(\real^4\) such that
\begin{equation*}
\left[\begin{array}{rrrrr}
\vvec_1 \amp \vvec_2 \amp \vvec_3 \amp
\vvec_4 \amp \vvec_5
\end{array}\right]
\sim
\left[\begin{array}{rrrrr}
1 \amp 0 \amp 1 \amp 0 \amp 2 \\
0 \amp 1 \amp 2 \amp 0 \amp 3 \\
0 \amp 0 \amp 0 \amp 1 \amp 1 \\
0 \amp 0 \amp 0 \amp 0 \amp 0 \\
\end{array}\right]\text{.}
\end{equation*}
Notice that the third column does not contain a pivot position. Let's focus on the first three columns and consider them as an augmented matrix:
\begin{equation*}
\left[\begin{array}{rr|r}
\vvec_1 \amp \vvec_2 \amp \vvec_3
\end{array}\right]
\sim
\left[\begin{array}{rr|r}
1 \amp 0 \amp 1 \\
0 \amp 1 \amp 2 \\
0 \amp 0 \amp 0 \\
0 \amp 0 \amp 0 \\
\end{array}\right]\text{.}
\end{equation*}
There is no pivot in the rightmost column, so we know that \(\vvec_3\) can be written as a linear combination of \(\vvec_1\) and \(\vvec_2\text{.}\) In fact, we can read the weights from the augmented matrix:
\begin{equation*}
\vvec_3 = \vvec_1 + 2\vvec_2\text{.}
\end{equation*}
This says that the set of vectors \(\vvec_1,\vvec_2,\ldots,\vvec_5\) is linearly dependent.
This points to the general observation that a set of vectors is linearly dependent if the matrix they form has a column without a pivot.
In addition, the fifth column of this matrix does not contain a pivot meaning that \(\vvec_5\) can be written as a linear combination:
\begin{equation*}
\vvec_5 = 2\vvec_1 + 3\vvec_2 + \vvec_4\text{.}
\end{equation*}
This example shows that vectors in columns that do not contain a pivot may be expressed as a linear combination of the vectors in columns that do contain pivots. In this case, we have
\begin{equation*}
\span{\vvec_1,\vvec_2,\vvec_3,\vvec_4,\vvec_5}
=\span{\vvec_1,\vvec_2,\vvec_4}\text{.}
\end{equation*}

By contrast, the second set of vectors we studied produces a matrix with a pivot in every column.
\begin{equation*}
\left[\begin{array}{rrr}
\wvec_1 \amp \wvec_2 \amp \wvec_3 \\
\end{array}\right]
\sim
\left[\begin{array}{rrr}
1 \amp 0 \amp 0 \\
0 \amp 1 \amp 0 \\
0 \amp 0 \amp 1 \\
0 \amp 0 \amp 0 \\
\end{array}\right]\text{.}
\end{equation*}
If we interpret this as an augmented matrix again, we see that the linear system is inconsistent since there is a pivot in the rightmost column. This means that \(\wvec_3\) cannot be expressed as a linear combination of \(\wvec_1\) and \(\wvec_2\text{.}\)
Similarly, \(\wvec_2\) cannot be expressed as a linear combination of \(\wvec_1\text{.}\) In addition, if \(\wvec_2\) could be expressed as a linear combination of \(\wvec_1\) and \(\wvec_3\text{,}\) we could rearrange that expression to write \(\wvec_3\) as a linear combination of \(\wvec_1\) and \(\wvec_2\text{,}\) which we know is impossible.
We can therefore conclude that \(\wvec_1\text{,}\) \(\wvec_2\text{,}\) and \(\wvec_3\) form a linearly independent set of vectors.
This leads to the following proposition.
Proposition 2.4.2
The columns of a matrix are linearly independent if and only if every column contains a pivot position.
This condition imposes a constraint on how many vectors we can have in a linearly independent set. Here is an example of the reduced row echelon form of a matrix having linearly independent columns. Notice that there are three vectors in \(\real^5\) so there are at least as many rows as columns.
\begin{equation*}
\left[\begin{array}{rrr}
1 \amp 0 \amp 0 \\
0 \amp 1 \amp 0 \\
0 \amp 0 \amp 1 \\
0 \amp 0 \amp 0 \\
0 \amp 0 \amp 0 \\
\end{array}\right]\text{.}
\end{equation*}
More generally, suppose that \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is a linearly independent set of vectors in \(\real^m\text{.}\) When these vectors form the columns of a matrix, there must be a pivot position in every column of the matrix. Since every row contains at most one pivot position, the number of columns can be no greater than the number of rows. This means that the number of vectors in a linearly independent set can be no greater than the number of dimensions.
Proposition 2.4.3
A linearly independent set of vectors in \(\real^m\) can contain no more than \(m\) vectors.
This says, for instance, that any linearly independent set of vectors in \(\real^3\) can contain no more than three vectors. Once again, this makes intuitive sense. We usually imagine three independent directions, such as up/down, front/back, and left/right, in our three-dimensional world. This proposition tells us that there can be no additional independent directions.
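Proposition 2.4.3 can be illustrated computationally: any four vectors in \(\real^3\) form a matrix with only three rows, and hence at most three pivot positions. A sketch using SymPy (the specific vectors below are our own illustration, not taken from the text):

```python
from sympy import Matrix

# Four vectors in R^3, placed as the columns of a 3x4 matrix.
A = Matrix([
    [1, 2, 0, 3],
    [0, 1, 1, 1],
    [2, 0, 1, 4],
])

_, pivot_cols = A.rref()
# A matrix with 3 rows has at most 3 pivot positions ...
print(len(pivot_cols))           # → 3
# ... so at least one of the four columns has no pivot, and the
# set of four vectors must be linearly dependent.
print(len(pivot_cols) < A.cols)  # → True
```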
Subsection 2.4.3 The homogeneous equation
Given an \(m\times n\) matrix \(A\text{,}\) we call the equation \(A\xvec = \zerovec\) a homogeneous equation. The solutions to this equation reflect whether the columns of \(A\) are linearly independent or not.
Activity 2.4.3 Linear independence and homogeneous equations
Explain why the homogeneous equation \(A\xvec =
\zerovec\) is consistent no matter the matrix \(A\text{.}\)

Consider the matrix
\begin{equation*}
A = \left[\begin{array}{rrr}
3 \amp 2 \amp 0 \\
1 \amp 0 \amp 2 \\
2 \amp 1 \amp 1
\end{array}\right]
\end{equation*}
whose columns we denote by \(\vvec_1\text{,}\) \(\vvec_2\text{,}\) and \(\vvec_3\text{.}\) Are the vectors \(\vvec_1\text{,}\) \(\vvec_2\text{,}\) and \(\vvec_3\) linearly dependent or independent?
Give a description of the solution space of the homogeneous equation \(A\xvec = \zerovec\text{.}\)

We know that \(\zerovec\) is a solution to the homogeneous equation. Find another solution that is different from \(\zerovec\text{.}\) Use your solution to find weights \(c_1\text{,}\) \(c_2\text{,}\) and \(c_3\) such that
\begin{equation*}
c_1\vvec_1 + c_2\vvec_2 + c_3\vvec_3 = \zerovec\text{.}
\end{equation*}
Use the expression you found in the previous part to write one of the vectors as a linear combination of the others.
For any matrix \(A\text{,}\) we know that the equation \(A\xvec =
\zerovec\) has at least one solution, namely, the vector \(\xvec = \zerovec\text{.}\) We call this the trivial solution to the homogeneous equation so that any other solution that exists is a nontrivial solution.
If there is no nontrivial solution, then \(A\xvec =
\zerovec\) has exactly one solution. There can be no free variables in a description of the solution space so \(A\) must have a pivot position in every column. In this case, the columns of \(A\) must be linearly independent.
If, however, there is a nontrivial solution, then there are infinitely many solutions so \(A\) must have a column without a pivot position. Hence, the columns of \(A\) are linearly dependent.
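This dichotomy can be checked by computing the solution space of the homogeneous equation directly. Using the matrix \(A\) from the activity, a SymPy sketch (an assumed tool; its nullspace method returns a basis for the solutions of \(A\xvec=\zerovec\)):

```python
from sympy import Matrix

A = Matrix([
    [3, 2, 0],
    [1, 0, 2],
    [2, 1, 1],
])

# Basis vectors for the solution space of the homogeneous equation.
null_basis = A.nullspace()
print(len(null_basis))  # → 1: a nontrivial solution exists

x = null_basis[0]
print(A * x)            # → the zero vector, confirming A x = 0

# Since a nontrivial solution exists, the columns of A are
# linearly dependent.
```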
Example 2.4.4
We will make the connection between solutions to the homogeneous equation and the linear independence of the columns more explicit by looking at an example. In particular, we will demonstrate how a nontrivial solution to the homogeneous equation shows that one column of \(A\) is a linear combination of the others. With the matrix \(A\) in the previous activity, the homogeneous equation has the reduced row echelon form
\begin{equation*}
\left[\begin{array}{rrrr}
3 \amp 2 \amp 0 \amp 0 \\
1 \amp 0 \amp 2 \amp 0 \\
2 \amp 1 \amp 1 \amp 0 \\
\end{array}\right]
\sim
\left[\begin{array}{rrrr}
1 \amp 0 \amp 2 \amp 0 \\
0 \amp 1 \amp -3 \amp 0 \\
0 \amp 0 \amp 0 \amp 0 \\
\end{array}\right]\text{,}
\end{equation*}
which implies that
\begin{equation*}
\begin{alignedat}{4}
x_1 \amp \amp \amp {}+{} \amp 2x_3 \amp {}={} \amp 0 \\
\amp \amp x_2 \amp {}-{} \amp 3x_3 \amp {}={} \amp 0 \\
\end{alignedat}\text{.}
\end{equation*}
In terms of the free variable \(x_3\text{,}\) we have
\begin{equation*}
\begin{aligned}
x_1 \amp {}={} -2x_3 \\
x_2 \amp {}={} 3x_3 \\
\end{aligned}\text{.}
\end{equation*}
Any choice for a value of the free variable \(x_3\) produces a solution, so let's choose, for convenience, \(x_3=1\text{.}\) We then have \((x_1,x_2,x_3) = (-2,3,1)\text{.}\)
Since \((-2,3,1)\) is a solution to the homogeneous equation \(A\xvec=\zerovec\text{,}\) this solution gives weights for a linear combination of the columns of \(A\) that create \(\zerovec\text{.}\) That is,
\begin{equation*}
-2\vvec_1 + 3\vvec_2 + \vvec_3 = \zerovec\text{,}
\end{equation*}
which we rewrite as
\begin{equation*}
\vvec_3 = 2\vvec_1 - 3\vvec_2\text{.}
\end{equation*}
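Since the entries of \(A\) are given, this relationship can be verified by direct arithmetic. A quick SymPy check (the column-slicing notation is standard SymPy, not something from the text):

```python
from sympy import Matrix

A = Matrix([
    [3, 2, 0],
    [1, 0, 2],
    [2, 1, 1],
])
v1, v2, v3 = A[:, 0], A[:, 1], A[:, 2]

# The weights (-2, 3, 1) solve the homogeneous equation ...
print(-2*v1 + 3*v2 + v3)   # → the zero vector

# ... which rearranges to express v3 in terms of v1 and v2.
print(v3 == 2*v1 - 3*v2)   # → True
```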
As this example demonstrates, there are many ways we can view the question of linear independence. We will record some of these ways in the following proposition.
Proposition 2.4.5
For a matrix \(A = \left[\begin{array}{rrrr}
\vvec_1\amp\vvec_2\amp\ldots\amp\vvec_n
\end{array}\right]
\text{,}\) the following statements are equivalent:
The columns of \(A\) are linearly dependent.
One of the vectors in the set \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is a linear combination of the others.
The matrix \(A\) has a column without a pivot position.
The homogeneous equation \(A\xvec = \zerovec\) has a nontrivial solution.

There are weights \(c_1,c_2,\ldots,c_n\text{,}\) not all of which are zero, such that
\begin{equation*}
c_1\vvec_1 + c_2\vvec_2 + \ldots + c_n\vvec_n = \zerovec\text{.}
\end{equation*}
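The equivalences in Proposition 2.4.5 translate into several interchangeable computational tests. A sketch gathering three of them (SymPy again; the function names are ours):

```python
from sympy import Matrix

def columns_dependent_by_pivots(A):
    # Statement 3: some column has no pivot position.
    _, pivot_cols = A.rref()
    return len(pivot_cols) < A.cols

def columns_dependent_by_nullspace(A):
    # Statement 4: A x = 0 has a nontrivial solution.
    return len(A.nullspace()) > 0

def columns_dependent_by_rank(A):
    # Equivalent formulation: the rank falls short of the column count.
    return A.rank() < A.cols

A = Matrix([[3, 2, 0], [1, 0, 2], [2, 1, 1]])
# The three tests agree for any matrix; here each reports dependence.
print(columns_dependent_by_pivots(A),
      columns_dependent_by_nullspace(A),
      columns_dependent_by_rank(A))   # → True True True
```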
Subsection 2.4.4 Summary
In this section, we developed the concept of linear dependence of a set of vectors. At the beginning of the section, we said that this concept addressed the second of our fundamental questions, expressed in Question 1.4.2, concerning the uniqueness of solutions to a linear system. It is worth comparing the results of this section with those of the previous one so that the parallels between them become clear.
As is usual, we will write a matrix as a collection of vectors,
\begin{equation*}
A=\left[\begin{array}{rrrr} \vvec_1\amp\vvec_2 \amp
\ldots \amp \vvec_n \end{array}\right].
\end{equation*}
 Existence
In the previous section, we asked if we could write a vector \(\bvec\) as a linear combination of the columns of \(A\text{,}\) which happens precisely when a solution to the equation \(A\xvec = \bvec\) exists. We saw that every vector \(\bvec\) could be expressed as a linear combination of the columns of \(A\) when \(A\) has a pivot position in every row. In this case, we said that the span of the vectors \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is \(\real^m\text{.}\) We saw that at least \(m\) vectors are needed to span \(\real^m\text{.}\)
 Uniqueness
In this section, we studied the uniqueness of solutions to the equation \(A\xvec = \zerovec\text{,}\) which is always consistent. When a nontrivial solution exists, we saw that one vector of the set \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is a linear combination of the others, in which case we said that the set of vectors is linearly dependent. This happens when the matrix \(A\) has a column without a pivot position. We saw that there can be no more than \(m\) vectors in a set of linearly independent vectors in \(\real^m\text{.}\)
To summarize the specific results of this section, we saw that:
A set of vectors is linearly dependent if one of the vectors is a linear combination of the others.
A set of vectors is linearly independent if and only if the vectors form a matrix that has a pivot position in every column.
A set of linearly independent vectors in \(\real^m\) contains no more than \(m\) vectors.
The columns of the matrix \(A\) are linearly dependent if the homogeneous equation \(A\xvec = \zerovec\) has a nontrivial solution.

A set of vectors \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is linearly dependent if there are weights \(c_1,c_2,\ldots,c_n\text{,}\) not all of which are zero, such that
\begin{equation*}
c_1\vvec_1 + c_2\vvec_2 + \ldots + c_n\vvec_n = \zerovec\text{.}
\end{equation*}
Subsection 2.4.5 Exercises
1
Consider the set of vectors
\begin{equation*}
\vvec_1 = \threevec{1}{2}{1},
\vvec_2 = \threevec{0}{1}{3},
\vvec_3 = \threevec{2}{3}{1},
\vvec_4 = \threevec{2}{4}{1}\text{.}
\end{equation*}
Explain why this set of vectors is linearly dependent.
Write one of the vectors as a linear combination of the others.

Find weights \(c_1\text{,}\) \(c_2\text{,}\) \(c_3\text{,}\) and \(c_4\text{,}\) not all of which are zero, such that
\begin{equation*}
c_1\vvec_1 + c_2 \vvec_2 + c_3 \vvec_3
+ c_4 \vvec_4 = \zerovec\text{.}
\end{equation*}
Find a nontrivial solution to the homogeneous equation \(A\xvec = \zerovec\) where \(A=\left[\begin{array}{rrrr}
\vvec_1\amp\vvec_2\amp\vvec_3\amp\vvec_4
\end{array}\right]\text{.}\)
2
Consider the vectors
\begin{equation*}
\vvec_1 = \threevec{2}{1}{0},
\vvec_2 = \threevec{1}{2}{1},
\vvec_3 = \threevec{2}{2}{3}\text{.}
\end{equation*}
Are these vectors linearly independent or linearly dependent?
Describe the \(\span{\vvec_1,\vvec_2,\vvec_3}\text{.}\)
Suppose that \(\bvec\) is a vector in \(\real^3\text{.}\) Explain why we can guarantee that \(\bvec\) may be written as a linear combination of \(\vvec_1\text{,}\) \(\vvec_2\text{,}\) and \(\vvec_3\text{.}\)
Suppose that \(\bvec\) is a vector in \(\real^3\text{.}\) In how many ways can \(\bvec\) be written as a linear combination of \(\vvec_1\text{,}\) \(\vvec_2\text{,}\) and \(\vvec_3\text{?}\)
3
Answer the following questions and provide a justification for your responses.
If the vectors \(\vvec_1\) and \(\vvec_2\) form a linearly dependent set, must one vector be a scalar multiple of the other?
Suppose that \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is a linearly independent set of vectors. What can you say about the linear independence or dependence of a subset of these vectors?
Suppose \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is a linearly independent set of vectors that form the columns of a matrix \(A\text{.}\) If the equation \(A\xvec = \bvec\) is inconsistent, what can you say about the linear independence or dependence of the set of vectors \(\vvec_1,\vvec_2,\ldots,\vvec_n,\bvec\text{?}\)
4
Determine if the following statements are true or false and provide a justification for your response.
If \(\vvec_1,\vvec_2,\ldots,\vvec_n\) are linearly dependent, then one vector is a scalar multiple of one of the others.
If \(\vvec_1, \vvec_2, \ldots, \vvec_{10}\) are vectors in \(\real^5\text{,}\) then the set of vectors is linearly dependent.
If \(\vvec_1, \vvec_2, \ldots, \vvec_{5}\) are vectors in \(\real^{10}\text{,}\) then the set of vectors is linearly independent.
Suppose we have a set of vectors \(\vvec_1,\vvec_2,\ldots,\vvec_n\) and that \(\vvec_2\) is a scalar multiple of \(\vvec_1\text{.}\) Then the set is linearly dependent.
Suppose that \(\vvec_1,\vvec_2,\ldots,\vvec_n\) are linearly independent and form the columns of a matrix \(A\text{.}\) If \(A\xvec = \bvec\) is consistent, then there is exactly one solution.
5
Suppose we have a set of vectors \(\vvec_1,\vvec_2,\vvec_3,\vvec_4\) in \(\real^5\) that satisfy the relationship:
\begin{equation*}
2\vvec_1 - \vvec_2 + 3\vvec_3 + \vvec_4 = \zerovec
\end{equation*}
and suppose that \(A\) is the matrix \(A=\left[\begin{array}{rrrr}
\vvec_1\amp\vvec_2\amp\vvec_3\amp\vvec_4
\end{array}\right]
\text{.}\)
Find a nontrivial solution to the equation \(A\xvec =
\zerovec\text{.}\)
Explain why the matrix \(A\) has a column without a pivot position.
Write one of the vectors as a linear combination of the others.
Explain why the set of vectors is linearly dependent.
6
Suppose that \(\vvec_1,\vvec_2,\ldots,\vvec_n\) is a set of vectors in \(\real^{27}\) that form the columns of a matrix \(A\text{.}\)
Suppose that the vectors span \(\real^{27}\text{.}\) What can you say about the number of vectors \(n\) in this set?
Suppose instead that the vectors are linearly independent. What can you say about the number of vectors \(n\) in this set?
Suppose that the vectors are both linearly independent and span \(\real^{27}\text{.}\) What can you say about the number of vectors in the set?
Assume that the vectors are both linearly independent and span \(\real^{27}\text{.}\) Given a vector \(\bvec\) in \(\real^{27}\text{,}\) what can you say about the solution space to the equation \(A\xvec = \bvec\text{?}\)
7
Given below are some descriptions of sets of vectors that form the columns of a matrix \(A\text{.}\) For each description, give a possible reduced row echelon form for \(A\) or indicate why there is no set of vectors satisfying the description by stating why the required reduced row echelon matrix cannot exist.
A set of 4 linearly independent vectors in \(\real^5\text{.}\)
A set of 4 linearly independent vectors in \(\real^4\text{.}\)
A set of 3 vectors that span \(\real^4\text{.}\)
A set of 5 linearly independent vectors in \(\real^3\text{.}\)
A set of 5 vectors that span \(\real^4\text{.}\)
8
When we explored matrix multiplication in Section 2.2, we saw that some properties that are true for real numbers are not true for matrices. This exercise will investigate that in some more depth.
Suppose that \(A\) and \(B\) are two matrices and that \(AB = 0\text{.}\) If \(B \neq 0\text{,}\) what can you say about the linear independence of the columns of \(A\text{?}\)
Suppose that we have matrices \(A\text{,}\) \(B\) and \(C\) such that \(AB = AC\text{.}\) We have seen that we cannot generally conclude that \(B=C\text{.}\) If we assume additionally that \(A\) is a matrix whose columns are linearly independent, explain why \(B = C\text{.}\) You may wish to begin by rewriting the equation \(AB = AC\) as \(AB-AC = A(B-C) = 0\text{.}\)
9
Suppose that \(k\) is an unknown parameter and consider the set of vectors
\begin{equation*}
\vvec_1 = \threevec{2}{0}{1},
\vvec_2 = \threevec{4}{2}{1},
\vvec_3 = \threevec{0}{2}{k}\text{.}
\end{equation*}
For what values of \(k\) is the set of vectors linearly dependent?
For what values of \(k\) does the set of vectors span \(\real^3\text{?}\)
10
Given a set of linearly dependent vectors, we can eliminate some of the vectors to create a smaller, linearly independent set of vectors.
Suppose that \(\wvec\) is a linear combination of the vectors \(\vvec_1\) and \(\vvec_2\text{.}\) Explain why \(\span{\vvec_1,\vvec_2, \wvec} =
\span{\vvec_1,\vvec_2}\text{.}\)

Consider the vectors
\begin{equation*}
\vvec_1 = \threevec{2}{1}{0},
\vvec_2 = \threevec{1}{2}{1},
\vvec_3 = \threevec{2}{6}{2},
\vvec_4 = \threevec{7}{1}{1}\text{.}
\end{equation*}
Write one of the vectors as a linear combination of the others. Find a set of three vectors whose span is the same as \(\span{\vvec_1,\vvec_2,\vvec_3,\vvec_4}\text{.}\)
Are the three vectors you are left with linearly independent? If not, express one of the vectors as a linear combination of the others and find a set of two vectors whose span is the same as \(\span{\vvec_1,\vvec_2,\vvec_3,\vvec_4}\text{.}\)
Give a geometric description of \(\span{\vvec_1,\vvec_2,\vvec_3,\vvec_4}\) in \(\real^3\) as we did in Section 2.3.