Linear Algebra : Operations and Properties


Example Questions

Example Question #201 : Operations And Properties

Determine the row rank of the matrix

\(\displaystyle \begin{pmatrix} 1&4 &5 \\ 2&8 &10 \\ 3&6 &4 \end{pmatrix}\)

Possible Answers:

\(\displaystyle 0\)

\(\displaystyle 2\)

\(\displaystyle 1\)

\(\displaystyle 3\)

Correct answer:

\(\displaystyle 2\)

Explanation:

To determine the row rank of the matrix, we reduce it to reduced row echelon form.

First, we add \(\displaystyle -2\) times the 1st row to the 2nd row

\(\displaystyle \begin{pmatrix} 1&4 &5 \\ 2+(-2)&8+(-8) &10+(-10) \\ 3&6 &4 \end{pmatrix}=\begin{pmatrix} 1&4 &5 \\ 0&0 &0 \\ 3&6 &4 \end{pmatrix}\)

Then add \(\displaystyle -3\) times the 1st row to the 3rd row

\(\displaystyle \begin{pmatrix} 1&4 &5 \\ 0&0 &0 \\ 3+(-3)&6+(-12) &4+(-15) \end{pmatrix}=\begin{pmatrix} 1&4 &5 \\ 0&0 &0 \\ 0&-6 &-11 \end{pmatrix}\)

Switch the 2nd row and the 3rd row

\(\displaystyle \begin{pmatrix} 1&4 &5 \\ 0&-6 &-11 \\ 0&0 &0 \end{pmatrix}\)

Multiply the 2nd row by \(\displaystyle -\frac{1}{6}\)

\(\displaystyle \begin{pmatrix} 1&4 &5 \\ 0&-6*(-\frac{1}{6}) &-11*(-\frac{1}{6}) \\ 0&0 &0 \end{pmatrix}=\begin{pmatrix} 1& 4& 5\\ 0&1 &\frac{11}{6} \\ 0&0 &0 \end{pmatrix}\)

Finally, add \(\displaystyle -4\) times the 2nd row to the 1st row

\(\displaystyle \begin{pmatrix} 1& 4+(-4)& 5+(-\frac{44}{6})\\ 0&1 &\frac{11}{6} \\ 0&0 &0 \end{pmatrix} = \begin{pmatrix} 1&0 &-\frac{7}{3} \\ 0&1 &\frac{11}{6} \\ 0&0 & 0 \end{pmatrix}\)

The reduced matrix has two nonzero rows, so the row rank is \(\displaystyle 2\).
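
As a quick sanity check, here is a minimal NumPy sketch (not part of the original solution) that confirms the rank numerically:

```python
# A minimal NumPy check of the result above (not part of the original solution).
import numpy as np

A = np.array([[1, 4, 5],
              [2, 8, 10],
              [3, 6, 4]])

# matrix_rank computes the rank from the singular values of A;
# since row rank equals column rank, this is also the row rank.
print(np.linalg.matrix_rank(A))  # prints 2
```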

Example Question #202 : Operations And Properties

 

 

Consider the following set of vectors

\(\displaystyle \mathbf{v}_1 = (1,0,0)\)

\(\displaystyle \mathbf{v}_2 = (0,1,0)\)

\(\displaystyle \mathbf{v}_3 = (0,0,1)\)

Is the set linearly independent?

Possible Answers:

No.

Not enough information

Yes.

Correct answer:

Yes.

Explanation:

Yes, the set is linearly independent. There are multiple ways to see this.

Way 1) Put the vectors into matrix form,

\(\displaystyle \begin{bmatrix} 1 &0 &0 \\0 &1 &0 \\0 &0 &1 \end{bmatrix}\)

The matrix is already in reduced row echelon form. Notice that all three rows are nonzero, and we started with three vectors. Thus the set is linearly independent.

Way 2) Consider the equation \(\displaystyle \alpha \mathbf{v}_1 + \beta \mathbf{v}_2 + \gamma \mathbf{v}_3 = \mathbf{0}\)

If the only solution to this equation is \(\displaystyle \alpha = \beta = \gamma = 0\), then the set is linearly independent. Let's solve the equation and see what we get.

\(\displaystyle \alpha \mathbf{v}_1 + \beta \mathbf{v}_2 + \gamma \mathbf{v}_3 = \mathbf{0}\)

\(\displaystyle \alpha (1, 0, 0) +\beta (0,1,0) +\gamma(0,0,1) = (0,0,0)\)

Distribute the scalar constants to get

\(\displaystyle (\alpha, 0, 0) + (0,\beta,0) +(0,0,\gamma) = (0,0,0)\)

Thus we get a system of 3 equations

\(\displaystyle \alpha = 0\)

\(\displaystyle \beta = 0\)

\(\displaystyle \gamma = 0\)

Since the only solution is \(\displaystyle \alpha = \beta = \gamma = 0\), the vectors are linearly independent.
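
For readers who prefer to let a computer algebra system do the solving, here is a minimal SymPy sketch of Way 2 (illustrative only; the variable names are chosen for this example):

```python
# A SymPy sketch of Way 2 (illustrative only): solve the homogeneous
# system and check that the only solution is the trivial one.
import sympy as sp

alpha, beta, gamma = sp.symbols('alpha beta gamma')
v1 = sp.Matrix([1, 0, 0])
v2 = sp.Matrix([0, 1, 0])
v3 = sp.Matrix([0, 0, 1])

combo = alpha * v1 + beta * v2 + gamma * v3          # alpha*v1 + beta*v2 + gamma*v3
equations = [sp.Eq(combo[i], 0) for i in range(3)]   # one equation per coordinate

print(sp.solve(equations, [alpha, beta, gamma]))     # {alpha: 0, beta: 0, gamma: 0}
```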

Example Question #13 : Linear Independence And Rank

Consider the following set of vectors

\(\displaystyle \mathbf{v}_1 = (1,0,0)\)

\(\displaystyle \mathbf{v}_2 = (0,1,0)\)

\(\displaystyle \mathbf{v}_3 = (0,0,1)\)

\(\displaystyle \mathbf{v}_4 = (2,1,5)\)

Is the set linearly independent?

Possible Answers:

Yes

Not enough information

No

Correct answer:

No

Explanation:

The vectors lie in a 3-dimensional space, so the largest possible size of a linearly independent set is 3. But there are 4 vectors given. Thus, the set cannot be linearly independent and must be linearly dependent.

Another way to see this is by noticing that \(\displaystyle \mathbf{v}_4\) can be written as a linear combination of the other vectors:

\(\displaystyle \mathbf{v}_4 = 2\mathbf{v}_1 + \mathbf{v}_2 + 5\mathbf{v}_3\)
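
A small NumPy sketch (illustrative only) recovers these coefficients by solving the linear system whose coefficient matrix has \(\displaystyle \mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\) as its columns:

```python
# A NumPy sketch (illustrative only): solve V c = v4, where the columns
# of V are v1, v2, v3, to recover the coefficients of the combination.
import numpy as np

V = np.column_stack([[1, 0, 0],
                     [0, 1, 0],
                     [0, 0, 1]])   # columns are v1, v2, v3
v4 = np.array([2, 1, 5])

coefficients = np.linalg.solve(V, v4)
print(coefficients)                # [2. 1. 5.] -> v4 = 2*v1 + v2 + 5*v3
```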

Example Question #14 : Linear Independence And Rank

In a vector space with dimension 5, what is the maximum number of vectors that can be in a linearly independent set?

Possible Answers:

Five

There is no limit

Two

Not enough information

Ten

Correct answer:

Five

Explanation:

The dimension of a vector space is the maximum possible number of vectors in a linearly independent set. (Notice that you can have linearly independent sets with 5 or fewer vectors, but never more than 5.)
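
An illustrative NumPy experiment (not part of the original answer) shows the same bound in action: six vectors in a 5-dimensional space can never have rank 6.

```python
# An illustrative NumPy experiment (not part of the original answer):
# six vectors in a 5-dimensional space always have rank at most 5,
# so a linearly independent set can never contain more than 5 of them.
import numpy as np

rng = np.random.default_rng(0)
six_vectors = rng.standard_normal((6, 5))   # six random vectors in R^5

print(np.linalg.matrix_rank(six_vectors))   # at most 5, never 6
```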

Example Question #15 : Linear Independence And Rank

What is the dimension of the space spanned by the following vectors:

\(\displaystyle \mathbf{v}_1 = (1,0,0,0,0)\)

\(\displaystyle \mathbf{v}_2 = (0,1,0,0,0)\)

\(\displaystyle \mathbf{v}_3 = (0,0,0,0,1)\)

Possible Answers:

Three

Five

One

Not enough information

Six

Correct answer:

Three

Explanation:

Since there are three linearly independent vectors, they span a 3-dimensional space.

Notice that each vector has 5 coordinates. Therefore the vectors actually span a 3-dimensional subspace of a 5-dimensional space.
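
As a quick check, the dimension of the span equals the rank of the matrix whose rows are the given vectors; here is a minimal NumPy sketch (for illustration only):

```python
# A NumPy sketch (for illustration only): the dimension of the span is
# the rank of the matrix whose rows are the given vectors.
import numpy as np

vectors = np.array([[1, 0, 0, 0, 0],
                    [0, 1, 0, 0, 0],
                    [0, 0, 0, 0, 1]])

print(np.linalg.matrix_rank(vectors))   # 3 -> a 3-dimensional subspace of R^5
```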

Example Question #16 : Linear Independence And Rank

What is the dimension of the space spanned by the following vectors:

\(\displaystyle \mathbf{v}_1 = (1,1,0,0,0)\)

\(\displaystyle \mathbf{v}_2 = (0,0,1,1,0)\)

\(\displaystyle \mathbf{v}_3 = (0,0,0,0,1)\)

Possible Answers:

Not enough information

Two

Three

One

Five

Correct answer:

Three

Explanation:

Since there are three linearly independent vectors, they span a 3-dimensional space.

Notice that each vector has 5 coordinates. Therefore the vectors actually span a 3-dimensional subspace of a 5-dimensional space.

Example Question #17 : Linear Independence And Rank

True or False: If a \(\displaystyle 5 \times 5\) matrix \(\displaystyle A\) has \(\displaystyle 3\) linearly independent columns, then \(\displaystyle nullity(A) =2\).

Possible Answers:

False

True

Correct answer:

True

Explanation:

Since \(\displaystyle A\) is a \(\displaystyle 5 \times 5\) matrix, the rank-nullity theorem gives \(\displaystyle rank(A)+nullity(A) = 5\). Since \(\displaystyle A\) has three linearly independent columns, its column space (and hence its row space) has dimension \(\displaystyle 3\), so \(\displaystyle rank(A)=3\) by the definition of rank. Hence \(\displaystyle nullity(A)=2\).
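
A short SymPy sketch illustrates the rank-nullity relationship; the matrix below is a hypothetical \(\displaystyle 5 \times 5\) example with exactly three linearly independent columns, not one taken from the problem statement.

```python
# A SymPy sketch of rank(A) + nullity(A) = 5. The matrix below is a
# hypothetical 5x5 example with exactly three linearly independent
# columns; it is not taken from the problem statement.
import sympy as sp

A = sp.Matrix([[1, 0, 0, 1, 0],
               [0, 1, 0, 0, 1],
               [0, 0, 1, 1, 1],
               [0, 0, 0, 0, 0],
               [0, 0, 0, 0, 0]])

rank = A.rank()
nullity = len(A.nullspace())          # dimension of the null space
print(rank, nullity, rank + nullity)  # 3 2 5
```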

Example Question #21 : Linear Independence And Rank

If \(\displaystyle A= \begin{bmatrix} 1& 0&1 \\ 1& 1& 1\\ 1& 0 &1 \end{bmatrix}\), what is \(\displaystyle rank(A)\)?

Possible Answers:

\(\displaystyle 0\)

None of the other answers

\(\displaystyle 2\)

\(\displaystyle 1\)

\(\displaystyle 3\)

Correct answer:

\(\displaystyle 2\)

Explanation:

\(\displaystyle rank(A)\) is equal to the number of linearly independent columns of \(\displaystyle A\). The first and third columns are the same, so one of these columns is redundant in the column space of \(\displaystyle A\). The second column evidently cannot be a multiple of the first, since the second has two \(\displaystyle 0\)'s, and the first has none. Hence \(\displaystyle rank(A)=2\).
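
To verify, a minimal SymPy sketch (not part of the original solution) shows the pivot columns and the rank:

```python
# A SymPy sketch (verification only): the pivot columns of A reveal
# which columns are linearly independent, and their count is rank(A).
import sympy as sp

A = sp.Matrix([[1, 0, 1],
               [1, 1, 1],
               [1, 0, 1]])

rref_form, pivot_columns = A.rref()
print(pivot_columns)   # (0, 1) -> two pivot columns, so rank(A) = 2
print(A.rank())        # 2
```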

Example Question #286 : Linear Algebra

Consider the polynomials 

\(\displaystyle p_{1}(x) = x^{3}+ x^{2}-x+ 1\)

\(\displaystyle p_{2}(x) = x^{3}+2 x^{2}+ 2x+ 1\)

\(\displaystyle p_{3}(x) = x^{2}+ x+ 1\)

\(\displaystyle p_{4}(x) = x^{2}+ x\)

True or false: these four polynomials form a basis for \(\displaystyle P_{3}\), the set of all polynomials with degree less than or equal to 3.

Possible Answers:

True

False

Correct answer:

True

Explanation:

A test to determine whether these polynomials form a basis is to set up a matrix with each row comprising the coefficients of one polynomial, then perform row reductions until the matrix is in row-echelon form. \(\displaystyle P_{3}\) is a vector space of dimension 4, so these four polynomials form a basis if and only if the resulting \(\displaystyle 4 \times 4\) matrix has rank 4. The initial matrix is:

\(\displaystyle \begin{bmatrix} 1 & 1 & -1 & 1 \\ 1 & 2 & 2 & 1 \\ 0 & 1 & 1 & 1 \\ 0 & 1 & 1 & 0 \end{bmatrix}\)

Perform the following row operations:

\(\displaystyle -R1+ R2 \rightarrow R2\)

\(\displaystyle \begin{bmatrix} 1 & 1 & -1 & 1 \\ 0 & 1 & 3 & 0 \\ 0 & 1 & 1 & 1 \\ 0 & 1 & 1 & 0 \end{bmatrix}\)

\(\displaystyle -R2+ R3 \rightarrow R3\)

\(\displaystyle -R2+ R4 \rightarrow R4\)

\(\displaystyle \begin{bmatrix} 1 & 1 & -1 & 1 \\ 0 & 1 & 3 & 0 \\ 0 & 0 & -2 & 1 \\ 0 & 0 & -2 & 0 \end{bmatrix}\)

\(\displaystyle -\frac{1}{2} R3 \rightarrow R3\)

\(\displaystyle \begin{bmatrix} 1 & 1 & -1 & 1 \\ 0 & 1 & 3 & 0 \\ 0 & 0 & 1 & -\frac{1}{2}\\ 0 & 0 & -2 & 0 \end{bmatrix}\)

\(\displaystyle 2R3 + R 4 \rightarrow R4\)

\(\displaystyle \begin{bmatrix} 1 & 1 & -1 & 1 \\ 0 & 1 & 3 & 0 \\ 0 & 0 & 1 & -\frac{1}{2}\\ 0 & 0 & 0 & -1\end{bmatrix}\)

\(\displaystyle -R4 \rightarrow R4\)

\(\displaystyle \begin{bmatrix} 1 & 1 & -1 & 1 \\ 0 & 1 & 3 & 0 \\ 0 & 0 & 1 & -\frac{1}{2}\\ 0 & 0 & 0 & 1\end{bmatrix}\)

The matrix is now in row-echelon form. Each row has a leading 1, so the matrix has rank 4, the dimension of \(\displaystyle P_{3}\). It follows that the given polynomials comprise a basis of \(\displaystyle P_{3}\).
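
The same test can be run symbolically; here is a minimal SymPy sketch (verification only) of the coefficient-matrix rank check:

```python
# A SymPy sketch of the same test (verification only): each row holds
# the coefficients of one polynomial, ordered x^3, x^2, x, 1.
import sympy as sp

coefficients = sp.Matrix([[1, 1, -1, 1],   # p1 = x^3 + x^2 - x + 1
                          [1, 2,  2, 1],   # p2 = x^3 + 2x^2 + 2x + 1
                          [0, 1,  1, 1],   # p3 = x^2 + x + 1
                          [0, 1,  1, 0]])  # p4 = x^2 + x

print(coefficients.rank())   # 4 -> the polynomials form a basis of P_3
```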

Example Question #23 : Linear Independence And Rank

Consider the polynomials:

\(\displaystyle p_{1}(x) = x^{3}+ x^{2}+ x+ 1\)

\(\displaystyle p_{2}(x) = x^{3}+2 x^{2}+ 2x+ 1\)

\(\displaystyle p_{3}(x) = x^{2}+ x+ 1\)

\(\displaystyle p_{4}(x) = x^{2}+ x\)

True or false: these four polynomials form a basis for \(\displaystyle P_{3}\), the set of all polynomials with degree less than or equal to 3.

Possible Answers:

True

False

Correct answer:

False

Explanation:

Elements of a vector space form a basis if they are linearly independent and span the space - that is, if every element of the space can be uniquely expressed as a linear combination of the basis elements.

A test to determine whether these polynomials form a basis is to set up a matrix with each row comprising the coefficients of one polynomial, then perform row reductions until the matrix is in row-echelon form. \(\displaystyle P_{3}\) is a vector space of dimension 4, so these four polynomials form a basis if and only if the resulting \(\displaystyle 4 \times 4\) matrix has rank 4. The initial matrix is:

\(\displaystyle \begin{bmatrix} 1 & 1 & 1 & 1 \\ 1 & 2 & 2 & 1 \\ 0 & 1 & 1 & 1 \\ 0 & 1 & 1 & 0 \end{bmatrix}\)

Perform the following row operations:

\(\displaystyle -R1+ R2 \rightarrow R2\)

\(\displaystyle \begin{bmatrix} 1 & 1 & 1 & 1 \\ 0 &1& 1 & 0 \\ 0 & 1 & 1 & 1 \\ 0 & 1 & 1 & 0 \end{bmatrix}\)

\(\displaystyle -R2+ R3 \rightarrow R3\)

\(\displaystyle -R2+ R4 \rightarrow R4\)

\(\displaystyle \begin{bmatrix} 1 & 1 & 1 & 1 \\ 0 &1& 1 & 0 \\ 0 & 0 & 0& 1 \\ 0 & 0 & 0 & 0 \end{bmatrix}\)

The matrix is now in row-echelon form. There is a row of zeroes at the bottom, so the rank of the matrix is 3. Therefore, the four polynomials are not linearly independent, and they do not form a basis for \(\displaystyle P_{3}\).
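
A short SymPy sketch (verification only) confirms the rank and also exposes the explicit dependence among the polynomials:

```python
# A SymPy sketch (verification only): the coefficient matrix has rank 3,
# and the left null space gives an explicit dependence relation.
import sympy as sp

coefficients = sp.Matrix([[1, 1, 1, 1],   # p1 = x^3 + x^2 + x + 1
                          [1, 2, 2, 1],   # p2 = x^3 + 2x^2 + 2x + 1
                          [0, 1, 1, 1],   # p3 = x^2 + x + 1
                          [0, 1, 1, 0]])  # p4 = x^2 + x

print(coefficients.rank())          # 3 -> not a basis of P_3
print(coefficients.T.nullspace())   # basis vector (1, -1, 0, 1): p1 - p2 + p4 = 0,
                                    # i.e. p2 = p1 + p4
```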
