FALSE

Matrix multiplication is “row by column”.

If A and B are 2 x 2 matrices with columns a1, a2, and b1, b2, respectively, then AB = [a1b1 a2b2].

FALSE

Swap A and B, then it's true.

Each column of AB is a linear combination of the columns of B using weights from the corresponding column of A.
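A quick NumPy sketch of the corrected claim (columns of A, weights from the corresponding column of B); the matrices are made up for illustration:

```python
import numpy as np

# Column j of AB is a linear combination of the columns of A,
# with weights taken from column j of B.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

AB = A @ B
col0 = B[0, 0] * A[:, 0] + B[1, 0] * A[:, 1]  # weights from column 1 of B
print(np.allclose(AB[:, 0], col0))  # True
```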

TRUE

AB + AC = A(B + C)

TRUE

A^T + B^T = (A + B)^T

FALSE

The transpose of a product of matrices equals the product of their transposes in the reverse order.

The transpose of a product of matrices equals the product of their transposes in the same order.
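A numerical sketch of the reverse-order rule, with arbitrary random matrices:

```python
import numpy as np

# (AB)^T equals B^T A^T (reverse order). With non-square factors the
# same-order product is not even defined: A.T is 4 x 3 and B.T is 2 x 4,
# so A.T @ B.T does not conform, while B.T @ A.T does.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

print(np.allclose((A @ B).T, B.T @ A.T))  # True
```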

FALSE

This is right, but there should not be +'s in the solution. Remember the answer should also be 3 x 3.

If A and B are 3 x 3 and B = [b1 b2 b3], then AB = [Ab1 + Ab2 + Ab3].

TRUE

The second row of AB is the second row of A multiplied on the right by B.
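The row rule in the statement above can be checked numerically (random matrices, illustrative only):

```python
import numpy as np

# Row i of AB equals row i of A multiplied on the right by B.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

print(np.allclose((A @ B)[1], A[1] @ B))  # True: second row of AB
```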

FALSE

Matrix multiplication is not commutative.

(AB)C = (AC)B

FALSE

(AB)^T = B^T A^T

(AB)^T = A^T B^T

TRUE

The transpose of a sum of matrices equals the sum of their transposes.

FALSE.

It is invertible, but the inverse is the product of the inverses in the reverse order.

A product of invertible n x n matrices is invertible, and the inverse of the product is the product of their inverses in the same order.

TRUE

If A is invertible, then the inverse of A^-1 is A itself.

TRUE

If A = [a b; c d] and ad = bc, then A is not invertible.

TRUE

If A can be row reduced to the identity matrix, then A must be invertible.

FALSE

They also reduce the identity to A^-1

If A is invertible, then elementary row operations that reduce A to the identity also reduce A^-1 to the identity.
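The fact behind this item can be seen in a small Gauss-Jordan sketch on [A | I]: the operations that turn the left half into I turn the right half into A^-1 (made-up 2 x 2 matrix, no pivoting safeguards):

```python
import numpy as np

# Row-reduce the augmented matrix [A | I]; when the left half becomes I,
# the right half is A^-1.
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
M = np.hstack([A, np.eye(2)])
for i in range(2):
    M[i] /= M[i, i]                  # scale the pivot row
    for j in range(2):
        if j != i:
            M[j] -= M[j, i] * M[i]   # clear the rest of column i
A_inv = M[:, 2:]
print(np.allclose(A @ A_inv, np.eye(2)))  # True; A_inv is [[3, -1], [-5, 2]]
```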

TRUE

An n x n determinant is defined by determinants of (n – 1) x (n – 1) submatrices.

FALSE

The cofactor is the determinant of this Aij times (-1)^(i+j).

The (i , j)-cofactor of a matrix A is the matrix Aij obtained by deleting from A its ith row and jth column.
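A sketch of the corrected definition: the cofactor is a scalar, (-1)^(i+j) times the determinant of the submatrix A_ij, not the submatrix itself:

```python
import numpy as np

# The (i, j)-cofactor: delete row i and column j, take the determinant
# of the remaining submatrix, and multiply by (-1)**(i + j).
def cofactor(A, i, j):
    sub = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(sub)

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])
# Cofactor expansion along the first row reproduces det A:
expansion = sum(A[0, j] * cofactor(A, 0, j) for j in range(3))
print(np.isclose(expansion, np.linalg.det(A)))  # True (both are -3.0)
```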

FALSE

We can expand across any row or down any column and get the same determinant.

The cofactor expansion of det A down a column is the negative of the cofactor expansion along a row

FALSE

It is the product of the diagonal entries.

The determinant of a triangular matrix is the sum of the entries of the main diagonal.

TRUE

A row replacement operation does not affect the determinant of a matrix.

FALSE

If we scale any rows when getting the echelon form, we change the determinant

The determinant of A is the product of the pivots in any echelon form U of A, multiplied by (-1)^r, where r is the number of row interchanges made during row reduction from A to U
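This formula can be checked on a small made-up matrix by tracking row interchanges during elimination (assumes A is invertible so a nonzero pivot is always available below the current row):

```python
import numpy as np

# Reduce A to an echelon form U using only row replacements and row
# interchanges (no scaling); then det A = (-1)**r * (product of pivots).
A = np.array([[0.0, 2.0, 1.0],
              [1.0, 1.0, 4.0],
              [2.0, 3.0, 6.0]])
U, r, n = A.copy(), 0, 3
for k in range(n):
    if U[k, k] == 0:                        # interchange rows to get a pivot
        swap = k + int(np.argmax(U[k:, k] != 0))
        U[[k, swap]] = U[[swap, k]]
        r += 1
    for i in range(k + 1, n):
        U[i] -= (U[i, k] / U[k, k]) * U[k]  # row replacement: det unchanged
det = (-1) ** r * np.prod(np.diag(U))
print(np.isclose(det, np.linalg.det(A)))  # True (both are 5.0)
```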

TRUE

If the columns of A are linearly dependent, then det A = 0.

FALSE

This is true for products, however.

det(A + B) = det A + det B

TRUE

If two row interchanges are made in succession, then the new determinant equals the old determinant

FALSE

unless A is triangular

The determinant of A is the product of the diagonal entries in A

FALSE

The converse is true, however.

If det A is zero, then two rows or two columns are the same, or a row or a column is zero.

FALSE

det(A^T) = detA when A is n x n.

det(A^T) = (-1)detA
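A quick numerical check of the corrected identity det(A^T) = det A, using an arbitrary random matrix:

```python
import numpy as np

# det(A^T) = det A, not (-1) * det A.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
print(np.isclose(np.linalg.det(A), np.linalg.det(A.T)))  # True
```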

FALSE

we need f(t) = 0 for all t

If f is a function in the vector space V of all real-valued functions on R and if f(t) = 0 for some t, then f is the zero vector in V.

FALSE

This is an example of a vector, but there are certainly vectors not of this form.

A vector is an arrow in three-dimensional space.

FALSE

We also need the set to be closed under addition and scalar multiplication.

A subset H of a vector space V is a subspace of V if the zero vector is in H

TRUE

A subspace is also a vector space.

FALSE

digital signals are used

Analogue signals are used in the major control systems for the space shuttle, mentioned in the introduction to the chapter

TRUE

A vector is any element of a vector space

TRUE

If u is a vector in a vector space V, then (-1)u is the same as the negative of u.

TRUE

A vector space is also a subspace.

FALSE

The elements in R^2 aren’t even in R^3

R^2 is a subspace of R^3

FALSE

The second and third parts aren’t stated correctly

A subset H of a vector space V is a subspace of V if the following conditions are satisfied: (i) the zero vector of V is in H, (ii) u, v, and u + v are in H, and (iii) c is a scalar and cu is in H

TRUE

The null space of A is the solution set of the equation Ax = 0.
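As a sketch, sympy's nullspace() returns a basis for exactly this solution set (the matrix here is made up):

```python
import sympy as sp

# Nul A = all solutions of Ax = 0; sympy returns a basis for it,
# and A times each basis vector is the zero vector.
A = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])
basis = A.nullspace()

print(len(basis))  # 2: two free variables, so Nul A is 2-dimensional
print(all(A * v == sp.zeros(2, 1) for v in basis))  # True
```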

FALSE

It’s R^n

The null space of an m x n matrix is in R^m

TRUE

The column space of A is the range of the mapping x -> Ax.

FALSE

must be consistent for all b

If the equation Ax = b is consistent, then Col A is R^m

TRUE

The kernel of a linear transformation is a vector space

TRUE

Col A is the set of all vectors that can be written as Ax for some x.

TRUE

The null space is a vector space

TRUE

The column space of an m x n matrix is in R^m

FALSE

It is the set of all b that have solutions

Col A is the set of all solutions of Ax = b

TRUE

Nul A is the kernel of the mapping x -> Ax

TRUE

The range of a linear transformation is a vector space.

TRUE

The set of all solutions of a homogeneous linear differential equation is the kernel of a linear transformation.

FALSE

unless it is the zero vector

A single vector by itself is linearly dependent

FALSE

They may not be linearly independent

If H = Span {b1,…,bn} then {b1,…,bn} is a basis for H

TRUE

The columns of an invertible n x n matrix form a basis for R^n

FALSE

If it is too large, then it is no longer linearly independent.

A basis is a spanning set that is as large as possible.

FALSE

they are not affected

In some cases, the linear dependence relations among the columns of a matrix can be affected by certain elementary row operations on the matrix

FALSE

it may not span

A linearly independent set in a subspace H is a basis for H

TRUE

If a finite set S of nonzero vectors spans a vector space V, then some subset of S is a basis for V

TRUE

A basis is a linearly independent set that is as large as possible.

FALSE

it never fails!

The standard method for producing a spanning set for Nul A, described in this section, sometimes fails to produce a basis

FALSE

Must look at corresponding columns in A

If B is an echelon form of a matrix A, then the pivot columns of B form a basis for Col A
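A small sympy sketch (made-up matrix): rref identifies which columns are pivot columns, but the basis vectors for Col A must be taken from A itself, not from its echelon form:

```python
import sympy as sp

# rref locates the pivot columns; the basis for Col A is the
# CORRESPONDING columns of A, not the columns of the echelon form.
A = sp.Matrix([[1, 2, 3],
               [1, 2, 4],
               [1, 2, 5]])
B, pivots = A.rref()

print(pivots)                      # (0, 2)
basis = [A[:, j] for j in pivots]  # columns 1 and 3 of A span Col A
# By contrast, column 1 of B is (1, 0, 0)^T, which is not in Col A here.
```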

…

4.4 15 a

…

4.4 15 b

…

4.4 15 c

…

4.4 16 a

…

4.4 16 b

…

4.4 16 c

TRUE

The number of pivot columns of a matrix equals the dimension of its column space

FALSE

unless the plane is through the origin

A plane in R^3 is a two dimensional subspace of R^3

FALSE

It’s 5

The dimension of the vector space P4 is 4

FALSE

S must have exactly n elements

If dim V = n and S is a linearly independent set in V, then S is a basis for V

TRUE

If a set {v1…vn} spans a finite dimensional vector space V and if T is a set of more than n vectors in V, then T is linearly dependent

FALSE

Not a subset, as before

R^2 is a two dimensional subspace of R^3

FALSE

It’s the number of free variables

The number of variables in the equation Ax = 0 equals the dimension of Nul A

FALSE

it must be impossible to span it by a finite set

A vector space is infinite dimensional if it is spanned by an infinite set

FALSE

S must have exactly n elements, or be known to be linearly independent

If dim V = n and if S spans V, then S is a basis for V

TRUE

The only three dimensional subspace of R^3 is R^3 itself

TRUE

The row space of A is the same as the column space of A^T

FALSE

The nonzero rows of B form a basis. The first three rows of A may be linearly dependent.

If B is an echelon form of A, and if B has three nonzero rows, then the first three rows of A form a basis of Row A

TRUE

The dimensions of the row space and the column space of A are the same, even if A is not square

FALSE

It equals the number of columns, by the Rank Theorem.

The sum of the dimensions of the row space and the null space of A equals the number of rows in A
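A sketch of the corrected count (Rank Theorem: rank A + dim Nul A = number of columns), with a made-up 3 x 4 matrix:

```python
import sympy as sp

# Rank Theorem: rank A + dim Nul A equals the number of COLUMNS of A,
# not the number of rows (A here is 3 x 4 with a repeated row, rank 2).
A = sp.Matrix([[1, 2, 3, 4],
               [2, 4, 6, 8],
               [1, 0, 1, 0]])

print(A.rank() + len(A.nullspace()))  # 4, the number of columns
print(A.rows)                         # 3 rows, so the stated version fails
```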

TRUE

On a computer, row operations can change the apparent rank of a matrix.

FALSE

It’s the corresponding columns in A

If B is any echelon form of A, then the pivot columns of B form a basis for the column space of A

FALSE

for example, row interchanges mess things up

Row operations preserve the linear dependence relations among the rows of A

TRUE

The dimension of the null space of A is the number of columns of A that are not pivot columns

TRUE

The row space of A^T is the same as the column space of A

TRUE

If A and B are row equivalent, then their row spaces are the same