FALSE

it’s the C-coordinate vectors of the vectors in the basis B

The columns of the change-of-coordinates matrix P_{C<-B} are B-coordinate vectors of the vectors in C.

TRUE

If V = R^n and C is the standard basis for V, then P_{C<-B} is the same as the change-of-coordinates matrix P_B introduced in Section 4.4.

TRUE

The columns of P_{C<-B} are linearly independent.

FALSE

it satisfies [x]_C = P[x]_B

If V = R^2, B = {b1,b2}, and C = {c1,c2}, then row reduction of [c1 c2 b1 b2] to [I P] produces a matrix P that satisfies [x]_B = P[x]_C for all x in V.
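The row-reduction recipe above can be sketched in a few lines; the bases here are made-up examples, and exact fractions are used so the arithmetic stays clean (no pivoting, which is fine for this example):

```python
from fractions import Fraction as F

# Hypothetical bases for R^2, chosen for illustration.
c1, c2 = [F(1), F(0)], [F(1), F(1)]   # basis C
b1, b2 = [F(2), F(1)], [F(0), F(3)]   # basis B

# Augmented matrix [c1 c2 b1 b2].
M = [[c1[0], c2[0], b1[0], b2[0]],
     [c1[1], c2[1], b1[1], b2[1]]]

# Gauss-Jordan elimination to reach [I P].
for i in range(2):
    M[i] = [entry / M[i][i] for entry in M[i]]              # scale pivot row
    for r in range(2):
        if r != i:
            f = M[r][i]
            M[r] = [a - f * b for a, b in zip(M[r], M[i])]  # eliminate column i

P = [row[2:] for row in M]   # the change-of-coordinates matrix P_{C<-B}
# Columns of P are the C-coordinates of b1 and b2, so P satisfies
# [x]_C = P [x]_B  --  not [x]_B = P [x]_C as the statement claims.
```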

FALSE

the vector has to be nonzero

If Ax = λx for some vector x, then λ is an eigenvalue of A.

TRUE

A matrix A is not invertible if and only if 0 is an eigenvalue of A.

TRUE

A number c is an eigenvalue of A if and only if the equation (A -cI)x = 0 has a nontrivial solution.

TRUE

Finding an eigenvector of A may be difficult, but checking whether a given vector is in fact an eigenvector is easy.
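The "checking is easy" part is one matrix-vector product; a minimal sketch with a made-up 2x2 matrix:

```python
def mat_vec(A, x):
    """Multiply matrix A (list of rows) by vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[4, -2],
     [1,  1]]
v = [2, 1]            # candidate eigenvector
Av = mat_vec(A, v)    # [6, 3] = 3*v, so v is an eigenvector with eigenvalue 3
```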

FALSE

Row reduction changes the eigenvalues and eigenvectors

To find the eigenvalues of A, reduce A to echelon form.

FALSE

the vector must be nonzero

If Ax = λx for some scalar λ, then x is an eigenvector of A.

FALSE

the converse is true

If v1 and v2 are linearly independent eigenvectors, then they correspond to distinct eigenvalues.

TRUE

A steady-state vector for a stochastic matrix is actually an eigenvector.
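A quick check of that fact, with a made-up column-stochastic matrix: a steady-state vector q satisfies Pq = q, i.e. Pq = 1·q, so q is an eigenvector for the eigenvalue 1.

```python
# Example column-stochastic matrix: each column sums to 1.
P = [[0.7, 0.4],
     [0.3, 0.6]]
q = [4/7, 3/7]   # steady-state vector: nonnegative entries summing to 1
Pq = [sum(a * b for a, b in zip(row, q)) for row in P]
# Pq equals q (up to floating-point rounding): an eigenvector, eigenvalue 1
```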

FALSE

triangular matrices

The eigenvalues of a matrix are on its main diagonal.

TRUE

An eigenspace of A is a null space of a certain matrix.
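Concretely, the eigenspace of A for an eigenvalue λ is Nul(A − λI); a sketch with an example matrix:

```python
def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[4, -2],
     [1,  1]]
lam = 3
# The "certain matrix" is A - lam*I; its null space is the eigenspace for lam.
B = [[A[i][j] - (lam if i == j else 0) for j in range(2)] for i in range(2)]
x = [2, 1]   # spans Nul(A - 3I) in this example
# (A - 3I)x = 0 and Ax = 3x both hold
```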

FALSE

A is triangular

The determinant of A is the product of the diagonal entries in A.

FALSE

interchanging rows or multiplying a row by a constant changes the determinant.

An elementary row operation on A does not change the determinant.

TRUE

(det A)(det B) = det AB
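The multiplicative property can be spot-checked directly; the 2x2 matrices below are arbitrary examples:

```python
def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]   # det A = -2
B = [[0, 1], [5, 6]]   # det B = -5
# det2(A) * det2(B) == det2(mat_mul(A, B)) == 10
```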

FALSE

-5 is an eigenvalue

If λ + 5 is a factor of the characteristic polynomial of A, then 5 is an eigenvalue of A.

FALSE

absolute value of determinant

If A is 3 x 3, with columns a1, a2, a3, then det A equals the volume of the parallelepiped determined by a1, a2, a3.
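The determinant can be negative while a volume cannot, which is why the absolute value is needed; a sketch with an example 3x3 matrix:

```python
def det3(M):
    """Cofactor expansion of a 3x3 determinant along the first row."""
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

A = [[0, 1, 0],
     [1, 0, 0],
     [0, 0, 2]]        # columns a1, a2, a3 (example values)
d = det3(A)            # -2: the determinant itself is negative
volume = abs(d)        # 2: the parallelepiped's volume is |det A|, not det A
```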

FALSE

det A^T = det A

det A^T = (-1)detA

TRUE

The multiplicity of a root r of the characteristic equation of A is called the algebraic multiplicity of r as an eigenvalue of A.

FALSE

row ops may change eigenvalues

A row replacement operation on the square matrix A does not change the eigenvalues.

FALSE

D must be a diagonal matrix

A is diagonalizable if A = PDP^-1 for some matrix D and some invertible matrix P.

TRUE

If R^n has a basis of eigenvectors of A, then A is diagonalizable.

FALSE

always has n eigenvalues, counting multiplicity

A is diagonalizable if and only if A has n eigenvalues, counting multiplicities.

FALSE

A can be diagonalizable and not invertible

If A is diagonalizable, then A is invertible.

FALSE

the eigenvectors have to be linearly independent

A is diagonalizable if A has n eigenvectors.

FALSE

converse is true

If A is diagonalizable, then A has n distinct eigenvalues.

TRUE

If AP = PD, with D diagonal, then the nonzero columns of P must be eigenvectors of A.
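A direct check of AP = PD with example matrices (the columns of P are eigenvectors of A for the diagonal entries of D):

```python
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[4, -2],
     [1,  1]]
P = [[2, 1],
     [1, 1]]           # columns: eigenvectors for eigenvalues 3 and 2
D = [[3, 0],
     [0, 2]]
AP = mat_mul(A, P)
PD = mat_mul(P, D)
# AP == PD: column j of AP is A*(col j of P), column j of PD is d_jj*(col j of P)
```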

FALSE

these are not directly related

If A is invertible, then A is diagonalizable.

TRUE

v · v = ||v||^2

TRUE

For any scalar c, u . (cv) = c(u . v)

TRUE

If the distance from u to v equals the distance from u to -v, then u and v are orthogonal.

FALSE

[1 1; 0 0]

For a square matrix A, vectors in Col A are orthogonal to vectors in Nul A.

TRUE

If vectors v1,…,vp span a subspace W and if x is orthogonal to each vj for j = 1,…,p, then x is in W^perp

TRUE

u . v – v . u = 0

FALSE

need absolute value of c

For any scalar c, ||cv|| = c||v||

TRUE

If x is orthogonal to every vector in a subspace W, then x is in W^perp.

TRUE

If ||u||^2 + ||v||^2 = ||u + v||^2, then u and v are orthogonal.
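The Pythagorean identity is easy to verify numerically; u and v below are an example orthogonal pair:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

u = [1, 2, 2]
v = [2, 1, -2]                  # dot(u, v) == 0: orthogonal to u
w = [a + b for a, b in zip(u, v)]
lhs = dot(u, u) + dot(v, v)     # ||u||^2 + ||v||^2 = 9 + 9 = 18
rhs = dot(w, w)                 # ||u + v||^2       = 18
```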

TRUE

***(end of hw T/F)

For an m x n matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A.

TRUE
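The row space/null space orthogonality can be seen with any example: if Ax = 0, then each row of A dots to zero with x.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = [[1, 2, -1],
     [0, 1,  1]]         # example 2 x 3 matrix
x = [3, -1, 1]           # Ax = 0, so x is in Nul A
residuals = [dot(row, x) for row in A]   # [0, 0]: x is orthogonal to every row
```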

***(BB start)

If B and C are bases for a vector space V, then the columns of the change of coordinates matrix from B to C are linearly independent.

FALSE

vector x does not equal 0 vector

If A is an n x n matrix and Ax = λx for some scalar λ, then x is an eigenvector of A.

FALSE

triangular matrix

The eigenvalues of a square matrix are the scalars on its main diagonal.

FALSE

e.g. [1 0; 0 0]; a diagonalizable matrix can have 0 as an eigenvalue

If the square matrix A is diagonalizable, then A is invertible

FALSE

needs n linearly independent eigenvectors

An n x n matrix A is diagonalizable if and only if A has n eigenvalues, counting multiplicities.

TRUE

If A and P are square matrices and AP = PD, with D diagonal, then the nonzero columns of P must be eigenvectors of A.

TRUE

If vectors v1,…, vp span a subspace W of R^n and if x is orthogonal to each vj for j = 1,…, p, then x is in the orthogonal complement of W.

FALSE

||cv|| = |c| ||v||

For any scalar c and vector v in R^n, ||cv|| = c||v||.

TRUE

Not every orthogonal set in R^n is linearly independent.

FALSE

given by proj_W y

***(BB end)

The best approximation to a vector y in R^n by elements of a subspace W of R^n is given by the vector y − proj_W y.

FALSE

unique

The orthogonal projection ŷ of a vector y in R^n onto a subspace W of R^n can sometimes depend on the orthogonal basis for W used to compute ŷ.
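The uniqueness can be demonstrated by projecting with two different orthogonal bases for the same subspace (the plane and bases below are made-up examples):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj(y, basis):
    """Orthogonal projection of y onto span(basis); basis must be orthogonal."""
    out = [0.0] * len(y)
    for u in basis:
        c = dot(y, u) / dot(u, u)          # coefficient (y . u)/(u . u)
        out = [o + c * ui for o, ui in zip(out, u)]
    return out

y = [1, 2, 3]
# Two different orthogonal bases for the same plane W (the xy-plane here):
proj1 = proj(y, [[1, 0, 0], [0, 1, 0]])
proj2 = proj(y, [[1, 1, 0], [1, -1, 0]])
# proj1 == proj2: the projection depends only on W, not on the basis used
```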

TRUE

The dimensions of the row space and the column space of A are the same, even if A is not square.

FALSE

C-coords of the vectors in B

If B and C are bases for a vector space V, then the columns of the change-of-coordinates matrix from basis B to C are the B-coordinate vectors of the vectors in C.

FALSE

[x]_C = P[x]_B

If B = {b1,b2} and C = {c1,c2} are two bases for R^2, then row reduction of [c1 c2 b1 b2] to [I P] produces a matrix P that satisfies [x]_B = P[x]_C for all x in R^2.

TRUE

A number c is an eigenvalue of a square matrix A if and only if the equation (A – cI)x = 0 has a nontrivial solution.

FALSE

one eigenvalue may have 2 or more LI eigenvectors

If v1 and v2 are LI eigenvectors for some matrix A, then they correspond to distinct eigenvalues

TRUE

If R^n has a basis of eigenvectors of an n x n matrix A, then A is diagonalizable.

TRUE

If A, P, D are square matrices with D diagonal that satisfy AP = PD, then the nonzero columns of P must be eigenvectors of A.