[v]*[v] = ||[v]||^2.

T

For any scalar c, [u]*(c[v]) = c([u]*[v]).

T

If the distance from [u] to [v] equals the distance from [u] to -[v], then [u] and [v] are orthogonal.

T
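As a quick numeric check of this equivalence (the vectors are chosen here only for illustration):

```python
import numpy as np

# Orthogonal pair: u.v = 0, so dist(u, v) should equal dist(u, -v).
u = np.array([3.0, 4.0])
v = np.array([-4.0, 3.0])

d_plus = np.linalg.norm(u - v)    # distance from u to v
d_minus = np.linalg.norm(u + v)   # distance from u to -v is ||u - (-v)||

# Expanding ||u - v||^2 - ||u + v||^2 gives -4(u.v), so the two
# distances agree exactly when u.v = 0.
assert np.isclose(u @ v, 0.0)
assert np.isclose(d_plus, d_minus)
```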

For a square matrix A, vectors in Col A are orthogonal to vectors in Nul A.

F

If vectors [v]1, … , [v]p span a subspace W and if [x] is orthogonal to each [v]j for j = 1, … , p, then [x] is in W perp.

T

[u]*[v] – [v]*[u] = 0.

T

For any scalar c, ||c[v]|| = c||[v]||.

F

If [x] is orthogonal to every vector in a subspace W, then [x] is in W perp.

T

If ||[u]||^2 + ||[v]||^2 = ||[u]+[v]||^2, then [u] and [v] are orthogonal.

T

For an mxn matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A.

T
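A small sketch of this fact, with a hypothetical matrix chosen so that a null-space vector is easy to see:

```python
import numpy as np

# A 2x3 matrix; x below solves Ax = 0 (check: 1 - 2 + 1 = 0, -1 + 1 = 0).
A = np.array([[1.0, 2.0, 1.0],
              [0.0, 1.0, 1.0]])
x = np.array([1.0, -1.0, 1.0])
assert np.allclose(A @ x, 0.0)

# Each row of A dots to zero with x, so x is orthogonal to Row A.
for row in A:
    assert np.isclose(row @ x, 0.0)
```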

Not every linearly independent set in Rn is an orthogonal set.

T

If [y] is a linear combination of nonzero vectors from an orthogonal set, then the weights in the linear combination can be computed without row operations on a matrix.

T
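This is the point of the weight formula c_j = ([y]*[v]j)/([v]j*[v]j): each weight is a single dot-product quotient. A minimal sketch with an illustrative orthogonal set:

```python
import numpy as np

# An orthogonal (not orthonormal) set in R^3.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])

y = 2 * v1 - 3 * v2 + 0.5 * v3   # a known linear combination

# Recover each weight from one dot-product quotient -- no row reduction.
weights = [(y @ v) / (v @ v) for v in (v1, v2, v3)]
assert np.allclose(weights, [2.0, -3.0, 0.5])
```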

If the vectors in an orthogonal set of nonzero vectors are normalized, then some of the new vectors may not be orthogonal.

F

A matrix with orthonormal columns is an orthogonal matrix.

F

If L is a line through [0] and if y-hat is the orthogonal projection of [y] onto L, then ||y-hat|| gives the distance from [y] to L.

F

Not every orthogonal set in Rn is linearly independent.

T

If a set S = {[u]1, … , [u]p} has the property that [u]i*[u]j = 0 whenever i does not equal j, then S is an orthonormal set.

F

If the columns of an mxn matrix A are orthonormal, then the linear mapping [x] |-> A[x] preserves length.

T
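A quick check of length preservation, using an assumed 3x2 matrix whose columns are orthonormal but which is not square:

```python
import numpy as np

# Two orthonormal columns in R^3 (unit length, mutually orthogonal).
U = np.column_stack([np.array([1.0, 1.0, 1.0]) / np.sqrt(3),
                     np.array([1.0, -1.0, 0.0]) / np.sqrt(2)])

x = np.array([3.0, -4.0])
assert np.allclose(U.T @ U, np.eye(2))                        # orthonormal columns
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))   # ||Ux|| = ||x||
```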

The orthogonal projection of [y] onto [v] is the same as the orthogonal projection of [y] onto c[v] whenever c does not equal 0.

T

An orthogonal matrix is invertible.

T

If [z] is orthogonal to [u]1 and to [u]2 and if W = Span {[u]1, [u]2}, then [z] must be in W perp.

T

For each [y] and each subspace W, the vector [y] – the projection of [y] onto W is orthogonal to W.

T

The orthogonal projection y-hat of [y] onto a subspace W can sometimes depend on the orthogonal basis for W used to compute y-hat.

F

If [y] is in a subspace W, then the orthogonal projection of [y] onto W is [y] itself.

T

If the columns of an nxp matrix U are orthonormal, then U(U^t)[y] is the orthogonal projection of [y] onto the column space of U.

T
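This can be sketched numerically; the matrix U below is an illustrative choice whose column space is the xy-plane:

```python
import numpy as np

# U has orthonormal columns spanning a plane W in R^3.
U = np.column_stack([np.array([1.0, 0.0, 0.0]),
                     np.array([0.0, 1.0, 0.0])])
y = np.array([2.0, -5.0, 7.0])

y_hat = U @ (U.T @ y)   # claimed orthogonal projection of y onto Col U
assert np.allclose(y_hat, [2.0, -5.0, 0.0])
# The residual y - y_hat is orthogonal to every column of U.
assert np.allclose(U.T @ (y - y_hat), 0.0)
```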

If W is a subspace of Rn and if [v] is in both W and W perp, then [v] must be the zero vector.

T

In the Orthogonal Decomposition Theorem, each term in formula (2) for y-hat is itself an orthogonal projection of [y] onto a subspace of W.

T

If [y] = [z]1 + [z]2, where [z]1 is in a subspace W and [z]2 is in W perp, then [z]1 must be the orthogonal projection of [y] onto W.

T

The best approximation to [y] by elements of a subspace W is given by the vector [y] – the projection of [y] onto W.

F

If an nxp matrix U has orthonormal columns, then U(U^t)[x] = [x] for all [x] in Rn.

F

If {[v]1, [v]2, [v]3} is an orthogonal basis for W, then multiplying [v]3 by a scalar c gives a new orthogonal basis {[v]1, [v]2, c[v]3}.

F

The Gram-Schmidt process produces from a linearly independent set {[x]1, … , [x]p} an orthogonal set {[v]1, … , [v]p} with the property that for each k, the vectors [v]1, … , [v]k span the same subspace as that spanned by [x]1, … , [x]k.

T
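The Gram-Schmidt process described above can be sketched as follows; this is a bare-bones classical version for illustration, not tuned for numerical stability:

```python
import numpy as np

def gram_schmidt(X):
    """Orthogonalize the columns of X (assumed linearly independent)."""
    V = []
    for x in X.T:
        # Subtract from x its projection onto each earlier v.
        v = x.copy()
        for u in V:
            v = v - (x @ u) / (u @ u) * u
        V.append(v)
    return np.column_stack(V)

X = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
V = gram_schmidt(X)

# The off-diagonal entries of V^T V vanish: the new set is orthogonal.
G = V.T @ V
assert np.allclose(G - np.diag(np.diag(G)), 0.0)
# v1 = x1, consistent with the span-preserving property for k = 1.
assert np.allclose(V[:, 0], X[:, 0])
```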

If A = QR, where Q has orthonormal columns, then R = (Q^t)A.

T
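Since (Q^t)Q = I, left-multiplying A = QR by Q^t isolates R. A quick check with an illustrative A:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = np.linalg.qr(A)   # reduced QR: Q is 3x2, R is 2x2 upper triangular

assert np.allclose(Q.T @ Q, np.eye(2))   # Q has orthonormal columns
assert np.allclose(Q.T @ A, R)           # so Q^T (QR) = R
```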

If W = Span {[x]1, [x]2, [x]3} with {[x]1, [x]2, [x]3} linearly independent, and if {[v]1, [v]2, [v]3} is an orthogonal set in W, then {[v]1, [v]2, [v]3} is a basis for W.

F

If [x] is not in a subspace W, then [x] – the projection of [x] onto W is not zero.

T

In a QR factorization, say A = QR (when A has linearly independent columns), the columns of Q form an orthonormal basis for the column space of A.

T

The general least-squares problem is to find an [x] that makes A[x] as close as possible to [b].

T

A least-squares solution of A[x] = [b] is a vector x-hat that satisfies A(x-hat) = b-hat, where b-hat is the orthogonal projection of [b] onto Col A.

T

A least-squares solution of A[x] = [b] is a vector x-hat such that ||[b] – A[x]|| is less than or equal to ||[b] – A(x-hat)|| for all [x] in Rn.

F

Any solution of (A^t)A[x] = (A^t)[b] is a least-squares solution of A[x] = [b].

T

If the columns of A are linearly independent, then the equation A[x] = [b] has exactly one least-squares solution.

T
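With linearly independent columns, (A^t)A is invertible and the normal equations pin down the unique least-squares solution. A sketch with assumed data:

```python
import numpy as np

# Overdetermined system: 4 equations, 2 unknowns, independent columns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([0.0, 1.0, 1.0, 3.0])

# Normal equations A^T A x = A^T b have a unique solution here.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# It agrees with the library least-squares routine...
x_lstsq = np.linalg.lstsq(A, b, rcond=None)[0]
assert np.allclose(x_hat, x_lstsq)
# ...and the residual b - A x_hat is orthogonal to Col A.
assert np.allclose(A.T @ (b - A @ x_hat), 0.0)
```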

If [b] is in the column space of A, then every solution of A[x] = [b] is a least-squares solution.

T

The least-squares solution of A[x] = [b] is the point in the column space of A closest to [b].

F

A least-squares solution of A[x] = [b] is a list of weights that, when applied to the columns of A, produces the orthogonal projection of [b] onto Col A.

T

If x-hat is a least-squares solution of A[x] = [b], then x-hat = ((A^t)A)^-1(A^t)[b].

F

The normal equations always provide a reliable method for computing least-squares solutions.

F

If A has a QR factorization, say A = QR, then the best way to find the least-squares solution of A[x] = [b] is to compute x-hat = (R^-1)(Q^t)[b].

F
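The statement is false because forming R^-1 explicitly is wasteful and less accurate; the usual route is to solve the triangular system R(x-hat) = (Q^t)[b] by back-substitution. A sketch with illustrative data:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

Q, R = np.linalg.qr(A)   # reduced QR factorization
c = Q.T @ b

# Back-substitution on the upper-triangular system R x = c;
# no explicit inverse of R is ever formed.
n = R.shape[0]
x_hat = np.zeros(n)
for i in range(n - 1, -1, -1):
    x_hat[i] = (c[i] - R[i, i + 1:] @ x_hat[i + 1:]) / R[i, i]

assert np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0])
```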

The length of every vector is a positive number.

F

A vector [v] and its negative, -[v], have equal lengths.

T

The distance between [u] and [v] is ||[u]-[v]||.

T

If r is any scalar, then ||r[v]|| = r||[v]||.

F

If two vectors are orthogonal, they are linearly independent.

F

If [x] is orthogonal to both [u] and [v], then [x] must be orthogonal to [u]-[v].

T

If ||[u]+[v]||^2 = ||[u]||^2 + ||[v]||^2, then [u] and [v] are orthogonal.

T

If ||[u]-[v]||^2 = ||[u]||^2 + ||[v]||^2, then [u] and [v] are orthogonal.

T

The orthogonal projection of [y] onto [u] is a scalar multiple of [y].

F

If a vector [y] coincides with its orthogonal projection onto a subspace W, then [y] is in W.

T

The set of all vectors in Rn orthogonal to one fixed vector is a subspace of Rn.

T

If W is a subspace of Rn, then W and W perp have no vectors in common.

F

If {[v]1, [v]2, [v]3} is an orthogonal set, and if c1, c2, c3 are scalars, then {c1[v]1, c2[v]2, c3[v]3} is an orthogonal set.

T

If a matrix U has orthonormal columns, then U(U^t) = I.

F

A square matrix with orthogonal columns is an orthogonal matrix.

F

If a square matrix has orthonormal columns, then it also has orthonormal rows.

T

If W is a subspace, then ||the projection of [v] onto W||^2 + ||[v] – the projection of [v] onto W||^2 = ||[v]||^2.

T
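A numeric check of this Pythagorean identity, with an illustrative subspace W = Col U:

```python
import numpy as np

# W = Col(U), where U has orthonormal columns, so proj_W v = U U^T v.
U = np.column_stack([np.array([1.0, 1.0, 0.0]) / np.sqrt(2),
                     np.array([0.0, 0.0, 1.0])])
v = np.array([1.0, 2.0, 3.0])

p = U @ (U.T @ v)   # orthogonal projection of v onto W
lhs = np.linalg.norm(p) ** 2 + np.linalg.norm(v - p) ** 2
assert np.isclose(lhs, np.linalg.norm(v) ** 2)
```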

A least-squares solution of A[x] = [b] is the vector A(x-hat) in Col A closest to [b], so that ||[b] – A(x-hat)|| ≤ ||[b] – A[x]|| for all [x].

F

The normal equations for a least-squares solution of A[x] = [b] are given by x-hat = ((A^t)A)^-1(A^t)[b].

F