Linear Algebra Quizlet

A directed line segment that corresponds to a displacement from one point A to another point B
Vector
Initial point of vector
Tail
2 vectors that are scalar multiples of each other
Parallel vectors
A vector v is a linear combination of vectors v1, v2,…,vk if there are scalars c1,c2,…ck such that v=c1v1+c2v2+…+ckvk. The scalars c1,c2,…,ck are called the coefficients of the linear combination
Linear combination
Aka dot product
Scalar product
Length: ||v|| = √(v.v)
Norm of a vector
vector of length 1
Unit vector
finding a unit vector in same direction
normalizing
d(u,v) = ||u-v|| (can switch u and v) – for u=(a,b) and v=(a1,b1): ||u-v|| = √((a-a1)^2 + (b-b1)^2)
Formula to find distance between vectors u and v
cos(theta) = (u.v)/(||u|| ||v||)
Formula for angle between 2 vectors
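A quick numerical check of the norm, distance, and angle formulas above (a minimal Python/numpy sketch; the sample vectors are arbitrary):

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([1.0, 2.0])

norm_u = np.sqrt(u.dot(u))                  # ||u|| = sqrt(u.u) = 5
dist = np.linalg.norm(u - v)                # d(u,v) = ||u - v||
cos_theta = u.dot(v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(cos_theta)                # angle between u and v, in radians

print(norm_u, dist, np.degrees(theta))
```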
Two vectors u and v are orthogonal if u.v=0
Orthogonal vectors
((u.v)/(u.u)) * u
Formula for projection of v onto u
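A minimal sketch of the projection formula (sample vectors are arbitrary); note that the result is a scalar multiple of u:

```python
import numpy as np

u = np.array([2.0, 0.0])
v = np.array([1.0, 3.0])

# proj_u(v) = ((u.v)/(u.u)) * u
proj = (u.dot(v) / u.dot(u)) * u
print(proj)   # [1. 0.], a scalar multiple of u
```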
Vector n that is orthogonal to any vector x that is parallel to the line
Normal vector to line
n.x = n.p (p is a specific point)
Normal form of line
ax+by=c
General form of equation of line
x=p+td (p is a specific point on line, d is direction vector)
vector equation of a line
|ax0+by0-c|/√(a^2+b^2 )
Formula for distance from point to line
|ax0+by0+cz0-d|/√(a^2+b^2+c^2 )
Formula for distance from point to plane
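Both distance formulas, sketched as small helper functions (the function names and sample values are mine, for illustration):

```python
import numpy as np

def dist_point_to_line(x0, y0, a, b, c):
    # distance from point (x0, y0) to the line ax + by = c
    return abs(a * x0 + b * y0 - c) / np.sqrt(a**2 + b**2)

def dist_point_to_plane(x0, y0, z0, a, b, c, d):
    # distance from point (x0, y0, z0) to the plane ax + by + cz = d
    return abs(a * x0 + b * y0 + c * z0 - d) / np.sqrt(a**2 + b**2 + c**2)

print(dist_point_to_line(1, 1, 3, 4, 10))        # |3 + 4 - 10| / 5 = 0.6
print(dist_point_to_plane(1, 1, 1, 1, 2, 2, 8))  # |1 + 2 + 2 - 8| / 3 = 1.0
```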
a×b = (a2b3-a3b2, a3b1-a1b3, a1b2-a2b1)
Cross product formula
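The component formula, checked against numpy's built-in cross product (sample vectors arbitrary):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# a x b = (a2*b3 - a3*b2, a3*b1 - a1*b3, a1*b2 - a2*b1)
cross = np.array([a[1]*b[2] - a[2]*b[1],
                  a[2]*b[0] - a[0]*b[2],
                  a[0]*b[1] - a[1]*b[0]])

print(cross)                               # [-3.  6. -3.]
print(np.allclose(cross, np.cross(a, b)))  # True: matches the built-in
```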
Addition is commutative and associative; scalar multiplication is associative; the distributive laws hold (a scalar distributes over a vector sum, and a vector distributes over a scalar sum)
Algebraic properties of vectors in Rn
u.v=v.u
u.(v+w)=u.v + u.w
(cu).v =c(u.v)
u.u≥0 and u.u=0 iff u=0
Properties of vector dot products
Only when v=0
When does llvll=0?
|c| ||v||
||cv|| =
|u.v| ≤ ||u|| ||v||
Cauchy-Schwarz inequality
Opposite
If the angle between u and v is obtuse, then proju(v) will be in the same/opposite direction from u?
negative scalar multiple
If the angle between u and v is obtuse, proju(v) will be a ___ of u
direction vector
Parallel planes have the same __
u
Proju(v) is a scalar multiple of vector __
u and v are orthogonal
||u+v|| = ||u-v|| iff
proju(v)
proju(proju(v)) =
No; there are infinitely many (any point on the line and any parallel direction vector will do)
Are vector and parametric forms of equation of line unique?
2-D
A plane is an x-D object
ax=b (mod m) has solutions iff gcd(a,m) divides b; if it does, there are exactly gcd(a,m) solutions; if not, there are none
How to determine how many solutions ax=b (mod m) has
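A small Python sketch of this rule (the function name is mine):

```python
from math import gcd

def num_solutions(a, b, m):
    # ax = b (mod m) has gcd(a, m) solutions mod m if gcd(a, m) divides b,
    # and no solutions otherwise
    g = gcd(a, m)
    return g if b % g == 0 else 0

print(num_solutions(6, 4, 8))  # gcd(6,8)=2 divides 4 -> 2 solutions (x=2, x=6)
print(num_solutions(6, 3, 8))  # 2 does not divide 3 -> 0 solutions
```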
It is a scalar, so it can be added to and multiplied with other scalars
What to remember about dot product (u.v)
sin(theta) = |n.d|/(||n|| ||d||), where n is the plane's normal vector and d is the line's direction vector
Formula to find angle between plane and line
An equation that can be written in the form a1x1+a2x2+…+anxn = b, where a1,a2,…,an,b are constants
Linear equation
In each nonzero row, the first nonzero entry (the leading entry) is in a column to the left of any leading entries below it. All entries in a column below a leading entry (not necessarily 1) are 0.
Row echelon form
Can do to matrices without changing solutions: 1. interchange 2 rows 2. multiply a row by a nonzero constant 3. add a multiple of one row to another row
Elementary row operations
Using elementary row operations to reduce a matrix to row echelon form
Gaussian elimination
Two matrices are row equivalent if elementary row operations can transform one into the other (this does not mean they are equal matrices)
Row equivalent
If same rref
Easy way to determine if 2 matrices are row equivalent
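A sketch of that check using sympy (example matrices are arbitrary; rref() returns the reduced matrix plus its pivot columns):

```python
from sympy import Matrix

A = Matrix([[1, 2], [3, 4]])
B = Matrix([[1, 0], [1, 1]])

# row equivalent iff they have the same rref
print(A.rref()[0] == B.rref()[0])  # True: both reduce to the 2x2 identity
```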
The rank of a matrix is the number of non-zero rows in its row echelon form OR the # of leading variables OR (# variables)-(# free variables)
Rank – 3 definitions
Same
Rank of matrix vs. rank of its transpose
max: 3, min:0
What are max and min ranks of 5×3 matrix?
<
The row vectors of an mxn matrix A are linearly dependent iff rank(A) __ m
In system of n variables, number of free variables = n-rank
Rank Theorem
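A quick illustration of the rank/free-variable count with sympy (example matrix is arbitrary):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])

rank = A.rank()
n = A.cols                  # number of variables in Ax = 0
print(rank, n - rank)       # rank 2, so 1 free variable
print(len(A.nullspace()))   # 1: nullity agrees with n - rank
```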
All constant terms are 0
Homogeneous system
line
If 1 direction vector, then equation of a:
plane
If 2 direction vectors, then equation of a:
Put the vectors into an augmented matrix and check that it is consistent (no row with all zeros on the left and a nonzero entry on the right). Make sure the solution works in all equations; there could be no solution
How to tell if something is a linear combination?
If there’s at leas tone solution
Consistent
If S={v1,v2,…,vk} is a set of vectors in Rn, then the set of all linear combinations of v1,v2,…,vk is called the span of v1,v2,…,vk and is written span(v1,v2,…,vk) or span(S)
Span
Means you can reach any point in R2 using linear combinations of u and v. Set xu + yv = [a,b], create the augmented matrix, and rref to show the system is consistent for every [a,b]
To show that span(s)=R^2
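The same span test, sketched with sympy: put u and v in a matrix as columns and check the rank (sample vectors arbitrary):

```python
from sympy import Matrix

# columns are u = [1, 2] and v = [3, 4]
A = Matrix([[1, 3],
            [2, 4]])

# span{u, v} = R^2 iff rank is 2, i.e. x*u + y*v = [a, b] is
# consistent for every [a, b]
print(A.rank() == 2)  # True
```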
At least 3 vectors
How many vectors do you need for a set that spans R3
A set of vectors v1,v2,…,vk is linearly dependent if there are scalars c1,c2,…,ck at least one of which is not zero, such that c1v1+c2v2+…+ckvk=0
Linear dependence
If row reduction produces a zero row, the rows are linearly dependent
Relationship between zero rows and linear dependence
create a zero row
The rows of a matrix will be linearly dependent if elementary row operations can be used to:
Independent
If the only solution to c1v1+c2v2+c3v3=0 is c1=c2=c3=0, are the vectors linearly dependent/independent?
Independent
Is the empty set linearly dependent/independent?
m>n
Any set of m vectors in Rn is linearly dependent if:
1. Find scalars such that c1v1+…+ckvk=0 (one constant has to be non-zero)
2. Show that at least one of the vectors can be expressed as a linear combo of the others
3. Let v1,…,vn be column vectors in Rn and let A be the matrix with these vectors as columns. Show that the augmented matrix [A|0] has a nontrivial solution
3 ways to prove linear dependence
1. Show that c1v1+…+ckvk=0 has only trivial solution
2. Show [A|0] has only the trivial solution
2 ways to prove linear independence
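A sketch of this check with sympy: independence holds iff Ax=0 forces x=0, i.e. A has full column rank (sample vectors arbitrary):

```python
from sympy import Matrix

# the vectors being tested are the columns of A
A = Matrix([[1, 2, 3],
            [0, 1, 4],
            [0, 0, 1]])

# only the trivial solution iff rank equals the number of columns
print(A.rank() == A.cols)  # True: the columns are independent
print(A.nullspace())       # []: no nontrivial solutions to Ax = 0
```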
A square matrix with 0s everywhere off the main diagonal (the diagonal entries may also be 0)
Diagonal matrix
special case of diagonal matrix, all entries in diagonal are same number
scalar matrix
Matrices that are exactly the same, including size and entries
equal matrices
when A=A^T
symmetric matrix
the transpose of an mxn matrix A is the nxm matrix A^T obtained by interchanging the rows and columns of A
transpose
A^(r+s)
(A^r)(A^s)=
A^(rs)
(A^r)^s=
All entries below main diagonal are 0
Upper triangular matrix
commutative, associative, distributivity over matrix and scalar addition
Properties of Addition and scalar multiplication
Ax=0
Homogeneous linear system
associative, distributes over addition, can commute the scalar – NOT commutative
Properties of matrix multiplication
k(A^T)
(kA)^T=
A^T + B^T
(A+B)^T=
(B^T)(A^T)
(AB)^T=
(A^T)^r
(A^r)^T =
If A is an nxn matrix, an inverse of A is an nxn matrix A’ with the property that AA’=I and A’A=I where I=In is the nxn identity matrix. If such an A’ exists, then A is called invertible
Inverse of a matrix
unique
If A is an invertible matrix, then its inverse is:
x=A^-1 * b for any b in Rn
If A is an invertible nxn matrix, then the system of linear equations given by Ax=b has the unique solution:
(1/(ad-bc))[d -b / -c a]
Formula for inverse of 2×2 matrix
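The 2×2 inverse formula as a small helper (the function name and sample matrix are mine):

```python
import numpy as np

def inv2x2(a, b, c, d):
    # inverse of [a b / c d] is (1/(ad - bc)) [d -b / -c a]
    det = a * d - b * c
    if det == 0:
        raise ValueError("ad - bc = 0: matrix is not invertible")
    return (1 / det) * np.array([[d, -b], [-c, a]])

A = np.array([[4.0, 7.0], [2.0, 6.0]])
print(A @ inv2x2(4, 7, 2, 6))  # identity matrix, up to rounding
```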
(1/c)A^-1
(cA)^-1=
B^-1 * A^-1
(AB)^-1=
(A^-1)^T
(A^T)^-1=
(A^-1)^n
(A^n)^-1 =
no because not commutative
Can you distribute for (A^-1B^3)^2 ?
Matrices that can be made by performing one elementary row operation on the identity matrix
Elementary matrices
invertible
Each elementary matrix is:
A^-1
If a sequence of elementary row operations reduces A to I, then the same sequence of row operations transforms I into:
A subspace of Rn is any collection S of vectors in Rn such that: 1. the zero vector is in S. 2. if u and v are in S, then u+v is in S. 3. if u is in S and c is a scalar, then cu is in S.
Subspace
Show that one of the 3 conditions fails
How to prove that something isn’t a subspace?
a point (origin), a line through origin, R2 (the whole thing)
What are possible subspaces in R2?
The rowspace of A is the subspace row(A) of Rn spanned by the rows of A
Row space
They do not change the row space, but they can change the column space
Do elementary row operations change the row space? The column space?
The column space of A is the subspace col(A) of Rm spanned by columns of A
column space
They must be equal
Relationship between the # of vectors in bases for the row space and column space
Place the vector in an augmented matrix with A, i.e. [A | v], and check consistency; if consistent, the vector is a linear combination of the columns, so it is in col(A)
To determine if a vector is in col(A)
Augment A^T with the vector and see if the system is consistent (taking the transpose turns rows into columns, making this the more familiar column-space problem)
To determine if vector is in row(A)
Let A be an mxn matrix. The nullspace of A is the subspace of Rn consisting of solutions of the homogeneous linear system Ax=0. It is denoted null(A)
Nullspace
Yes; every matrix has a nullspace, even if it contains only the trivial solution (the zero vector)
Does every matrix have a nullspace? Explain
A. There is no solution
B. There is a unique solution
C. There are infinitely many solutions
For any system of linear equations Ax=b, what are the 3 options for results?
A basis for a subspace S of Rn is a set of vectors in S that: 1. spans S and 2. is linearly independent
Basis
always be the same
The number of vectors in a basis for a given subspace will
redundant vectors (those that are scalar multiples or other linear combinations of the rest)
Basis gets rid of:
Row reduce A to its rref R. Use the nonzero row vectors of R (those with leading 1s) to form a basis for row(A)
How to find row space basis
Row reduce A to rref. Use the column vectors of the original A corresponding to the leading 1s (the pivot columns) to form a basis for col(A)
How to find column space basis
Row reduce the augmented matrix [A | 0]. Write x=, y=, z=, etc. equations, all in terms of the free variables (there must be as many equations as variables; don't forget trivial ones like z=z), then read off one basis vector per free variable
How to find null space basis
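sympy automates this procedure; each returned basis vector corresponds to one free variable (example matrix arbitrary):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])

# rank 1 and 3 columns, so 2 free variables -> 2 basis vectors
for v in A.nullspace():
    print(v.T, (A * v).T)  # A*v is the zero vector for each basis vector
```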
free variable
Null space gets 1 vector for every:
row space vectors
Null space vectors are orthogonal to:
#columns in original matrix
(# null space vectors)+(#row space vectors OR #column space vectors)=
vectors
If S is a subspace of Rn, any 2 bases for S have the same # of:
If S is a subspace of Rn, then the number of vectors in a basis for S is called the dimension of S, denoted dimS
Dimension
{0}, the set containing only the zero vector
What is always a subspace of Rn?
It is linearly dependent, so it cannot be a basis
What 2 things are true of any set containing zero vector?
dimension
Row and column spaces of matrix have the same __
rank(A)
rank(A^T)=
The nullity of a matrix A is the dimension of its null space, i.e. the dimension of the solution space of Ax=0, which is the same as the # of free variables in the solution
Nullity
If A is an mxn matrix, then rank(A)+nullity(A)=n
The Rank Theorem
A transformation T:Rn–>Rm is called a linear transformation if: 1. T(u+v)=T(u)+T(v) for all u and v in Rn. 2. T(cv)=cT(v) for all v in Rn and c in R
Linear transformation
linear transformation
The map sending a vector x to Ax (multiplication by a fixed matrix A) is a:
let u=[x1,y1] and v=[x2,y2] and show that 1. T(u+v) and T(u)+T(v) are equal, and 2. T(cv) and cT(v) are equal
how to show something is a linear transformation:
find one counterexample
how to prove that something’s not a linear transformation
reflection, rotation, dilation
Give 3 examples of linear transformations
translation because origin doesn’t get mapped to itself
give one example of not linear transformation, and why
Apply T to the columns of the identity matrix (the standard basis vectors), treating each as a point; their images become the columns of the matrix
How to find matrix for linear transformation
[cos(theta) -sin(theta) / sin(theta) cos(theta)]
linear transformation matrix for rotation
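The rotation matrix in action (a minimal sketch; the angle is arbitrary):

```python
import numpy as np

def rotation(theta):
    # counterclockwise rotation of R^2 by angle theta
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

e1 = np.array([1.0, 0.0])
print(rotation(np.pi / 2) @ e1)  # ~[0, 1]: e1 rotated 90 degrees
```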
Let A be an nxn matrix. A scalar λ is called an eigenvalue of A if there is a nonzero vector x such that Ax=λx
Eigenvalue
Let A be an nxn matrix. A nonzero vector x is called an eigenvector of A corresponding to the eigenvalue λ if Ax=λx
Eigenvector
eigenvector and has same eigenvalue
Any scalar multiple of an eigenvector is also an ___ and has the same ___
2
Max # of linearly independent eigenvectors (equivalently, distinct eigenvalues) for a 2×2 matrix
Let A be an nxn matrix and let λ be an eigenvalue of A. The collection of eigenvectors corresponding to λ, together with the zero vector, is called the eigenspace of λ and is denoted by Eλ
Eigenspace
Multiply v by A and check that Av is a scalar multiple of v; the scalar is the eigenvalue
How to show a vector is an eigenvector of A
(A-λI)x=0, then rref and create variable equations to find the eigenvector
How to show that λ is an eigenvalue of A and find one eigenvector corr to eigenvalue
det(A-λI) to find the characteristic polynomial, then set it equal to 0 to find the eigenvalues. Solve (A-λI)x=0 for each eigenvalue to find the eigenvectors/eigenspaces
Find all eigenvalues and eigenvectors of A
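numpy carries out exactly this computation numerically (example matrix arbitrary; the order of the returned eigenvalues may vary):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# roots of det(A - lambda*I) = 0, with corresponding eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # 3 and 1 for this matrix

for lam, x in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ x, lam * x))  # True: Ax = lambda*x holds
```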
ad-bc
Determinant of 2×2 matrix
scalar
Determinant is a __
the product of the entries on its main diagonal
The determinant of a triangular matrix is:
0
If A has a zero row or column, then detA=
-detA
If B is obtained by interchanging 2 rows or columns of A, then detB=
0
If A has 2 identical rows or columns, then detA=
kdetA
If B is obtained by multiplying a row or column of A by a constant k, then detB=
detA
If B is obtained by adding a multiple k of one row or column of A to another, then detB=
(k^n)det(A)
det(kA)=
det(AB)
If A and B are nxn matrices, then det(A)*det(B)=
1/(det(A))
If A is invertible, then det(A^-1)=
det(A)
For any square matrix A, detA^T=
Compute det(A) as a polynomial in k; A is invertible exactly for the values of k where det(A) ≠ 0
For what values of k (entries within the matrix) is A invertible? – how to solve
det(A-λI)
Characteristic polynomial of A
det(A-λI)=0
Characteristic equation of A
multiplicity of a root of characteristic polynomial
Algebraic multiplicity
number of vectors in a basis for an eigenspace
geometric multiplicity
entries on its main diagonal
Eigenvalues for a triangular matrix are:
0 isn’t an eigenvalue of A
A square matrix is invertible iff
A. A is invertible
B. Ax=b has a unique solution for every b in Rn
C. Ax=0 has only the trivial solution
D. The rref of A is In
E. A is a product of elementary matrices
F. rank(A)=n
G. nullity(A)=0
H. The column vectors of A are linearly independent
I. The column vectors of A span Rn
J. The column vectors of A form a basis for Rn
K. The row vectors of A are linearly independent
L. The row vectors of A span Rn
M. The row vectors of A form a basis for Rn
N. detA≠0
O. 0 is not an eigenvalue of A
Fundamental Theorem of Invertible Matrices
linearly independent
If A is an nxn matrix and λ1,λ2,…,λm are distinct eigenvalues of A with corresponding eigenvectors v1,v2,..,vm then v1,.v2,…,vm are:
Let A and B be nxn matrices. A is similar to B if there is an invertible matrix P such that (P^-1)AP=B
Similar matrices
AP=PB, don’t have to find P^-1 for similarity problems
What’s an equivalent equation to (P^-1)AP=B that can be helpful
A
If A~B then B~
A~C
If A~B and B~C then:
detB
If A~B, then detA=
B is invertible
If A~B, then A is invertible iff
rank, char polynomial, eigenvalues
If A~B, then A and B have the same: (3 answers)
B^m for all integers m≥0
If A~B, then A^m ~
B^m for all integers m (including negative m)
If A~B, then if A is invertible, A^m ~
Showing two matrices aren't similar: they can't be similar if any of these shared properties fail
What are properties of similar matrices most helpful for?
1. detA ≠ detB. 2. A and B don't have the same characteristic polynomial
What are 2 ways to immediately tell matrices aren’t similar
An nxn matrix is diagonalizable if there is a diagonal matrix D such that A is similar to D – that is, if there’s an invertible nxn matrix P such that (P^-1)AP=D
Diagonalizable
matrix where columns are eigenvectors
What is P in (P^-1)AP=D diagonalization equation?
matrix where diagonal entries are eigenvalues
What is D in (P^-1)AP=D diagonalization equation?
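A sketch of the diagonalization check: build P from eigenvectors and D from eigenvalues, then verify (P^-1)AP=D (the example matrix is mine, chosen with distinct eigenvalues):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # eigenvalues 5 and 2, so diagonalizable

eigenvalues, P = np.linalg.eig(A)  # columns of P are eigenvectors
D = np.diag(eigenvalues)           # diagonal entries are eigenvalues

print(np.allclose(np.linalg.inv(P) @ A @ P, D))  # True: P^-1 A P = D
```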
n
If A is an nxn matrix, A is diagonalizable iff A has __ linearly independent eigenvectors.
no
If A is a 3×3 matrix with only 2 linearly independent eigenvectors, is A diagonalizable?
diagonalizable
If A is an nxn matrix with n distinct eigenvalues, then A is:
geometric mult less than or equal to algebraic mult
If A is an nxn matrix, what is relationship between geometric and algebraic multiplicity of each eigenvalue?
The algebraic multiplicity of each eigenvalue equals its geometric multiplicity
What is true of algebraic and geometric multiplicities of diagonalizable matrices?
columns
Not diagonalizable when the total number of linearly independent eigenvectors (summed over the eigenspaces) is less than the number of:
[a^n 0/0 b^n]
How to raise diagonal matrix to a power [a 0/0 b]^n=
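A quick check that powers of a diagonal matrix are computed entrywise (sample entries arbitrary):

```python
import numpy as np

D = np.diag([2.0, 3.0])
n = 4

# [a 0 / 0 b]^n = [a^n 0 / 0 b^n]
print(np.linalg.matrix_power(D, n))  # [[16, 0], [0, 81]]
print(np.diag(np.diag(D) ** n))      # same result, entrywise
```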
λ^3=λ
λ^3-λ = λ(λ-1)(λ+1) = 0, so λ = 0, 1, or -1
What are possible eigenvalues of A if A^3=A?
A set of vectors {v1,v2,…,vk} in Rn is called an orthogonal set if all distinct pairs of vectors are orthogonal
Orthogonal set
linearly independent
If {v1,v2,…,vk} is an orthogonal set of nonzero vectors in Rn, then these vectors are:
An orthogonal basis for a subspace W of Rn is a basis of W that is an orthogonal set
Orthogonal basis
Must show that every pair of vectors from the set is orthogonal, i.e. that v1.v2=0, v2.v3=0, and v1.v3=0
How to show that a set of vectors is an orthogonal set given v1,v2,v3
(w.vi)/(vi.vi) for i=1,…,k
Let {v1,v2,…,vk} be an orthogonal basis for a subspace W of Rn and let w be any vectors in W. Then the unique scalars c1,…,ck such that w=c1v1+…+ckvk are given by ci=
A set of vectors in Rn is an orthonormal set if it is an orthogonal set of unit vectors
Orthonormal set
An orthonormal basis for a subspace W of Rn is a basis of W that is an orthonormal set
Orthonormal basis
An nxn matrix Q whose columns form an orthonormal set is called an orthogonal matrix
Orthogonal matrix
Q^T
A square matrix Q is orthogonal iff Q^-1=
an orthonormal set
If Q is an orthogonal matrix, then its rows form:
is orthogonal
If Q is orthogonal, then Q^-1
+/- 1
If Q is orthogonal, then detQ=
1
If Q is orthogonal and λ is an eigenvalue of Q, then |λ| =
Q1Q2
If Q1 and Q2 are orthogonal nxn matrices, then so is:
If v is orthogonal to every vector in W
When can you say a vector v is orthogonal to a subspace W?
The set of all vectors that are orthogonal to W is called the orthogonal complement of W, denoted W(perp)
Orthogonal complement
a subspace of Rn
If W is a subspace of Rn, then W(perp) is:
W
(Wperp)perp =
nullspace of A
Orthogonal complement of the row space of A is:
the nullspace of A^T
Orthogonal complement of column space of A is:
rowspace, column space, nullspace, nullspace of A^T
what are the 4 fundamental subspaces of A?
Let W be a subspace of Rn and let {u1,u2,…,uk} be an orthogonal basis for W. For any vector v in Rn, the orthogonal projection of v onto W is defined as projW(v) = ((u1.v)/(u1.u1))u1 + ((u2.v)/(u2.u2))u2 + … + ((uk.v)/(uk.uk))uk
Orthogonal projection of v onto w
perpw(v) = v-projw(v)
Component of v orthogonal to w is the vector:
v-projw(v)
perpw(v) =
projw(v) + perpw(v), where projw(v) is in W and perpw(v) is in W(perp)
Orthogonal Decomposition Theorem: v=
v1=x1
v2=x2-projv1(x2)
v3=x3-projv1(x3) – projv2(x3)
Gram-Schmidt Algorithm
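A compact implementation of the algorithm above (a sketch; the sample input vectors are arbitrary). Each new vector is the current input minus its projections onto the vectors already produced:

```python
import numpy as np

def gram_schmidt(xs):
    # v1 = x1; vk = xk - proj_v1(xk) - ... - proj_v(k-1)(xk)
    vs = []
    for x in xs:
        v = x - sum(((u.dot(x) / u.dot(u)) * u for u in vs), np.zeros_like(x))
        vs.append(v)
    return vs

xs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]

v1, v2, v3 = gram_schmidt(xs)
print(v1.dot(v2), v1.dot(v3), v2.dot(v3))  # all ~0: pairwise orthogonal
```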
Extend the given vector to any basis of R3 (3 linearly independent vectors), then make the basis orthogonal using Gram-Schmidt
Method for finding orthogonal basis for R3 that contains a given vector
A square matrix A is orthogonally diagonalizable if there exists an orthogonal matrix Q and a diagonal matrix D such that (Q^T)AQ=D
Orthogonally diagonalizable
A (real) matrix is orthogonally diagonalizable iff it is symmetric
Relationship between symmetric and orthogonally diagonalizable matrix:
(Q^-1)AQ=D
What’s an equivalent statement for (Q^T)AQ=D?
V is a vector space (and its elements are called vectors) if:
1. closed for addition
2. commutativity works for addition
3. associative for addition
4. u+0=u
5. u+(-u)=0
6. closed for scalar multiplication
7. associative for scalar multiplication
8. distributive over vector addition (c(u+v)=cu+cv)
9. distributive over scalar addition ((c+d)u=cu+du)
10. identity (1u=u)
Vector space (what 10 things must be true)
Axioms 6-10 (the ones involving scalar multiplication)
When looking for axiom that makes vector space fail and addition is same as usual, which ones to focus on?
1. if u and v are in W, then u+v is in W
2. if u is in W and c is a scalar, then cu is in W
W is a subspace of a vector space V iff what 2 things are true: