a system of linear equations has either one solution or infinitely many solutions

consistent

no solution

inconsistent

1. augment the matrix

2. reduce to triangular form using row operations

determine if a system of linear equations is consistent or inconsistent
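
a quick numerical check (a numpy sketch rather than row reduction by hand; the example system and its numbers are my own, not from the cards): a system is consistent exactly when rank A equals the rank of the augmented matrix [A b]

```python
import numpy as np

# hypothetical system:  x + 2y = 3,  2x + 4y = 6
# (the second equation is twice the first, so the system is
#  consistent, with infinitely many solutions)
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([3.0, 6.0])

# consistent iff rank(A) == rank([A | b])
aug = np.column_stack([A, b])
consistent = np.linalg.matrix_rank(A) == np.linalg.matrix_rank(aug)
print(consistent)  # True
```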

a location in matrix A that corresponds to a leading 1 in the reduced echelon form of A

pivot position

use the free variables as the parameters for describing a solution set

parametric description of a solution set

the set of all vectors with two entries

R2

vectors in R3 are 3×1 column matrices with three entries

R3

given vectors v1, v2, …, vp in Rn and scalars c1, c2, …, cp, the vector y is called a linear combination and is defined by y = c1v1 + … + cpvp

linear combination (y)

1. augment the matrix to [a1 a2 b]
2. if the system is consistent, b is a linear combination of a1 and a2

determine whether b is a linear combination of a1 and a2

the span of {v1,…,vp} is the set of all linear combinations of v1,…,vp

must contain the zero vector

span

1. determine whether the vector equation x1v1 + x2v2 + … + xpvp = b has a solution

2. equivalently, determine whether the augmented matrix [v1 … vp b] has a solution

determine if b is in span{v1,…,vp}
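
a sketch of the same test done numerically with numpy's least-squares solver (the vectors here are made up for illustration): b is in the span exactly when the vector equation has an exact solution

```python
import numpy as np

# is b in Span{v1, v2}?  (hypothetical vectors)
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
b  = np.array([2.0, 3.0, 5.0])

# solve x1*v1 + x2*v2 = b via least squares; b is in the span
# exactly when the residual is (numerically) zero
V = np.column_stack([v1, v2])
x, residual, rank, _ = np.linalg.lstsq(V, b, rcond=None)
in_span = np.allclose(V @ x, b)
print(in_span)  # True, since b = 2*v1 + 3*v2
```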

Ax is defined only if the number of columns of A equals the number of entries in x

matrix-vector product

Ax = b

matrix equation

x1a1 + x2a2 + … + xnan = b

vector equation

exists if and only if b is a linear combination of the columns of A

solution of Ax = b

let A be an m x n matrix

1. for each b in Rm, Ax = b has a solution

2. each b in Rm is a linear combination of the columns of A

3. the columns of A span Rm

4. A has a pivot position in every row

coefficient matrix theorem
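
the four equivalent statements can be spot checked numerically (a numpy sketch; the example matrix is mine): all of them hold exactly when A has a pivot in every row, i.e. rank A = m

```python
import numpy as np

# hypothetical 2 x 3 matrix: a pivot in every row means rank == m
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])
m = A.shape[0]

# columns span R^m  <=>  Ax = b solvable for every b  <=>  rank(A) == m
spans_Rm = np.linalg.matrix_rank(A) == m
print(spans_Rm)  # True
```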

1. can be written in the form Ax = 0, where A is an m x n matrix and 0 is the zero vector in Rm

2. always has at least one solution

3. Ax = 0 has a nontrivial solution if and only if the equation has at least one free variable

homogeneous linear system

the zero solution (for Ax = 0, x = 0)

trivial solution

a nonzero vector x that satisfies Ax = 0

nontrivial solution

1. let A be the matrix of coefficients of the system

2. row reduce the augmented matrix [A 0] to echelon form

3. determine if a free variable exists

4. to describe the solution set, continue row reduction to reduced echelon form

determine if the homogeneous system has a nontrivial solution
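
a rank-based shortcut for the same check (numpy sketch with a made-up matrix): Ax = 0 has a free variable, hence a nontrivial solution, exactly when rank A is less than the number of columns n

```python
import numpy as np

# hypothetical matrix whose second row is twice the first
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
n = A.shape[1]

# nontrivial solution of Ax = 0 exists iff rank(A) < n (free variable)
has_nontrivial = np.linalg.matrix_rank(A) < n
print(has_nontrivial)  # True
```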

x = su + tv (s,t in R)

parametric vector equation

the general solution (when there are many solutions) can be written in parametric vector form as one particular solution plus an arbitrary linear combination of vectors that satisfy the corresponding homogeneous system

solutions of nonhomogenous systems

1. perform row operations on [A b]
2. express the solution in terms of the free variables

describe all solutions of Ax = b

1. a set of vectors {v1,…,vp} in Rn is said to be linearly independent if the vector equation x1v1 + x2v2 + … + xpvp = 0 has only the trivial solution (x = 0)

2. pivot in every column

3. one-to-one

linearly independent

1. the set of vectors {v1,…,vp} is linearly dependent if there exist weights c1,…,cp (not all zero) such that c1v1 + c2v2 + … + cpvp = 0

2. a nontrivial solution exists (at least one free variable)

linearly dependent

1. A = [v1 v2 v3]
2. augment the matrix to [A 0] and row reduce

3. if there is at least one free variable, the set is NOT linearly independent

determine if the set {v1, v2, v3} is linearly independent
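
the steps above reduce to a rank test (numpy sketch; the vectors are my own, chosen so the set is dependent): the set is independent exactly when the matrix [v1 v2 v3] has a pivot in every column, i.e. rank equals the number of vectors

```python
import numpy as np

# hypothetical vectors, with v3 = v1 + v2 (so the set is dependent)
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 0.0])

A = np.column_stack([v1, v2, v3])
independent = np.linalg.matrix_rank(A) == A.shape[1]
print(independent)  # False
```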

1. completely row reduce the augmented matrix and write the new system of linear equations

2. choose any nonzero value for the free variables

3. substitute into equation 1 (pg 56)

find a linear dependence relation among v1, v2, and v3

the columns of matrix A are linearly independent if and only if the equation Ax = 0 has only the trivial solution (no free variables)

linear independence of matrix columns

1. augment the matrix to [A 0]
2. row reduce

3. if there are no free variables, the columns of A are linearly independent

determine if the columns of matrix A are linearly independent

1. check whether at least one of the vectors is a scalar times the other (only applies to sets of two vectors)

2. if so, the vectors are linearly dependent

3. if not, the vectors are linearly independent

determine if the sets of vectors are linearly independent

1. a set of two or more vectors is linearly dependent if and only if at least one of the vectors is a linear combination of the others

2. if a set contains more vectors than there are entries in each vector, then the set is linearly dependent

3. if a set contains the zero vector, then the set is linearly dependent

linear dependency of sets of two or more vectors

1. check to see if the number of vectors is greater than the number of entries in each vector

2. check to see if the set has the zero vector

3. compare the corresponding entries of the two vectors (determine if one is a multiple of the other)

determine if the set of vectors is linearly dependent

a square n x n matrix whose nondiagonal entries are zero

diagonal matrix

the sum of A + B is defined only when A and B are the same size

matrix addition

1. the number of columns of A must match the number of rows in B in order for a linear combination such as Ab1 to be defined

2. AB has the same number of rows as A and the same number of columns as B

3. AB is the matrix A times each column of B: AB = [Ab1 Ab2 … Abp]

4. it does not matter how we group the matrices when computing the product, as long as the left-to-right order of the matrices is preserved

5. AB usually does not equal BA

matrix multiplication
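
the last two facts (grouping doesn't matter, order does) can be seen on tiny examples (numpy sketch; the matrices are made up):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])
C = np.array([[2, 0],
              [0, 2]])

AB = A @ B
BA = B @ A
print(np.array_equal(AB, BA))                    # False: AB != BA here
print(np.array_equal((A @ B) @ C, A @ (B @ C)))  # True: grouping doesn't matter
```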

1. given an m x n matrix A, the transpose of A (A^T) is the n x m matrix whose columns are formed from the corresponding rows of A

2. (A^T)^T = A

3. (A+B)^T = A^T + B^T

4. (AB)^T = B^T*A^T

the transpose of a matrix
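
the three transpose rules checked on made-up matrices (numpy sketch):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])        # 2 x 3
B = np.array([[1, 0],
              [0, 1],
              [1, 1]])           # 3 x 2
C = np.array([[6, 5, 4],
              [3, 2, 1]])        # same size as A

print(np.array_equal(A.T.T, A))              # (A^T)^T = A
print(np.array_equal((A + C).T, A.T + C.T))  # (A+C)^T = A^T + C^T
print(np.array_equal((A @ B).T, B.T @ A.T))  # (AB)^T = B^T A^T
```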

1. theorem 4 (pg 103): if A = [a b; c d] and ad-bc ≠ 0, then A is invertible and A^-1 = (1/(ad-bc)) * [d -b; -c a]

2. if ad-bc = 0, then A is not invertible

3. det A = ad-bc

4. if A is an invertible square matrix, then for each b in Rn, the equation Ax = b has the unique solution x = (A^-1)*b

the inverse of a matrix

1. find the inverse of A

2. multiply A^-1 by b (the right-hand side of the equation): x = (A^-1)b

use the inverse of the matrix A to solve the system of linear equations

theorem 6 (pg 105): if A and B are invertible n x n matrices, then (A^-1)^-1 = A, (AB)^-1 = (B^-1)(A^-1), and (A^T)^-1 = (A^-1)^T

properties of an invertible matrix

1. augment the matrix to [A I]
2. row reduce to reduced echelon form

algorithm for finding A^-1
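
a minimal numpy sketch of the [A I] algorithm (the helper name and example matrix are mine; partial pivoting is added for numerical stability, which hand computation usually skips):

```python
import numpy as np

def inverse_via_row_reduction(A):
    """row reduce [A | I] to reduced echelon form; the right half is A^-1"""
    A = A.astype(float)
    n = A.shape[0]
    aug = np.hstack([A, np.eye(n)])
    for col in range(n):
        # pick the largest pivot candidate in this column (partial pivoting)
        pivot = col + np.argmax(np.abs(aug[col:, col]))
        if np.isclose(aug[pivot, col], 0.0):
            raise ValueError("matrix is not invertible")
        aug[[col, pivot]] = aug[[pivot, col]]  # interchange rows
        aug[col] /= aug[col, col]              # scale the pivot row to 1
        for r in range(n):                     # clear the rest of the column
            if r != col:
                aug[r] -= aug[r, col] * aug[col]
    return aug[:, n:]

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
print(inverse_via_row_reduction(A))  # A^-1 is [[1, -1], [-1, 2]]
```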

if…

1. A is a square matrix

2. A can be row reduced to the identity matrix I

3. the number of pivot positions equals the number of rows (and columns)

4. Ax = 0 has only the trivial solution

5. the columns of A are linearly independent

6. the equation Ax = b has exactly one solution for each b in Rn

7. the columns of A span Rn

8. A^T is an invertible matrix

then A is invertible

use the Invertible Matrix Theorem to decide if A is invertible

a subspace of Rn is any set H in Rn that has three properties:

1. the zero vector is in H

2. for each u and v in H, the sum u+v is in H

3. for each u in H and each scalar c, the vector cu is in H

subspace

if v1 and v2 are in Rn and H = span{v1, v2}, then H is a subspace of Rn

subspace and span

1. the column space of a matrix A is the set Col A of all linear combinations of the columns of A

2. if A = [a1 … an] then Col A is the same as Span{a1…an}

column space

1. augment the matrix to [A b]
2. row reduce

3. if the system is consistent, b is in Col A

determine whether b is in the column space of A

1. the null space of a matrix A is the set Nul A of all solutions of the homogeneous equation Ax = 0

2. when A has n columns, the solutions of Ax = 0 belong to Rn and the null space of A is a subset of Rn

3. the null space of an m x n matrix is a subspace of Rn

null space

a basis for a subspace H of Rn is a linearly independent set in H that spans H

basis for a subspace

1. augment the matrix to [A 0]
2. row reduce

3. write the solution of Ax = 0 in parametric vector form

4. express the solution in terms of the free variables

5. the vectors u, v, w obtained from the parametric form are a basis for Nul A

find a basis for the null space of the matrix A
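
the same computation done exactly with sympy (the matrix entries are mine): nullspace() returns one basis vector per free variable, matching the parametric vector form of Ax = 0

```python
from sympy import Matrix

# hypothetical matrix, already in reduced echelon form for readability
A = Matrix([[1, 2, 0, 4],
            [0, 0, 1, 3]])

# one basis vector for each free variable (here x2 and x4)
basis = A.nullspace()
for v in basis:
    print(v.T)
```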

1. row reduce A to echelon form to locate the pivot columns

2. the pivot columns of the original matrix A (not of the echelon form) form a basis for Col A

find a basis for the column space of the matrix A
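
a sympy sketch of the pivot-column method (matrix entries are mine): rref() reports the pivot column positions, and those columns of the original A are the basis

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 7],
            [1, 2, 4]])

_, pivot_cols = A.rref()                 # positions of the pivot columns
basis = [A.col(j) for j in pivot_cols]   # columns of the ORIGINAL matrix
print(pivot_cols)  # (0, 2)
```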

suppose B = {b1, …, bp} is a basis for a subspace H; for each x in H, the coordinates of x relative to B are the weights c1, …, cp such that x = c1b1 + … + cpbp (pg 154)

coordinate systems

dim H is the number of vectors in any basis for H

dimension of a subspace

the rank is the dimension of the column space of A

rank

dim Nul A = the number of free variables in Ax = 0

determine the dimension of Nul A

if a matrix A has n columns, then rank A + dim Nul A = n

rank theorem
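
a spot check of the rank theorem with sympy (the matrix is made up): rank A plus the number of nullspace basis vectors (one per free variable) equals the number of columns n

```python
from sympy import Matrix

A = Matrix([[1, 2, 3, 4],
            [2, 4, 6, 8],
            [0, 1, 1, 1]])
n = A.cols

rank = A.rank()
dim_nul = len(A.nullspace())  # dim Nul A = number of free variables
print(rank, dim_nul, rank + dim_nul == n)  # 2 2 True
```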

let H be a p-dimensional subspace of Rn. any linearly independent set of exactly p elements in H is automatically a basis for H. any set of p elements of H that spans H is automatically a basis for H

the basis theorem

if A is an invertible n x n matrix,

1. the columns of A form a basis of Rn

2. Col A = Rn

3. dim Col A = n

4. rank A = n

5. Nul A = {0}

6. dim Nul A = 0

the invertible matrix theorem (cont.)

det A is the product of the entries on the main diagonal of A

determinant of a triangular matrix

let A be a square matrix

1. if a multiple of one row of A is added to another row to produce a matrix B, then det B = det A

2. if two rows of A are interchanged to produce B, then det B = -det A

3. if one row of A is multiplied by k to produce B, then det B = k * det A

properties of determinants
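
the three row-operation rules checked numerically (numpy sketch; the matrix is made up):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
dA = np.linalg.det(A)

B1 = A.copy(); B1[1] += 5 * B1[0]   # replacement: row1 += 5 * row0
B2 = A[[1, 0, 2]]                   # interchange rows 0 and 1
B3 = A.copy(); B3[2] *= 7           # scaling: row2 *= 7

print(np.isclose(np.linalg.det(B1), dA))      # det unchanged
print(np.isclose(np.linalg.det(B2), -dA))     # det negated
print(np.isclose(np.linalg.det(B3), 7 * dA))  # det scaled by k
```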

a square matrix A is invertible if and only if det A does not equal 0

invertibility and determinants

if A is a square matrix, then det A^T = det A

the determinant of a transpose

if A and B are square matrices, then det AB = (det A)(det B)

multiplicative property of determinants

let A be an invertible n x n matrix. for any b in Rn, the unique solution x of Ax = b has entries given by

xi = det Ai(b) / det A, i = 1, …, n, where Ai(b) is the matrix A with column i replaced by b (pg 177)

Cramer’s rule
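
a direct sketch of the rule in numpy (helper name and example system are mine; practical only for tiny systems, since it recomputes a determinant per entry):

```python
import numpy as np

def cramer(A, b):
    """x_i = det(A_i(b)) / det(A), where A_i(b) is A with column i
    replaced by b"""
    det_A = np.linalg.det(A)
    x = np.empty(len(b))
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b              # replace column i with b
        x[i] = np.linalg.det(Ai) / det_A
    return x

# hypothetical system: 2x + y = 3, x + 3y = 5
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(cramer(A, b))  # matches np.linalg.solve(A, b)
```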

let A be an invertible square matrix

A^-1 = 1/(det A) * adj A

the inverse adjugate formula

a nonempty set V of objects (vectors) on which two operations, addition and multiplication by scalars, are defined and satisfy ten axioms: closure under both operations, commutativity and associativity of addition, a zero vector, additive inverses, the distributive laws, associativity of scalar multiplication, and 1u = u (pg 190)

vector space

a subset H of a vector space V that contains the zero vector of V and is closed under vector addition and multiplication by scalars (pg 193)

subspace

1. row reduce the augmented matrix [A 0] to reduced echelon form

2. write the general solution of Ax = 0 in terms of the free variables

3. the vectors in that expression (e.g. u, v, w) form a spanning set for the null space of A

find a spanning set for the null space of the matrix A

1. write W as a set of linear combinations

2. use the vectors in the spanning set as the columns of A

find a matrix A such that W = Col A

pick any column of A that is nonzero

find a nonzero vector in Col A

1. row reduce [A 0]
2. solve for the variables in terms of the free variables

3. assign a nonzero value to the free variables

find a nonzero vector in Nul A

the set of all u in V such that T(u) = 0

kernel