Linear Equations Linear Algebra
1.2 Matrices
The essential information of a linear system can be recorded compactly in a rectangular array called a
matrix. A matrix containing only the coefficients of a linear system is called the coefficient matrix,
while a matrix that also includes the constants from the right-hand sides of the equations is called an
augmented matrix. The size of a matrix tells how many rows and columns it has: an m × n matrix has
m rows and n columns.
There are three elementary row operations. Replacement adds to one row a multiple of another row.
Interchange swaps two rows. Scaling multiplies all entries in a row by a nonzero constant. Two
matrices are row equivalent if there is a sequence of elementary row operations that transforms one
matrix into the other. If the augmented matrices of two linear systems are row equivalent, then the two
systems have the same solution set.
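The three row operations can be sketched in NumPy; the 2 × 2 system below is a made-up example chosen so the arithmetic stays exact:

```python
import numpy as np

# Augmented matrix of the (hypothetical) system  x - 2y = -1,  3x + y = 11
M = np.array([[1.0, -2.0, -1.0],
              [3.0,  1.0, 11.0]])

# Replacement: add -3 times row 0 to row 1
M[1] = M[1] - 3 * M[0]          # row 1 becomes [0, 7, 14]
# Scaling: multiply row 1 by the nonzero constant 1/7
M[1] = M[1] / 7                 # row 1 becomes [0, 1, 2]
# Replacement: add 2 times row 1 to row 0
M[0] = M[0] + 2 * M[1]         # row 0 becomes [1, 0, 3]

print(M)  # reduced echelon form: x = 3, y = 2
```

Each step is an elementary row operation, so the final matrix is row equivalent to the original one and the systems share the solution x = 3, y = 2.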
1.4 Vectors
A matrix with only one column is called a vector. Two vectors are equal if, and only if, their
corresponding entries are equal. A vector whose entries are all zero is called the zero vector, and is denoted
by 0. If v1 , . . ., vp are in Rn , then the set of all linear combinations of v1 , . . ., vp is denoted by Span{v1 ,
. . ., vp } and is called the subset of Rn spanned by v1 , . . ., vp . So Span{v1 , . . ., vp } is the collection
of all vectors that can be written in the form c1 v1 + c2 v2 + . . . + cp vp with c1 , c2 , . . ., cp scalars.
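By Theorem 5 below, membership in a span reduces to the consistency of a linear system, which can be checked numerically. A small sketch with hypothetical vectors in R3:

```python
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
b  = np.array([3.0, 4.0, 10.0])

# b is in Span{v1, v2} iff the system with augmented matrix [v1 v2 | b]
# is consistent; lstsq finds the best coefficients, allclose checks exactness.
A = np.column_stack([v1, v2])
coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
in_span = np.allclose(A @ coeffs, b)
print(in_span, coeffs)  # here b = 3*v1 + 4*v2
```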
2 Theorems
1. Each matrix is row equivalent to one, and only one, reduced echelon matrix.
2. A linear system is consistent if, and only if, the rightmost column of the augmented matrix is not a
pivot column.
3. If a linear system is consistent and has no free variables, it has exactly one solution. If
there are free variables, the solution set contains infinitely many solutions.
4. A vector equation x1 a1 + x2 a2 + . . . + xn an = b has the same solution set as the linear system
whose augmented matrix is [a1 a2 . . . an b].
5. A vector b is in Span{v1 , . . ., vp } if, and only if, the linear system with augmented matrix
[v1 v2 . . . vp b] has a solution.
6. If A is an m × n matrix, and if b is in Rm , the matrix equation Ax = b has the same solution set
as the linear system whose augmented matrix is [a1 a2 . . . an b].
7. The following four statements are equivalent for a particular m × n coefficient matrix A. That is,
if one is true, then all are true, and if one is false, then all are false:
(a) For each b in Rm , the equation Ax = b has a solution.
(b) Each b in Rm is a linear combination of the columns of A.
(c) The columns of A span Rm .
(d) A has a pivot position in every row.
8. The homogeneous equation Ax = 0 has a nontrivial solution if, and only if, the equation has at least
one free variable.
9. If the reduced echelon form of A has d free variables, then the solution set is a d-dimensional
object (a line for d = 1, a plane for d = 2, and so on), which can be described by the parametric
vector equation x = a1 u1 + a2 u2 + . . . + ad ud .
10. If Ax = b is consistent for some given b, and if Ap = b, then the solution set of Ax = b is the set
of all vectors w = p + v where v is any solution of Ax = 0.
11. An indexed set S = {v1 , v2 , . . ., vp } is linearly dependent if, and only if, at least one of the
vectors in S is a linear combination of the others.
12. If a set contains more vectors than there are entries in each vector, then the set is linearly dependent.
That is, any set {v1 , v2 , . . ., vp } in Rn is linearly dependent if p > n.
13. If a set S = {v1 , v2 , . . ., vp } contains the zero vector 0, then the set is linearly dependent.
14. If T : Rn → Rm is a linear transformation, then there exists a unique matrix A such that T (x) = Ax
for all x in Rn . In fact, A = [ T (e1 ) T (e2 ) . . . T (en ) ].
15. If T : Rn → Rm is a linear transformation, and T (x) = Ax, then:
(a) T is one-to-one if, and only if the equation T (x) = 0 has only the trivial solution.
(b) T is one-to-one if, and only if the columns of A are linearly independent.
(c) T maps Rn onto Rm if, and only if the columns of A span Rm .
16. If A and B are square matrices of the same size and AB = I, then A and B are both invertible, with
A = B −1 and B = A−1 .
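Theorem 14 can be checked directly: build the matrix of a linear transformation from its values on the standard basis vectors. The transformation below (rotation of R2 by 90 degrees) is a hypothetical example:

```python
import numpy as np

# A hypothetical linear transformation T: R^2 -> R^2 (rotation by 90 degrees)
def T(x):
    return np.array([-x[1], x[0]])

# Theorem 14: the standard matrix is A = [T(e1) T(e2)]
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
A = np.column_stack([T(e1), T(e2)])

# A reproduces T on an arbitrary vector
x = np.array([3.0, 5.0])
print(A)          # [[0, -1], [1, 0]]
print(A @ x, T(x))  # both give [-5, 3]
```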
3 Calculation Rules
3.1 Vectors
Define the vectors u, v and w in Rn as follows:
u = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix}, \quad
v = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}, \quad
w = \begin{bmatrix} w_1 \\ w_2 \\ \vdots \\ w_n \end{bmatrix} \tag{1}
If c is a scalar, then the following rules apply:
u + v = \begin{bmatrix} u_1 + v_1 \\ u_2 + v_2 \\ \vdots \\ u_n + v_n \end{bmatrix} \tag{2}
cu = \begin{bmatrix} cu_1 \\ cu_2 \\ \vdots \\ cu_n \end{bmatrix} \tag{3}
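Rules (2) and (3) say that vector addition and scalar multiplication act entrywise, which is exactly how NumPy arrays behave. A minimal sketch with made-up values:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
c = 2.0

print(u + v)  # rule (2), entrywise sum:      [5. 7. 9.]
print(c * u)  # rule (3), entrywise scaling:  [2. 4. 6.]
```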
3.2 Matrices
The product of a matrix A with size m × n and a vector x in Rn is defined as:
Ax = \begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}
= x_1 a_1 + x_2 a_2 + \cdots + x_n a_n \tag{4}
Now the following rules apply:
A(u + v) = Au + Av (5)
A(cu) = c(Au) (6)
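Rules (4), (5), and (6) can all be verified numerically for a concrete (hypothetical) matrix and vectors:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])
u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
c = 5.0

# Rule (4): Ax is the linear combination x1*a1 + x2*a2 of the columns of A
assert np.allclose(A @ u, u[0] * A[:, 0] + u[1] * A[:, 1])
# Rule (5): A(u + v) = Au + Av
assert np.allclose(A @ (u + v), A @ u + A @ v)
# Rule (6): A(cu) = c(Au)
assert np.allclose(A @ (c * u), c * (A @ u))
print("rules (4)-(6) verified")
```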