Matrices & Linear Algebra: Vector Spaces
A vector space is a set of vectors that can be added together and multiplied by scalars, obeying the usual laws of arithmetic.
Notes:
o Vector multiplication is not, in general, defined for a vector space.
o The basic example of a vector space is the set of lists of n scalars, R^n. Vector addition and scalar multiplication are defined component-wise.
o R^2 is not exactly the same as C, because C has a rule for multiplication.
o Similarly, R^3 is not quite the same as physical space, because physical space has a rule (Pythagoras) for the distance between two points.
An inner product ⟨·, ·⟩ must:
o Be positive definite:
⟨x, x⟩ ≥ 0, with equality if and only if x = 0
o Be linear in the second argument and satisfy ⟨x, y⟩ = ⟨y, x⟩*.
A vector space equipped with an inner product is called an inner product space.
o It follows from the above that the inner product is antilinear in the first argument:
⟨ax, y⟩ = a*⟨x, y⟩
⟨x + y, z⟩ = ⟨x, z⟩ + ⟨y, z⟩
o In C^n, the standard (Euclidean) inner product is
⟨x, y⟩ = Σ_i x_i* y_i
The Cauchy-Schwarz inequality states that
⟨x, x⟩ ⟨y, y⟩ ≥ |⟨x, y⟩|²
Or, equivalently:
|⟨x, y⟩| ≤ ‖x‖ ‖y‖
Proof: consider, for an arbitrary scalar a,
‖x - ay‖² = ⟨x - ay | x - ay⟩
= ⟨x | x⟩ - a*⟨y | x⟩ - a⟨x | y⟩ + aa*⟨y | y⟩
= ⟨x | x⟩ - a*⟨x | y⟩* - a⟨x | y⟩ + aa*⟨y | y⟩
= ‖x‖² + |a|²‖y‖² - 2 Re(a⟨x | y⟩)
This is real and non-negative for every a. Choosing the phase of a so that a⟨x | y⟩ = |a| |⟨x | y⟩|, we get
‖x‖² + |a|²‖y‖² - 2|a| |⟨x | y⟩| ≥ 0
Completing the square:
(‖x‖ - |a|‖y‖)² + 2|a| ‖x‖‖y‖ - 2|a| |⟨x | y⟩| ≥ 0
And now, if we choose |a| = ‖x‖/‖y‖, we get:
‖x‖ ‖y‖ ≥ |⟨x | y⟩|
As required.
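As a quick numerical sanity check (a minimal sketch assuming NumPy; the random complex vectors are arbitrary test data), note that np.vdot conjugates its first argument, matching the convention above:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=4) + 1j * rng.normal(size=4)
    y = rng.normal(size=4) + 1j * rng.normal(size=4)
    lhs = abs(np.vdot(x, y))                     # |<x|y>|
    rhs = np.linalg.norm(x) * np.linalg.norm(y)  # ||x|| ||y||
    assert lhs <= rhs + 1e-12                    # Cauchy-Schwarz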
In a real vector space, we can then define the angle between two vectors through
⟨x, y⟩ = ‖x‖ ‖y‖ cos θ
[This is possible because, by the Cauchy-Schwarz inequality,
|⟨x, y⟩| / (‖x‖ ‖y‖) ≤ 1
so the cosine is well defined.]
In C^n, with the standard inner product, we have that
⟨x | y⟩ = Σ_i x_i* y_i
Bases
A set of vectors {x_1, x_2, …, x_m} is linearly independent if
a_1 x_1 + a_2 x_2 + ⋯ + a_m x_m = 0
only for the trivial values a_1 = a_2 = ⋯ = a_m = 0.
If, on the other hand, such an equation holds for nontrivial values of the coefficients, one of the vectors is redundant, and can be written as a linear combination of the others. If a vector is removed from a linearly independent set, it remains a linearly independent set.
Now consider an arbitrary vector x ∈ V. The vector can be expanded in an orthonormal basis {e_i} as x = Σ_i x_i e_i, with components x_i = ⟨e_i | x⟩. Suppose we now transform to a second orthonormal basis, e_i' = e_k R_ki.
Note that:
⟨e_i' | e_j'⟩ = ⟨e_k R_ki | e_l R_lj⟩ = R_ki* R_lj ⟨e_k | e_l⟩ = R_ki* R_kj = (R†R)_ij = δ_ij
so that
R†R = I
In other words, transformations between orthonormal bases are unitary.
Given n linearly independent vectors u_1, …, u_n that span an n-dimensional space, the Gram-Schmidt procedure constructs an orthogonal basis {e_r}:
e_1 = u_1
e_r = u_r - Σ_{s=1}^{r-1} (⟨e_s | u_r⟩ / ⟨e_s | e_s⟩) e_s
Inductive step: suppose that e_1, …, e_t are mutually orthogonal, and define
e_{t+1} = u_{t+1} - Σ_{s=1}^{t} (⟨e_s | u_{t+1}⟩ / ⟨e_s | e_s⟩) e_s
Then, for any v ≤ t:
⟨e_v | e_{t+1}⟩ = ⟨e_v | u_{t+1}⟩ - Σ_{s=1}^{t} (⟨e_s | u_{t+1}⟩ / ⟨e_s | e_s⟩) ⟨e_v | e_s⟩    [⟨e_v | e_s⟩ = 0 if v ≠ s]
= ⟨e_v | u_{t+1}⟩ - (⟨e_v | u_{t+1}⟩ / ⟨e_v | e_v⟩) ⟨e_v | e_v⟩
= ⟨e_v | u_{t+1}⟩ - ⟨e_v | u_{t+1}⟩
= 0
So the new vector is indeed orthogonal to all the previous ones.
Consider ⟨e_1 | e_2⟩:
⟨e_1 | e_2⟩ = ⟨u_1 | u_2⟩ - (⟨e_1 | u_2⟩ / ⟨e_1 | e_1⟩) ⟨u_1 | e_1⟩
= ⟨u_1 | u_2⟩ - (⟨u_1 | u_2⟩ / ⟨e_1 | e_1⟩) ⟨e_1 | e_1⟩    [since e_1 = u_1]
= 0
So the first two vectors are, indeed, orthogonal.
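The procedure translates directly into code. A minimal sketch (assuming NumPy; the function name and the example vectors are illustrative, and the output vectors are left unnormalized, as in the formula above):

    import numpy as np

    def gram_schmidt(U):
        # Orthogonalize the columns u_r of U: subtract from each u_r its
        # projections onto the previously constructed e_s.
        E = U.astype(complex)
        for r in range(U.shape[1]):
            for s in range(r):
                es = E[:, s]
                E[:, r] -= (np.vdot(es, U[:, r]) / np.vdot(es, es)) * es
        return E

    U = np.array([[1., 1., 0.],
                  [1., 0., 1.],
                  [0., 1., 1.]])      # columns are u_1, u_2, u_3
    E = gram_schmidt(U)
    G = E.conj().T @ E                # Gram matrix: should be diagonal
    assert np.allclose(G, np.diag(np.diag(G)))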
Matrices
ARRAY VIEWPOINT
o Matrices can be regarded, simply, as an array of numbers, A_ij.
o The rule for multiplying a matrix by a vector is then
(Ax)_i = A_ij x_j
o The rules for matrix addition and multiplication are given below.
LINEAR OPERATOR VIEWPOINT
o A matrix can also be viewed as a linear operator A on a vector space. Defining the components A_ij by Ae_j = e_i A_ij, we can find, from the action of A on the basis vectors only, its action on any x:
Ax = A(x_j e_j) = x_j Ae_j = x_j A_ij e_i = A_ij x_j e_i
So:
(Ax)_i = A_ij x_j
This corresponds to the rule for multiplying a
matrix by a vector.
o Furthermore, the sum of two linear operators is
defined by
(A + B)x = Ax + Bx = e_i (A_ij + B_ij) x_j
o And the product of two linear operators is
defined by
(AB)x = A(Bx) = A(e_k B_kj x_j) = (Ae_k) B_kj x_j = e_i A_ik B_kj x_j
so that (AB)_ij = A_ik B_kj.
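The component rule (AB)_ij = A_ik B_kj can be checked against NumPy's built-in product; a minimal sketch with arbitrary random matrices:

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(3, 3))
    B = rng.normal(size=(3, 3))
    # explicit index sum over k versus the built-in matrix product
    assert np.allclose(np.einsum('ik,kj->ij', A, B), A @ B)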
VECTORS: CHANGE OF BASIS
o Under a change of basis e_i' = e_k R_ki, the components of a vector transform as
x' = R⁻¹x
LINEAR OPERATORS: CHANGE OF BASIS
o To find how the components of a linear operator transform under the same change of basis, write y = Ax in both bases and substitute e_i' = e_k R_ki:
e_k R_ki A'_ij x'_j = e_k A_kj x_j
R_ki A'_ij x'_j = A_kj x_j
R A' x' = A x
R A' (R⁻¹x) = A x
A = R A' R⁻¹
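A short numerical sketch of the change-of-basis rule (assuming NumPy; R is an arbitrary random matrix, assumed invertible): both sets of components should describe the same vector y = Ax.

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.normal(size=(3, 3))
    R = rng.normal(size=(3, 3))   # change-of-basis matrix (assumed invertible)
    Rinv = np.linalg.inv(R)
    Aprime = Rinv @ A @ R         # operator components in the new basis
    x = rng.normal(size=3)
    xprime = Rinv @ x             # vector components in the new basis
    # y = Ax should have new components y' = A' x' = R^{-1}(Ax)
    assert np.allclose(Aprime @ xprime, Rinv @ (A @ x))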
MATRIX MULTIPLICATION
o Matrix multiplication does not commute, but it is associative: A(BC) = (AB)C.
Hermitian Conjugate
(A†)_ij = A_ji*
Importantly:
(AB)† = B† A†
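This identity is easy to verify numerically; a minimal sketch with arbitrary complex matrices:

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    dag = lambda M: M.conj().T    # hermitian conjugate
    assert np.allclose(dag(A @ B), dag(B) @ dag(A))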
In a general basis, the inner product takes the form
⟨x, y⟩ = x† G y
where G is the matrix of metric coefficients. This allows the adjoint A† of a linear operator A to be defined through ⟨A†x, y⟩ = ⟨x, Ay⟩.
With the standard inner product (G = I), we find that the matrix defining A† is, indeed, the hermitian conjugate of A.
Special Matrices
SYMMETRY
o A symmetric matrix is equal to its transpose
A = A^T
o An hermitian matrix is equal to its hermitian
conjugate.
A = A†
o An antisymmetric (or skew-symmetric) matrix
satisfies
A^T = -A
o An anti-hermitian (or skew-hermitian) matrix
satisfies
A† = -A
ORTHOGONALITY
o An orthogonal matrix is one whose transpose is its inverse:
A^T A = A A^T = I
o A unitary matrix is one whose hermitian conjugate is its inverse:
A† A = A A† = I
RELATIONSHIPS
o If A is Hermitian, then iA is anti-hermitian, and vice-versa.
o If A is Hermitian, then
exp(iA) = Σ_{n=0}^{∞} (iA)^n / n!
is unitary.
o [This can be remembered by bearing in mind the analogy with complex numbers: hermitian, anti-hermitian and unitary matrices play the roles of real numbers, imaginary numbers and numbers of unit modulus respectively, so exp(iA) is the analogue of e^{ix} with x real.]
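A numerical sketch of the exp(iA) statement (assuming SciPy is available for the matrix exponential; the hermitian test matrix is an arbitrary random choice):

    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(4)
    M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    A = (M + M.conj().T) / 2      # construct an hermitian matrix
    U = expm(1j * A)
    assert np.allclose(U.conj().T @ U, np.eye(3))   # U is unitary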
Eigenvalues & Eigenvectors
A non-zero vector x is an eigenvector of A, with eigenvalue λ, if Ax = λx. Non-trivial solutions exist only when
det(A - λI) = 0
which is called the characteristic equation of the matrix.
o If λ is a k-fold root of the characteristic equation, then there are between 1 and k linearly independent eigenvectors with eigenvalue λ.
o Any linear combination of eigenvectors with the same eigenvalue is itself an eigenvector with that eigenvalue.
For hermitian, anti-hermitian and unitary matrices:
o The eigenvectors corresponding to distinct eigenvalues are orthogonal.
o The eigenvalues satisfy, respectively,
λ = λ*,   λ = -λ*,   λ* = λ⁻¹
which are precisely the conditions for λ being real, imaginary or of unit modulus.
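These properties can be checked numerically for the hermitian case; a sketch using np.linalg.eigh, which returns the eigenvalues and an orthonormal set of eigenvectors:

    import numpy as np

    rng = np.random.default_rng(5)
    M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    A = (M + M.conj().T) / 2      # an arbitrary hermitian matrix
    lam, V = np.linalg.eigh(A)
    assert np.allclose(lam.imag, 0.0)              # eigenvalues are real
    assert np.allclose(V.conj().T @ V, np.eye(4))  # eigenvectors orthonormal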
The method to prove these results is, in general, as
follows:
o Choose two arbitrary eigenvectors and write
Ax = λx
Ay = μy
o Take the inner product of each equation with the other eigenvector, and use the defining property of the matrix (for example A = A†) to deduce something about the eigenvalues.
o Now, assume that x ≠ y with λ ≠ μ; the same relation then gives ⟨x, y⟩ = 0, so the eigenvectors are orthogonal.
o If all eigenvalues are < 0 (> 0), the matrix is negative (positive) definite.
o If all eigenvalues are ≤ 0 (≥ 0), the matrix is negative (positive) semi-definite.
o A matrix is definite if it is either positive
definite or negative definite.
Diagonalization
Consider the matrix S whose columns are the eigenvectors of A:
S = ( e_1  e_2  ⋯  e_n )
Then
AS = A ( e_1  e_2  ⋯  e_n )
= ( λ_1 e_1  λ_2 e_2  ⋯  λ_n e_n )
= ( e_1  e_2  ⋯  e_n ) Λ
= SΛ
where Λ = diag(λ_1, λ_2, …, λ_n).
We can therefore say that
S⁻¹AS = Λ
Provided that S is invertible, i.e. provided that the columns of S are linearly independent, i.e. provided that the eigenvectors of A are linearly independent.
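A minimal numerical sketch of S⁻¹AS = Λ (a random real matrix almost surely has linearly independent eigenvectors, so S is invertible here):

    import numpy as np

    rng = np.random.default_rng(6)
    A = rng.normal(size=(3, 3))
    lam, S = np.linalg.eig(A)         # columns of S are the eigenvectors
    Lam = np.linalg.inv(S) @ A @ S    # S^{-1} A S
    assert np.allclose(Lam, np.diag(lam))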
Notes:
o We notice that S is the transformation matrix to the eigenvector basis.
o Since the determinant and the trace are invariant under a change of basis, it follows that
det(A) = Π_{i=1}^{n} λ_i
tr(A) = Σ_{i=1}^{n} λ_i
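Both identities are easy to confirm numerically:

    import numpy as np

    rng = np.random.default_rng(7)
    A = rng.normal(size=(4, 4))
    lam = np.linalg.eigvals(A)
    assert np.isclose(np.prod(lam), np.linalg.det(A))   # det = product
    assert np.isclose(np.sum(lam), np.trace(A))         # tr  = sum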
Quadratic Forms
A quadratic form
Q(x) = x^T A x
with A a symmetric matrix, is a homogeneous quadratic function, i.e. Q(ax) = a²Q(x). In two dimensions, for example, with
A = ( a  b )
    ( b  c )
we have Q = ax² + 2bxy + cy².
Since A is symmetric, its eigenvectors may be chosen orthonormal, so that S is orthogonal and
S^T A S = Λ
In the eigenvector basis, therefore, the quadratic form is
Q = Σ_{i=1}^{n} λ_i x_i²
Now consider the surface
Q(x) = k = constant
In the eigenvector basis, this simplifies to
λ_1 x_1² + λ_2 x_2² + λ_3 x_3² = k
o If (for example) λ_3 = 0, we have the degenerate case
λ_1 x_1² + λ_2 x_2² = k
which describes a cylinder with its axis along e_3.
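A numerical sketch of the reduction to principal axes (the symmetric matrix and the test point are arbitrary choices): evaluating Q directly and in the eigenvector basis gives the same value.

    import numpy as np

    A = np.array([[2., 1., 0.],
                  [1., 2., 0.],
                  [0., 0., 3.]])   # arbitrary symmetric matrix
    lam, S = np.linalg.eigh(A)     # S is orthogonal, S^T A S = diag(lam)
    x = np.array([1., 2., 3.])
    xp = S.T @ x                   # components in the eigenvector basis
    assert np.isclose(x @ A @ x, np.sum(lam * xp**2))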
Hermitian Forms
An hermitian form is defined by
H(x) = x† A x = x_i* A_ij x_j
where A is an hermitian matrix. An hermitian matrix can be diagonalized by a unitary transformation:
U† A U = Λ
And therefore, writing x' = U† x:
H(x) = x† (U Λ U†) x = (U† x)† Λ (U† x) = x'† Λ x' = Σ_{i=1}^{n} λ_i |x_i'|²
which is real, since the eigenvalues of an hermitian matrix are real.
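Finally, a sketch verifying that an hermitian form is real (arbitrary random hermitian matrix and complex vector):

    import numpy as np

    rng = np.random.default_rng(8)
    M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    A = (M + M.conj().T) / 2       # an arbitrary hermitian matrix
    x = rng.normal(size=3) + 1j * rng.normal(size=3)
    H = np.vdot(x, A @ x)          # H(x) = x† A x
    assert np.isclose(H.imag, 0.0)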