
Matrix Analysis - Review

Tiep Vu
November 5, 2017

0 Review and miscellanea

0.1 Vector spaces

• If S_1, S_2 are two subspaces of V, then

  dim(S_1 ∩ S_2) + dim(S_1 + S_2) = dim S_1 + dim S_2

  dim(S_1 ∩ S_2) ≥ dim S_1 + dim S_2 − dim V

0.2 Matrices

• If A ∈ M_{m,n}(F): dim(range A) + dim(nullspace A) = rank A + nullity A = n

• (AB)^H = B^H A^H, (AB)^T = B^T A^T, conj(AB) = conj(A) conj(B)

• (y^H x)^H = conj(y^H x) = x^H y

• Some definitions:

  symmetric: A^T = A            skew-symmetric: A^T = −A
  orthogonal: A^T A = I         skew-Hermitian: A^H = −A
  Hermitian: A^H = A            essentially Hermitian: ∃θ ∈ R such that e^{iθ}A is Hermitian
  unitary: A^H A = I            normal: A^H A = AA^H

• Each A ∈ M_n(C) can be written in exactly one way as A = H(A) + iK(A), in which H(A), K(A) are Hermitian.

• tr AA^H = tr A^H A = Σ_{i,j} |a_{ij}|^2, A ∈ M_n(C)

• range A + range B = range [A B]

• nullspace A ∩ nullspace B = nullspace [A; B] (the stacked matrix)

0.3 Determinants

• Laplace expansion by minors along a row or column:

  det A = Σ_{k=1}^n (−1)^{i+k} a_{ik} det A_{ik} = Σ_{k=1}^n (−1)^{k+j} a_{kj} det A_{kj}

0.4 Rank

0.4.1 Rank inequalities

• If A ∈ M_{m,n}(F), then rank A ≤ min{m, n}

• If A ∈ M_{m,k}(F), B ∈ M_{k,n}(F), then (rank A + rank B) − k ≤ rank AB ≤ min{rank A, rank B}

• If A, B ∈ M_{m,n}(F), then |rank A − rank B| ≤ rank(A + B) ≤ rank A + rank B

• If rank B = 1, then |rank(A + B) − rank A| ≤ 1 (changing one entry of a matrix can change its rank by at most 1)

• (Frobenius inequality) If A ∈ M_{m,k}(F), B ∈ M_{k,p}(F), C ∈ M_{p,n}(F), then rank AB + rank BC ≤ rank B + rank ABC, with equality if and only if there are matrices X, Y such that B = BCX + YAB

0.4.2 Rank equalities

• If A ∈ M_{m,n}(C), then rank A^H = rank A^T = rank conj(A) = rank A

• If A ∈ M_m(F) and C ∈ M_n(F) are nonsingular and B ∈ M_{m,n}(F), then rank AB = rank B = rank BC = rank ABC

• If A, B ∈ M_{m,n}(F), then rank A = rank B ⇔ there exist a nonsingular X ∈ M_m(F) and a nonsingular Y ∈ M_n(F) such that B = XAY

• If A ∈ M_{m,n}(C), then rank A^H A = rank A
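The rank inequalities above are easy to sanity-check numerically. The sketch below is illustrative only: the matrices are random, and NumPy's matrix_rank computes a tolerance-based numerical rank via the SVD.

```python
import numpy as np

rng = np.random.default_rng(0)

# A in M_{m,k}, B in M_{k,n}; check the Sylvester-type bound
# (rank A + rank B) - k <= rank AB <= min(rank A, rank B).
m, k, n = 5, 4, 6
A = rng.standard_normal((m, k))
B = rng.standard_normal((k, n))

rA = np.linalg.matrix_rank(A)
rB = np.linalg.matrix_rank(B)
rAB = np.linalg.matrix_rank(A @ B)

assert rA + rB - k <= rAB <= min(rA, rB)

# A rank-one perturbation changes the rank by at most 1.
C = rng.standard_normal((m, k))
u, v = rng.standard_normal(m), rng.standard_normal(k)
assert abs(np.linalg.matrix_rank(C + np.outer(u, v)) - np.linalg.matrix_rank(C)) <= 1
```

For exact-arithmetic matrices the same checks can be done symbolically, but floating-point rank is adequate for generic random data.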
• If A ∈ M_{m,n}(F), then rank A = k ⇔ A = XY^T for some X ∈ M_{m,k}(F) and Y ∈ M_{n,k}(F) that each have independent columns

• rank A = k ⇔ there exist nonsingular matrices S ∈ M_m(F) and T ∈ M_n(F) such that

  A = S [I_k 0; 0 0] T

• Let A ∈ M_{m,n}(F). If X ∈ M_{n,k}(F) and Y ∈ M_{m,k}(F) and if W = Y^T AX is nonsingular, then:

  rank(A − AXW^{−1}Y^T A) = rank A − rank AXW^{−1}Y^T A

  When k = 1 (Wedderburn's rank-one reduction formula): if x ∈ F^n and y ∈ F^m, and if ω = y^T Ax ≠ 0, then:

  rank(A − ω^{−1} Axy^T A) = rank A − 1

  Conversely, if σ ∈ F, u ∈ F^m, v ∈ F^n, and rank(A − σuv^T) < rank A, then rank(A − σuv^T) = rank A − 1 and there are x ∈ F^n, y ∈ F^m such that u = Ax, v = A^T y, y^T Ax ≠ 0, and σ = (y^T Ax)^{−1}

0.5 Nonsingularity

• (A^{−1})^T = (A^T)^{−1}

0.6 The Euclidean inner product and norm

• ⟨x, y⟩ = y^H x, ‖x‖_2 = ⟨x, x⟩^{1/2} = (x^H x)^{1/2}

• ⟨αx_1 + βx_2, y⟩ = α⟨x_1, y⟩ + β⟨x_2, y⟩, and ⟨x, αy_1 + βy_2⟩ = conj(α)⟨x, y_1⟩ + conj(β)⟨x, y_2⟩

0.7 Partitioned sets and matrices

0.7.1 The inverse of a partitioned matrix

• M = [A B; C D]

  M^{−1} = [(A − BD^{−1}C)^{−1}           A^{−1}B(CA^{−1}B − D)^{−1};
            D^{−1}C(BD^{−1}C − A)^{−1}    (D − CA^{−1}B)^{−1}]

         = [(A − BD^{−1}C)^{−1}           (BD^{−1}C − A)^{−1}BD^{−1};
            (CA^{−1}B − D)^{−1}CA^{−1}    (D − CA^{−1}B)^{−1}]

  assuming the relevant inverses exist.

0.7.2 The Sherman-Morrison-Woodbury formula

Let A ∈ M_n(F) be nonsingular and B = A + XRY, X ∈ M_{n,r}(F), Y ∈ M_{r,n}(F), R ∈ M_{r,r}(F). If B and R^{−1} + YA^{−1}X are nonsingular, then:

  B^{−1} = A^{−1} − A^{−1}X(R^{−1} + YA^{−1}X)^{−1}YA^{−1}

If r ≪ n, then R and R^{−1} + YA^{−1}X may be much easier to invert than B. If x, y ∈ F^n are nonzero vectors, X = x, Y = y^T, y^T A^{−1}x ≠ −1, and R = [1], then:

  (A + xy^T)^{−1} = A^{−1} − (1 + y^T A^{−1}x)^{−1} A^{−1}xy^T A^{−1}

In particular, if B = I + xy^T for x, y ∈ F^n and y^T x ≠ −1, then

  B^{−1} = I − (1 + y^T x)^{−1} xy^T

0.7.3 Complementary nullities

Let A ∈ M_n(F) be nonsingular. The law of complementary nullities is:

  nullity(A[α, β]) = nullity(A^{−1}[β^c, α^c])

which is equivalent to the rank identity:

  rank(A[α, β]) = rank(A^{−1}[β^c, α^c]) + r + s − n,  r = |α|, s = |β|

0.7.4 Rank in a partitioned matrix and rank-principal matrices

• A = [A_11 A_12; A_21 A_22], A_11 ∈ M_r(F), A_22 ∈ M_{n−r}(F). If A_11 is nonsingular, then

  rank [A_11 A_12] = rank [A_11; A_21] = r

• The converse is true: if rank A = rank [A_11 A_12] = rank [A_11; A_21] = r, then A_11 is nonsingular.

0.8 Determinants again

0.8.1 The adjugate and the inverse

• If A ∈ M_n(F), n ≥ 2, the adjugate of A is adj A = [(−1)^{i+j} det A[{j}^c, {i}^c]]

• (adj A)A = A(adj A) = (det A)I, and det(adj A) = (det A)^{n−1}

• adj(A^{−1}) = A/det A = (adj A)^{−1}

• If rank A ≤ n − 2, then adj A = 0

• If rank A = n − 1, then rank adj A = 1. Suppose adj A = αxy^T for some α ∈ F and nonzero x, y ∈ F^n. From

  α(Ax)y^T = A(adj A) = 0 = (adj A)A = αx(y^T A)

  we conclude that Ax = 0 and y^T A = 0
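The Sherman-Morrison special case above can be verified numerically. In this sketch the shift by n·I is only a convenience to make the random A comfortably nonsingular:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = rng.standard_normal((n, n)) + n * np.eye(n)   # generically nonsingular
x = rng.standard_normal(n)
y = rng.standard_normal(n)

Ainv = np.linalg.inv(A)
# (A + x y^T)^{-1} = A^{-1} - (1 + y^T A^{-1} x)^{-1} A^{-1} x y^T A^{-1},
# valid when 1 + y^T A^{-1} x != 0.
coef = 1.0 / (1.0 + y @ Ainv @ x)
B_inv = Ainv - coef * np.outer(Ainv @ x, y @ Ainv)

assert np.allclose(B_inv, np.linalg.inv(A + np.outer(x, y)))
```

The point of the formula is exactly this structure: the dense inverse on the right-hand side is replaced by one already-known inverse plus a rank-one correction.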
• adj(AB) = (adj B)(adj A) for all A, B ∈ M_n

• If A is nonsingular, then:

  adj(adj A) = adj((det A)A^{−1}) = (det A)^{n−1} adj(A^{−1}) = (det A)^{n−1}(A/det A) = (det A)^{n−2}A

• If A + B is nonsingular, then A(A + B)^{−1}B = B(A + B)^{−1}A, so continuity ensures that:

  A adj(A + B) B = B adj(A + B) A

• (adj A)B = B(adj A) whenever AB = BA, even if A is singular.

• ∂ det A/∂a_{ij} = (adj A)_{ji}, i.e., (adj A)^T = [∂ det A/∂a_{ij}]

0.8.2 Minors of the inverse

  det A^{−1}[α^c, β^c] = (−1)^{p(α,β)} det A[β, α] / det A,  in which p(α, β) = Σ_{i∈α} i + Σ_{j∈β} j.

  In particular: det A^{−1}[α^c] = det A[α] / det A

0.8.3 Schur complements and determinantal formulae

• Definition: the Schur complement of A[α] in A is

  A/A[α] = A[α^c] − A[α^c, α] A[α]^{−1} A[α, α^c]

• det A = det A[α] · det(A[α^c] − A[α^c, α] A[α]^{−1} A[α, α^c])

• When α^c consists of a single element:

  det A = det A[α] (A[α^c] − A[α^c, α] A[α]^{−1} A[α, α^c]) = A[α^c] det A[α] − A[α^c, α](adj A[α]) A[α, α^c]

• Cauchy's formula for the determinant of a rank-one perturbation: for Â ∈ M_n, x, y ∈ F^n, and a scalar a,

  det [Â x; y^T a] = a det(Â − a^{−1}xy^T)
  det [Â x; y^T a] = a det Â − y^T (adj Â) x

  ⇒ a det(Â − a^{−1}xy^T) = a det Â − y^T (adj Â) x

  With a = −1: det(Â + xy^T) = det Â + y^T (adj Â) x

0.8.4 Determinantal identities of Sylvester and Kronecker

0.8.10 Derivative of the determinant

• (d/dt) det A(t) = tr((adj A(t)) A′(t))

• (d/dt) det(tI − A) = tr adj(tI − A)

0.9 Special types of matrices

0.9.1 Block diagonal matrices and direct sums

• A matrix A ∈ M_n(F) of the form

  A = [A_11 ... 0; ...; 0 ... A_kk],

  in which A_ii ∈ M_{n_i}(F), i = 1, ..., k, and Σ_{i=1}^k n_i = n, is called block diagonal. We also write:

  A = A_11 ⊕ A_22 ⊕ ··· ⊕ A_kk = ⊕_{i=1}^k A_ii

• det(⊕_{i=1}^k A_ii) = Π_{i=1}^k det A_ii

• rank(⊕_{i=1}^k A_ii) = Σ_{i=1}^k rank A_ii

• If A ∈ M_n and B ∈ M_m are nonsingular, then:

  1. (A ⊕ B)^{−1} = A^{−1} ⊕ B^{−1}
  2. (det(A ⊕ B))(A ⊕ B)^{−1} = (det A)(det B)(A^{−1} ⊕ B^{−1}) = ((det B)(det A)A^{−1}) ⊕ ((det A)(det B)B^{−1})

• A continuity argument then ensures that:

  adj(A ⊕ B) = ((det B) adj A) ⊕ ((det A) adj B)

0.9.2 Triangular matrices

• If T ∈ M_n is triangular, has distinct diagonal entries, and commutes with B ∈ M_n, then B must be triangular of the same type as T (upper, strictly upper, lower, strictly lower).

• If a square triangular matrix is nonsingular, its inverse is a triangular matrix of the same type.

• A product of square triangular matrices of the same size and type is a triangular matrix of the same type; each i,i diagonal entry of such a matrix product is the product of the i,i entries of the factors.
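The basic adjugate identities above can be checked by computing the adjugate directly from its cofactor definition. This is a brute-force illustrative sketch (O(n!) work via determinants of minors), not how one would compute adj A in practice:

```python
import numpy as np

def adjugate(A):
    """adj(A)[i, j] = (-1)^(i+j) * det of A with row j and column i deleted."""
    n = A.shape[0]
    adj = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, j, axis=0), i, axis=1)
            adj[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return adj

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
adjA = adjugate(A)

# (adj A) A = A (adj A) = (det A) I
assert np.allclose(adjA @ A, np.linalg.det(A) * np.eye(n))
assert np.allclose(A @ adjA, np.linalg.det(A) * np.eye(n))
# det(adj A) = (det A)^(n-1)
assert np.allclose(np.linalg.det(adjA), np.linalg.det(A) ** (n - 1))
```

For nonsingular A the identity adj A = (det A) A^{-1} gives a much cheaper route to the same matrix.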
0.9.3 Permutation matrices

• A square matrix P is a permutation matrix if exactly one entry in each row and column is equal to 1 and all other entries are 0.

• P^T = P^{−1} and det P = ±1

• The product of two permutation matrices is again a permutation matrix.

• A matrix A ∈ M_n such that PAP^T is triangular for some permutation matrix P is called essentially triangular.

• The n-by-n reversal matrix K_n is the permutation matrix with ones on the antidiagonal and zeros elsewhere: (K_n)_{ij} = 1 iff i + j = n + 1.

0.9.4 Circulant matrices

  A = [a_1      a_2      a_3  ...  a_n;
       a_n      a_1      a_2  ...  a_{n−1};
       a_{n−1}  a_n      a_1  ...  a_{n−2};
       ...;
       a_2      a_3      a_4  ...  a_1]

  (each row is the cyclic right-shift of the row above it)

0.9.5 Toeplitz matrices

  A = [a_{j−i}], constant along each diagonal:

  A = [a_0      a_1      a_2      a_3  ...  a_n;
       a_{−1}   a_0      a_1      a_2  ...  a_{n−1};
       a_{−2}   a_{−1}   a_0      a_1  ...  a_{n−2};
       a_{−3}   a_{−2}   a_{−1}   a_0  ...  a_{n−3};
       ...;
       a_{−n}   a_{−n+1} a_{−n+2} a_{−n+3} ... a_0]

0.9.6 Hankel matrices

  A = [a_{i+j}], constant along each antidiagonal:

  A = [a_0  a_1      a_2      ...  a_n;
       a_1  a_2      a_3      ...  a_{n+1};
       a_2  a_3      a_4      ...  a_{n+2};
       ...;
       a_n  a_{n+1}  a_{n+2}  ...  a_{2n}]

0.9.7 Hessenberg matrices

• A matrix A = [a_{ij}] ∈ M_n(F) is said to be in upper Hessenberg form, or to be an upper Hessenberg matrix, if a_{ij} = 0 for all i > j + 1: only the first subdiagonal below the main diagonal may be nonzero.

• An upper Hessenberg matrix A is unreduced if a_{i+1,i} ≠ 0 for all i = 1, ..., n − 1; the rank of such a matrix is at least n − 1 since its first n − 1 columns are independent.

0.9.8 Tridiagonal, bidiagonal, and other structured matrices

• A matrix that is both upper and lower Hessenberg is called tridiagonal; that is, A is tridiagonal if a_{ij} = 0 for all |i − j| > 1

• A Jacobi matrix is a real symmetric tridiagonal matrix with positive subdiagonal entries.

• A matrix A = [a_{ij}] ∈ M_n(F) is persymmetric if a_{ij} = a_{n+1−j,n+1−i} for all i, j = 1, ..., n (symmetry about the antidiagonal).

• A is persymmetric iff K_n A = A^T K_n

• If A is persymmetric and invertible, then A^{−1} is also persymmetric, since K_n A^{−1} = (AK_n)^{−1} = (K_n A^T)^{−1} = (A^{−1})^T K_n

• A is skew persymmetric if K_n A = −A^T K_n. The inverse of a nonsingular skew-persymmetric matrix is skew persymmetric.

• A ∈ M_n is perhermitian if K_n A = A^H K_n, and skew perhermitian if K_n A = −A^H K_n

• A matrix A = [a_{ij}] ∈ M_n(F) is centrosymmetric if a_{ij} = a_{n+1−i,n+1−j} for all i, j = 1, ..., n; equivalently, K_n A = AK_n.

• If A and B are centrosymmetric, then AB is centrosymmetric. If A and B are skew centrosymmetric, then AB is also centrosymmetric.
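A small consequence of these definitions worth checking: every Toeplitz matrix is persymmetric, i.e., K_n T = T^T K_n with K_n the reversal matrix. A minimal NumPy sketch (the entry values are arbitrary illustrative data):

```python
import numpy as np

n = 5
K = np.fliplr(np.eye(n))                  # reversal matrix K_n (antidiagonal ones)
c = np.arange(-n + 1, n, dtype=float)     # entries a_{-(n-1)}, ..., a_{n-1}

# Toeplitz matrix T = [a_{j-i}]: constant along each diagonal.
T = np.array([[c[j - i + n - 1] for j in range(n)] for i in range(n)])

# Persymmetry: reflection about the antidiagonal leaves T unchanged.
assert np.allclose(K @ T, T.T @ K)
# K is itself a symmetric permutation matrix: K = K^T = K^{-1}.
assert np.allclose(K @ K, np.eye(n))
```

Circulant matrices are the special case a_{k} = a_{k+n}, so they inherit persymmetry (and are centrosymmetric as well).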
0.9.9 Vandermonde matrices and Lagrange interpolation

• A Vandermonde matrix A ∈ M_n(F) has the form

  A = [1  x_1  x_1^2  ...  x_1^{n−1};
       1  x_2  x_2^2  ...  x_2^{n−1};
       ...;
       1  x_n  x_n^2  ...  x_n^{n−1}]

• det A = Π_{1≤i<j≤n} (x_j − x_i)

0.9.10 Cauchy matrices

• A Cauchy matrix A ∈ M_n(F) is a matrix of the form A = [(a_i + b_j)^{−1}]_{i,j=1}^n, in which a_1, ..., a_n, b_1, ..., b_n are scalars such that a_i + b_j ≠ 0 for all i, j = 1, ..., n.

• det A = Π_{1≤i<j≤n} (a_j − a_i)(b_j − b_i) / Π_{i,j=1}^n (a_i + b_j)

• A Hilbert matrix H_n = [(i + j − 1)^{−1}]_{i,j=1}^n is a Cauchy matrix that is also a Hankel matrix.

  det H_n = (1!2!···(n−1)!)^4 / (1!2!···(2n−1)!)

• So a Hilbert matrix is always nonsingular. The entries of its inverse H_n^{−1} = [h_{ij}]_{i,j=1}^n are:

  h_{ij} = (−1)^{i+j} (n+i−1)!(n+j−1)! / [(i+j−1)((i−1)!(j−1)!)^2 (n−i)!(n−j)!]

0.9.11 Involution, nilpotent, projection, coninvolution

A matrix A ∈ M_n(F) is

• an involution if A^2 = I

• nilpotent if A^k = 0 for some k ∈ N*; the least such k is the index of nilpotence of A.

• a projection/idempotent if A^2 = A

Suppose that F = C. A matrix A ∈ M_n is:

• a Hermitian projection/orthogonal projection if A^H = A and A^2 = A.

• a coninvolution/coninvolutory if A conj(A) = I

0.10 Change of basis

0.11 Equivalence relations

  Equivalence relation ~      A ~ B
  congruence                  A = SBS^T
  unitary congruence          A = UBU^T
  *congruence                 A = SBS^H
  consimilarity               A = SB conj(S)^{−1}
  equivalence                 A = SBT
  unitary equivalence         A = UBV
  diagonal equivalence        A = D_1 B D_2
  similarity                  A = SBS^{−1}
  unitary similarity          A = UBU^H
  triangular equivalence      A = LBR

in which:

• D_1, D_2, S, T, L and R are square and nonsingular.

• U and V are unitary.

• L is lower triangular and R is upper triangular.

• D_1 and D_2 are diagonal.

• A and B need not be square for equivalence, unitary equivalence, triangular equivalence, or diagonal equivalence.

1 Eigenvalues, Eigenvectors and Similarity

1.1 The eigenvalue-eigenvector equation

• Theorem 1.1.6 Let A ∈ M_n and let p(t) be a given polynomial. If Ax = λx, then p(A)x = p(λ)x. Conversely, if deg p = k ≥ 1 and if μ is an eigenvalue of p(A), then ∃λ ∈ σ(A) such that μ = p(λ)

• Observation 1.1.8 Let A ∈ M_n and λ, μ ∈ C be given. Then λ ∈ σ(A) ⇔ λ + μ ∈ σ(A + μI)

• Theorem 1.1.9 Let A ∈ M_n be given. For each nonzero y ∈ C^n, ∃ a polynomial g(t) of degree at most n − 1 such that g(A)y is an eigenvector of A.
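The Vandermonde determinant formula can be confirmed directly against a numerical determinant. The nodes below are arbitrary distinct illustrative values; np.vander with increasing=True builds exactly the rows [1, x_i, x_i^2, ...] used above:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.5, -0.5])
n = len(x)
V = np.vander(x, increasing=True)   # row i is [1, x_i, x_i^2, ..., x_i^(n-1)]

# det V = product over i < j of (x_j - x_i)
prod = 1.0
for i in range(n):
    for j in range(i + 1, n):
        prod *= x[j] - x[i]

assert np.allclose(np.linalg.det(V), prod)
```

The formula also shows why Lagrange interpolation through distinct nodes is uniquely solvable: distinct x_i make every factor, hence the determinant, nonzero.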
1.2 The characteristic polynomial and algebraic multiplicity

• Theorem 1.2.8 (Brauer's theorem) Let x, y ∈ C^n, x ≠ 0, and A ∈ M_n. Suppose that Ax = λx and let the eigenvalues of A be λ, λ_2, ..., λ_n. Then the eigenvalues of A + xy^H are λ + y^H x, λ_2, ..., λ_n. In other words,

  (t − λ) p_{A+xy^H}(t) = (t − (λ + y^H x)) p_A(t)

• Let a_k be the coefficient of t^k in p_A(t), and let E_k(A) be the sum of all k-by-k principal minors of A. Then

  a_k = (1/k!) p_A^{(k)}(0) = (−1)^{n−k} E_{n−k}(A)

• E_k(A) = S_k(λ_1, ..., λ_n) = Σ_{1≤i_1<···<i_k≤n} Π_{j=1}^k λ_{i_j}, λ_i ∈ σ(A)

  p_A(t) = t^n − E_1(A)t^{n−1} + ··· + (−1)^{n−1}E_{n−1}(A)t + (−1)^n E_n(A)

• Theorem 1.2.17 Let A ∈ M_n. There is some δ > 0 such that A + εI is nonsingular whenever ε ∈ C and 0 < |ε| < δ (see Observation 1.1.8)

• α is a zero of p(t) of multiplicity k iff p(α) = p′(α) = ··· = p^{(k−1)}(α) = 0 and p^{(k)}(α) ≠ 0

• Theorem 1.2.18 Let A ∈ M_n and suppose that λ ∈ σ(A) has algebraic multiplicity k. Then rank(A − λI) ≥ n − k, with equality for k = 1.

• Let A ∈ M_n and x, y ∈ C^n be given. Let f(t) = det(A + txy^T). Then, for any t_1 ≠ t_2:

  det A = (t_2 f(t_1) − t_1 f(t_2)) / (t_2 − t_1)

1.3 Similarity

• Theorem 1.3.7 Let A ∈ M_n be given. Then

  1. A is similar to a block matrix of the form

     [Λ C; 0 D],  Λ = diag(λ_1, ..., λ_k), D ∈ M_{n−k}, 1 ≤ k < n

     iff there are k linearly independent vectors in C^n, each of which is an eigenvector of A.

  2. The matrix A is diagonalizable iff it has n linearly independent eigenvectors.

  3. If x^{(1)}, ..., x^{(n)} are linearly independent eigenvectors of A and if S = [x^{(1)}, ..., x^{(n)}], then S^{−1}AS is a diagonal matrix.

  4. If A is similar to a matrix of the above form, then the diagonal entries of Λ are eigenvalues of A. If A is similar to a diagonal matrix Λ, then the diagonal entries of Λ are all of the eigenvalues of A.

• Lemma 1.3.8 Let λ_1, ..., λ_k (k ≥ 2) be distinct eigenvalues of A ∈ M_n and suppose that x^{(i)} is an eigenvector associated with λ_i for each i = 1, ..., k. Then the vectors x^{(1)}, ..., x^{(k)} are linearly independent.

• Theorem 1.3.9 If A ∈ M_n has n distinct eigenvalues, then A is diagonalizable.

• Lemma 1.3.10 Let B_1 ∈ M_{n_1}, ..., B_d ∈ M_{n_d} be given and let B be the direct sum

  B = [B_1 ... 0; ...; 0 ... B_d] = B_1 ⊕ ··· ⊕ B_d

  Then B is diagonalizable iff each of B_1, ..., B_d is diagonalizable.

• Theorem 1.3.12 Let A, B ∈ M_n be diagonalizable. Then AB = BA iff they are simultaneously diagonalizable.

• If F ⊆ M_n is a commuting family, then ∃x ∈ C^n that is an eigenvector of every A ∈ F

• If F is a family of diagonalizable matrices, then F is a commuting family ⇔ it is a simultaneously diagonalizable family.

• If A ∈ M_{m,n}, B ∈ M_{n,m}, m ≤ n, then p_{BA}(t) = t^{n−m} p_{AB}(t)

• Theorem 1.3.28 Let S ∈ M_n be nonsingular and let S = C + iD, in which C, D ∈ M_n(R). There is a real number τ such that T = C + τD is nonsingular.

• Theorem 1.3.29 Two real matrices that are similar over C are similar over R.

• Theorem 1.3.31 (Mirsky) Let an integer n ≥ 2 and complex scalars λ_1, ..., λ_n and d_1, ..., d_n be given. There is an A ∈ M_n with eigenvalues λ_1, ..., λ_n and main diagonal entries d_1, ..., d_n iff Σ_{i=1}^n λ_i = Σ_{i=1}^n d_i. If λ_1, ..., λ_n and d_1, ..., d_n are all real and have the same sums, there is an A ∈ M_n(R) with eigenvalues λ_1, ..., λ_n and main diagonal entries d_1, ..., d_n.
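The identity p_{BA}(t) = t^{n−m} p_{AB}(t) says the characteristic polynomial of BA is that of AB with n − m extra zero roots appended. A numerical check (random data for illustration; np.poly builds the characteristic polynomial of a square matrix from its eigenvalues, so small floating-point tolerance is needed):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 3, 5
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))

p_AB = np.poly(A @ B)   # degree m, monic, length m + 1
p_BA = np.poly(B @ A)   # degree n, monic, length n + 1

# p_{BA}(t) = t^(n-m) p_{AB}(t): coefficients of p_AB followed by n - m zeros.
assert np.allclose(p_BA, np.concatenate([p_AB, np.zeros(n - m)]), atol=1e-8)
```

In particular AB and BA always share their nonzero eigenvalues, even when A and B are rectangular.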
2 Unitary equivalence and normal matrices

2.1 Unitary matrices

• Orthogonal: the vectors x_1, ..., x_k ∈ C^n form an orthogonal set if x_i^H x_j = 0 for all pairs 1 ≤ i < j ≤ k.

• Orthonormal: if the vectors of an orthogonal set are normalized, x_i^H x_i = 1 for all i = 1, ..., k, then the set is called orthonormal. An orthonormal set of vectors is linearly independent.

• A matrix U ∈ M_n is said to be unitary if U^H U = I. If, in addition, U ∈ M_n(R), U is said to be real orthogonal.

• Theorem: if U is unitary, then for all x ∈ C^n the Euclidean length of y = Ux is the same as that of x; that is, y^H y = x^H x.

• Theorem: Let A ∈ M_n be a nonsingular matrix. Then A^{−1} is similar to A^H iff there is a nonsingular matrix B ∈ M_n such that A = B^{−1}B^H

2.2 Unitary equivalence

• Def: A matrix B ∈ M_n is said to be unitarily equivalent to A ∈ M_n if there is a unitary matrix U ∈ M_n such that B = U^H AU. If U may be taken to be real, then B is said to be (real) orthogonally equivalent to A.

• Thm: If A = [a_{ij}] and B = [b_{ij}] ∈ M_n are unitarily equivalent, then:

  Σ_{i,j=1}^n |b_{ij}|^2 = Σ_{i,j=1}^n |a_{ij}|^2

• Householder transformations: let w ∈ C^n be a nonzero vector and define U_w ∈ M_n by U_w = I − t ww^H, in which t = 2(w^H w)^{−1}. Then:

  ◦ U_w x = x if x ⊥ w, and U_w w = −w

  ◦ U_w is both unitary and Hermitian

• Specht's Thm: Two given matrices A, B ∈ M_n are unitarily equivalent iff tr W(A, A^H) = tr W(B, B^H) for every word W(s, t) in two noncommuting variables, where

  W(A, A^H) = A^{m_1}(A^H)^{n_1} A^{m_2}(A^H)^{n_2} ··· A^{m_k}(A^H)^{n_k}

• Pearcy's Thm: Two given matrices A, B ∈ M_n are unitarily equivalent iff tr W(A, A^H) = tr W(B, B^H) for every word W(s, t) of degree at most 2n^2.

2.3 Schur's unitary triangularization theorem

• Schur's Thm: Given A ∈ M_n with eigenvalues λ_1, ..., λ_n in any prescribed order, there is a unitary matrix U ∈ M_n such that:

  U^H AU = T = [t_{ij}]

  is upper triangular, with diagonal entries t_{ii} = λ_i, i = 1, ..., n. That is, every square matrix A is unitarily equivalent to a triangular matrix whose diagonal entries are the eigenvalues of A in a prescribed order.

• Thm: Let F ⊆ M_n be a commuting family. There is a unitary matrix U ∈ M_n such that U^H AU is upper triangular for every A ∈ F

• Thm: If A ∈ M_n(R), there is a real orthogonal matrix Q ∈ M_n(R) such that:

  Q^T AQ = [A_1 *; A_2; ...; 0 A_k] ∈ M_n(R),  1 ≤ k ≤ n    (13)

  is block upper triangular, where each A_i is a real 1-by-1 matrix, or a real 2-by-2 matrix with a nonreal pair of complex conjugate eigenvalues. The diagonal blocks A_i may be arranged in any prescribed order.

• Thm: Let F ⊆ M_n(R) be a commuting family. There is a real orthogonal matrix Q ∈ M_n(R) such that Q^T AQ is of the form (13) for every A ∈ F.

2.4 Some implications of Schur's theorem

• Lem: Suppose that R = [r_{ij}], T = [t_{ij}] ∈ M_n are upper triangular, that the first k columns of R are zero (k < n), and that t_{k+1,k+1} = 0. Then the first k + 1 columns of T′ = RT are zero.

• Cayley-Hamilton Thm: Let p_A(t) be the characteristic polynomial of A ∈ M_n. Then p_A(A) = 0.

• Thm: Let A = [a_{ij}] ∈ M_n. For every ε > 0, there exists a matrix A(ε) = [a_{ij}(ε)] ∈ M_n that has n distinct eigenvalues (and is therefore diagonalizable) and is such that:

  Σ_{i,j=1}^n |a_{ij} − a_{ij}(ε)|^2 < ε

• Thm: Let A ∈ M_n. For every ε > 0, there exists a nonsingular matrix S ∈ M_n such that

  S^{−1}AS = T(ε) = [t_{ij}(ε)]

  is upper triangular and |t_{ij}(ε)| < ε for all 1 ≤ i < j ≤ n
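The Householder transformation properties listed above are quick to verify. A minimal NumPy sketch (the random complex vector is illustrative; np.vdot conjugates its first argument, so np.vdot(w, x) computes w^H x):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)

t = 2.0 / np.vdot(w, w)                      # t = 2 (w^H w)^{-1}
Uw = np.eye(n) - t * np.outer(w, w.conj())   # U_w = I - t w w^H

# U_w is unitary and Hermitian, and reflects w to -w.
assert np.allclose(Uw @ Uw.conj().T, np.eye(n))
assert np.allclose(Uw, Uw.conj().T)
assert np.allclose(Uw @ w, -w)

# Any x orthogonal to w (w^H x = 0) is left fixed.
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x = x - (np.vdot(w, x) / np.vdot(w, w)) * w   # project out the w component
assert np.allclose(Uw @ x, x)
```

Geometrically U_w is the reflection through the hyperplane orthogonal to w, which is why it is both unitary and Hermitian (an involution).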
• Thm: Suppose that A ∈ M_n has eigenvalues λ_i with multiplicity n_i, i = 1, ..., k, and that λ_1, ..., λ_k are distinct. Then A is similar to a matrix of the form

  T_1 ⊕ T_2 ⊕ ··· ⊕ T_k

  where T_i ∈ M_{n_i} is upper triangular with all diagonal entries equal to λ_i, i = 1, ..., k. If A ∈ M_n(R) and all the eigenvalues of A are real, then the same result holds, and the similarity matrix may be taken to be real.

• Thm: Let A, B ∈ M_n have eigenvalues α_1, ..., α_n and β_1, ..., β_n respectively. If A and B commute, there is a permutation i_1, ..., i_n of the indices 1, ..., n such that the eigenvalues of A + B are α_1 + β_{i_1}, ..., α_n + β_{i_n}. In particular, σ(A + B) ⊆ σ(A) + σ(B) if A and B commute.

2.5 Normal matrices

• Def: A matrix A ∈ M_n is said to be normal if A^H A = AA^H.

• Thm: if A = [a_{ij}] ∈ M_n has eigenvalues λ_1, ..., λ_n, the following statements are equivalent:

  1. A is normal;
  2. A is unitarily diagonalizable;
  3. Σ_{i,j=1}^n |a_{ij}|^2 = Σ_{i=1}^n |λ_i|^2; and
  4. there is an orthonormal set of n eigenvectors of A.

• A normal matrix is nondefective (the geometric and algebraic multiplicities are the same).

• If A ∈ M_n is normal and x ∈ C^n, then Ax = λx ⇔ x^H A = λx^H

• Thm: If N ⊆ M_n is a commuting family of normal matrices, then N is simultaneously unitarily diagonalizable.

• Lem: If A ∈ M_n is Hermitian and x^H Ax ≥ 0 for all x ∈ C^n, then all the eigenvalues of A are nonnegative. If, in addition, tr A = 0, then A = 0.

4 Hermitian and symmetric matrices

4.1 Definitions, properties of Hermitian matrices

• Thm: Let A = [a_{ij}] ∈ M_n be given. Then A is Hermitian iff at least one of the following holds:

  1. x^H Ax is real for all x ∈ C^n;
  2. A is normal and all the eigenvalues of A are real; or
  3. S^H AS is Hermitian for all S ∈ M_n

• Thm: Let A ∈ M_n be given. Then A is Hermitian iff there is a unitary matrix U ∈ M_n and a real diagonal matrix Λ ∈ M_n such that A = UΛU^H

4.2 Variational characterizations of eigenvalues of Hermitian matrices

• Thm: Let A ∈ M_n be a Hermitian matrix with eigenvalues λ_1 ≤ λ_2 ≤ ··· ≤ λ_n, and let k be a given integer with 1 ≤ k ≤ n. Then

  min_{w_1,...,w_{n−k} ∈ C^n}  max_{x≠0, x⊥w_1,...,w_{n−k}}  x^H Ax / x^H x = λ_k

  max_{w_1,...,w_{k−1} ∈ C^n}  min_{x≠0, x⊥w_1,...,w_{k−1}}  x^H Ax / x^H x = λ_k

4.3 Some apps of the variational characterizations

• Thm: Let A, B ∈ M_n be Hermitian. Let the respective eigenvalues of A, B, and A + B be {λ_1(A) ≤ ··· ≤ λ_n(A)}, {λ_1(B) ≤ ··· ≤ λ_n(B)}, and {λ_1(A + B) ≤ ··· ≤ λ_n(A + B)}. Then:

  λ_i(A + B) ≤ λ_{i+j}(A) + λ_{n−j}(B),  i = 1, 2, ..., n;  j = 0, 1, ..., n − i

• Thm: Let A, B ∈ M_n be Hermitian and let the eigenvalues λ_i(A), λ_i(B), and λ_i(A + B) be arranged in increasing order. For each k = 1, 2, ..., n, we have:

  λ_k(A) + λ_1(B) ≤ λ_k(A + B) ≤ λ_k(A) + λ_n(B)

• Corollary: Let A, B ∈ M_n be Hermitian. Assume that B is positive semidefinite and that the eigenvalues of A and A + B are arranged in increasing order. Then λ_k(A) ≤ λ_k(A + B) for all k = 1, 2, ..., n

• Thm 4.3.4: Let A ∈ M_n be Hermitian and let z ∈ C^n be a given vector. If the eigenvalues of A and A ± zz^H are arranged in increasing order, we have:

  1. λ_k(A ± zz^H) ≤ λ_{k+1}(A) ≤ λ_{k+2}(A ± zz^H),  k = 1, 2, ..., n − 2
  2. λ_k(A) ≤ λ_{k+1}(A ± zz^H) ≤ λ_{k+2}(A),  k = 1, 2, ..., n − 2

• Let A, B ∈ M_n be Hermitian and suppose that B has rank at most r. Then:

  1. λ_k(A + B) ≤ λ_{k+r}(A) ≤ λ_{k+2r}(A + B),  k = 1, 2, ..., n − 2r
  2. λ_k(A) ≤ λ_{k+r}(A + B) ≤ λ_{k+2r}(A),  k = 1, 2, ..., n − 2r

• Corollary: Let A, B ∈ M_n be Hermitian. Then:
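The two-sided Weyl bound λ_k(A) + λ_1(B) ≤ λ_k(A + B) ≤ λ_k(A) + λ_n(B) can be checked on random Hermitian (here real symmetric) matrices; np.linalg.eigvalsh returns eigenvalues in the increasing order used throughout this section:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 6
X = rng.standard_normal((n, n))
Y = rng.standard_normal((n, n))
A = (X + X.T) / 2       # real symmetric = Hermitian
B = (Y + Y.T) / 2

lA = np.linalg.eigvalsh(A)        # lA[0] <= ... <= lA[n-1]
lB = np.linalg.eigvalsh(B)
lS = np.linalg.eigvalsh(A + B)

tol = 1e-10
for k in range(n):
    assert lA[k] + lB[0] <= lS[k] + tol        # lower Weyl bound
    assert lS[k] <= lA[k] + lB[-1] + tol       # upper Weyl bound
```

Taking B positive semidefinite makes lB[0] ≥ 0, which immediately gives the monotonicity corollary λ_k(A) ≤ λ_k(A + B).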
  1. If j + k ≥ n + 1, then λ_{j+k−n}(A + B) ≤ λ_j(A) + λ_k(B)
  2. If j + k ≤ n + 1, then λ_j(A) + λ_k(B) ≤ λ_{j+k−1}(A + B)

• Cauchy's Thm (eigenvalue interlacing): Let A ∈ M_n be Hermitian, let y ∈ C^n and a ∈ R be given, and let Â = [A y; y^H a] ∈ M_{n+1}. Then:

  λ̂_1 ≤ λ_1 ≤ λ̂_2 ≤ λ_2 ≤ ··· ≤ λ̂_n ≤ λ_n ≤ λ̂_{n+1}

  where λ_1 ≤ ··· ≤ λ_n are the eigenvalues of A and λ̂_1 ≤ ··· ≤ λ̂_{n+1} are those of Â.

• Thm 4.3.10 (converse): If λ̂_1 ≤ λ_1 ≤ λ̂_2 ≤ λ_2 ≤ ··· ≤ λ̂_n ≤ λ_n ≤ λ̂_{n+1}, let Λ = diag(λ_1, ..., λ_n). Then there exist a ∈ R and y ∈ R^n such that {λ̂_1, ..., λ̂_{n+1}} is the spectrum of the real symmetric matrix [Λ y; y^T a]

• Thm: Let A be Hermitian and let A_r be an r-by-r principal submatrix of A. Then:

  λ_k(A) ≤ λ_k(A_r) ≤ λ_{k+n−r}(A),  1 ≤ k ≤ r

• Def: The vector β is said to majorize the vector α if the sum of the k smallest entries of β is ≥ the sum of the k smallest entries of α for k = 1, ..., n, with equality for k = n.

• Thm: If A is Hermitian, then the vector of diagonal entries of A majorizes the vector of eigenvalues of A.

• Thm: If a_1 ≤ ··· ≤ a_n, λ_1 ≤ ··· ≤ λ_n, and the vector a = [a_i] majorizes the vector λ = [λ_i], then there exists a real symmetric A = [a_{ij}] such that a_{ii} = a_i and {λ_i} is the set of eigenvalues of A.

—————————

• Weyl's Thm: Let A, B ∈ M_n be Hermitian. Let the respective eigenvalues of A, B, and A + B be {λ_1(A) ≤ ··· ≤ λ_n(A)}, {λ_1(B) ≤ ··· ≤ λ_n(B)}, and {λ_1(A + B) ≤ ··· ≤ λ_n(A + B)}. Then:

  λ_{i−k+1}(A) + λ_k(B) ≤ λ_i(A + B) ≤ λ_{i+j}(A) + λ_{n−j}(B),
  i = 1, ..., n;  j = 0, ..., n − i;  k = 1, ..., i

  ◦ If B has exactly π positive eigenvalues and μ negative eigenvalues:

    λ_i(A + B) ≤ λ_{i+π}(A),  i = 1, ..., n − π    (14)
    λ_{i−μ}(A) ≤ λ_i(A + B),  i = μ + 1, ..., n    (15)

  ◦ If rank B = r < n:

    λ_i(A + B) ≤ λ_{i+r}(A),  i = 1, ..., n − r    (16)
    λ_{i−r}(A) ≤ λ_i(A + B),  i = r + 1, ..., n    (17)

  ◦ If 0 ≠ z ∈ C^n:

    λ_i(A) ≤ λ_i(A + zz^H) ≤ λ_{i+1}(A),  i = 1, ..., n − 1    (18)
    λ_n(A) ≤ λ_n(A + zz^H)    (19)

  ◦ λ_k(A) + λ_1(B) ≤ λ_k(A + B) ≤ λ_k(A) + λ_n(B),  k = 1, ..., n

• If λ_1 ≤ μ_1 ≤ λ_2 ≤ μ_2 ≤ ··· ≤ λ_n ≤ μ_n, let Λ = diag(λ_1, ..., λ_n). There exists z ∈ R^n such that σ(Λ + zz^T) = {μ_1, ..., μ_n}

5 Norms for vectors and matrices

5.4 Analytic properties of vector norms

• Let B = {x^{(1)}, ..., x^{(n)}} be a basis for V. A function f : V → R is said to be a pre-norm if it satisfies:

  a) Positive: f(x) ≥ 0 ∀x; f(x) = 0 ⇔ x = 0
  b) Homogeneous: f(αx) = |α| f(x)
  c) Continuous: f(x(z)) is continuous on F^n, where z = [z_1, ..., z_n]^T ∈ F^n and x(z) = z_1 x^{(1)} + ··· + z_n x^{(n)}

  A norm is always a pre-norm.

• Let f(·) be a pre-norm on V = R^n or C^n. The function

  f^D(y) = max_{f(x)=1} Re y^H x

  is called the dual norm of f. Also f^D(y) = max_{f(x)=1} |y^H x|. The dual norm of a pre-norm is always a norm.

  ◦ |y^H x| ≤ f(x) f^D(y)
  ◦ (‖·‖_1)^D = ‖·‖_∞;  (‖·‖_∞)^D = ‖·‖_1
  ◦ (‖·‖_2)^D = ‖·‖_2
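The Schur majorization statement (the diagonal of a Hermitian matrix majorizes its eigenvalues, in the smallest-entries convention used here) is easy to confirm numerically on a random real symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 6
X = rng.standard_normal((n, n))
A = (X + X.T) / 2                       # real symmetric (Hermitian)

d = np.sort(np.diag(A))                 # diagonal entries, increasing
lam = np.linalg.eigvalsh(A)             # eigenvalues, increasing

# Sum of the k smallest diagonal entries dominates the sum of the
# k smallest eigenvalues, with equality at k = n (both equal tr A).
tol = 1e-10
for k in range(1, n):
    assert d[:k].sum() >= lam[:k].sum() - tol
assert np.isclose(d.sum(), lam.sum())   # trace equality at k = n
```

The converse theorem above (given majorizing data, a real symmetric matrix with prescribed diagonal and spectrum exists) is the Schur-Horn inverse problem; constructing such a matrix takes more work than this check.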
  ◦ ((‖·‖)^D)^D = ‖·‖

• Let x ∈ C^n be a given vector and let ‖·‖ be a given vector norm on C^n. The set

  {y ∈ C^n : ‖y‖^D ‖x‖ = y^H x = 1}

  is said to be the dual of x with respect to ‖·‖

5.5 Matrix norms

• We call a function ‖·‖ : M_n → R a matrix norm if, for all A, B ∈ M_n, it satisfies the following five axioms:

  (1)  ‖A‖ ≥ 0                        nonnegative
  (1a) ‖A‖ = 0 if and only if A = 0   positive
  (2)  ‖cA‖ = |c| ‖A‖, ∀c ∈ C         homogeneous
  (3)  ‖A + B‖ ≤ ‖A‖ + ‖B‖            triangle inequality
  (4)  ‖AB‖ ≤ ‖A‖ ‖B‖                 submultiplicative

• Let ‖·‖ be a vector norm on C^n. Define ‖·‖ on M_n by:

  ‖A‖ = max_{‖x‖=1} ‖Ax‖

  Then ‖·‖ is a matrix norm: the matrix norm induced by the vector norm ‖·‖.

• The maximum column sum matrix norm ‖·‖_1:

  ‖A‖_1 = max_{1≤j≤n} Σ_{i=1}^n |a_{ij}|

• The maximum row sum matrix norm ‖·‖_∞:

  ‖A‖_∞ = max_{1≤i≤n} Σ_{j=1}^n |a_{ij}|

• The spectral norm ‖·‖_2 is defined on M_n by:

  ‖A‖_2 = max{√λ : λ is an eigenvalue of A^H A}

• The spectral radius ρ(A) of a matrix A ∈ M_n is:

  ρ(A) = max{|λ| : λ is an eigenvalue of A}

• If ‖·‖ is any matrix norm, then ρ(A) ≤ ‖A‖.

• Let A ∈ M_n and ε > 0 be given. There is a matrix norm ‖·‖ such that ρ(A) ≤ ‖A‖ ≤ ρ(A) + ε

• If A ∈ M_n, then the series Σ_{k=0}^∞ a_k A^k converges if there is a matrix norm ‖·‖ on M_n such that the numerical series Σ_{k=0}^∞ |a_k| ‖A‖^k converges, or even if the partial sums of this series are bounded.

• Define ‖A‖^H = ‖A^H‖ (the adjoint of the norm ‖·‖).

• A matrix norm ‖·‖ on M_n is a minimal matrix norm if the only matrix norm N(·) on M_n such that N(A) ≤ ‖A‖ for all A ∈ M_n is N(·) = ‖·‖

• Thm: Let ‖·‖ be a given matrix norm on M_n. Then:

  a) ‖·‖ is an induced norm if and only if ‖·‖^H is an induced norm.
  b) If the matrix norm ‖·‖ is induced by the vector norm ‖·‖, then ‖·‖^H is induced by the dual norm ‖·‖^D
  c) The spectral norm ‖·‖_2 is the only matrix norm on M_n that is both induced and self-adjoint.

5.7 Vector norms on matrices

• If f is a pre-norm on M_n, then lim_{k→∞} [f(A^k)]^{1/k} exists for all A ∈ M_n and:

  lim_{k→∞} [f(A^k)]^{1/k} = ρ(A)

• For each vector norm G(·) on M_n, there is a finite positive constant c(G) such that c(G)G(·) is a matrix norm on M_n. If ‖·‖ is a matrix norm on M_n, and if:

  C_m ‖A‖ ≤ G(A) ≤ C_M ‖A‖ for all A ∈ M_n

  then c(G) ≤ C_M / C_m^2

• The vector norm ‖·‖ on C^n is said to be compatible with the vector norm G(·) on M_n if:

  ‖Ax‖ ≤ G(A) ‖x‖, ∀x ∈ C^n, A ∈ M_n

• If ‖·‖ is a matrix norm on M_n, then there is some vector norm on C^n that is compatible with it.
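The bound ρ(A) ≤ ‖A‖ for every matrix norm, and the Gelfand-type limit ‖A^k‖^{1/k} → ρ(A) from Section 5.7, can both be observed numerically. The matrix size, seed, and the finite k with a loose relative tolerance are illustrative assumptions (the limit is only approached, not reached, at finite k):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5
A = rng.standard_normal((n, n))

rho = np.abs(np.linalg.eigvals(A)).max()     # spectral radius

# rho(A) <= ||A|| for the common matrix norms (column sum, spectral,
# row sum, Frobenius -- all submultiplicative).
for ord_ in (1, 2, np.inf, 'fro'):
    assert rho <= np.linalg.norm(A, ord_) + 1e-12

# Gelfand-type limit: ||A^k||^(1/k) -> rho(A) as k -> infinity.
k = 200
Ak = np.linalg.matrix_power(A, k)
assert np.isclose(np.linalg.norm(Ak, 2) ** (1.0 / k), rho, rtol=0.05)
```

This limit is the standard way ρ(A) < 1 is linked to A^k → 0, which underlies convergence arguments for the matrix power series in Section 5.5.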
