Applied Linear Algebra: Problem Set-5: Instructor: Dwaipayan Mukherjee
1. What are all possible eigenvalues of (a) an orthogonal projection matrix, and (b) a square matrix whose kernel
is equal to its image?
2. Show that for a nilpotent matrix A ∈ Fn×n such that A^n = 0, the matrix A − I must be nonsingular.
3. For A ∈ Cn×n , if every vector in Cn is an eigenvector of A, show that A must be a scalar multiple of the
identity matrix.
4. Suppose for a symmetric matrix A ∈ Rn×n , U is an A-invariant subspace. Show that U⊥ must also be A-
invariant.
5. For a matrix, A ∈ Cn×n , show that if rank(A) = r, then A can have at most r + 1 distinct eigenvalues.
6. For A ∈ Rn×n , B ∈ Rn×m and C ∈ Rp×n , show that the following subspaces are A-invariant: (a) C =
im([B AB . . . A^{n−1}B]), and (b) Ō = ker([C^T A^T C^T . . . (A^{n−1})^T C^T ]^T ).
7. Let W ⊆ Rn be an A-invariant subspace such that ⟨w1 , w2 , . . . , wk ⟩ = W. Show that ∃ S ∈ Rk×k such that
AW = W S, where W = [w1 w2 . . . wk ]. Further, prove that A and S share at least r eigenvalues, where
r = rank(W ).
8. For β ∈ C, which is not an eigenvalue of A ∈ Cn×n , show that v ∈ Cn is an eigenvector of A if and only if it is
an eigenvector of (A − βI)−1 .
9. Show that for a matrix A ∈ Fn×n , with n distinct eigenvalues, there are 2^n A-invariant subspaces.
10. The spectral norm of a matrix, A ∈ Rm×n , is defined as ||A||2 := max_{x ̸= 0} ||Ax||2 / ||x||2 , where x ∈ Rn and the
vector norms in Rn and Rm result from the conventional inner products on those spaces. If the non-zero singular
values of A are given by σ1 ≥ σ2 ≥ . . . ≥ σr > 0 (r ≤ min(m, n)), then prove that ||A||2 = σ1 .
11. The Frobenius norm of a matrix A ∈ Rm×n is given by ||A||F := (Σ_{i=1}^{m} Σ_{j=1}^{n} |a_{ij}|^2)^{1/2} . If the non-zero singular
values of A are given by σ1 ≥ σ2 ≥ . . . ≥ σr > 0 (r ≤ min(m, n)), then prove that ||A||F = (σ1^2 + σ2^2 + . . . + σr^2)^{1/2} .
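Both norm identities in Problems 10 and 11 lend themselves to a quick numerical check. The sketch below is a sanity check, not a proof; the matrix size and random seed are arbitrary choices for illustration.

```python
import numpy as np

# An illustrative test matrix; any real m x n matrix works here.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

sigma = np.linalg.svd(A, compute_uv=False)   # singular values, descending

# Spectral norm equals the largest singular value (Problem 10).
spec = np.linalg.norm(A, 2)
assert np.isclose(spec, sigma[0])

# Frobenius norm equals the root of the sum of squared singular values (Problem 11).
frob = np.linalg.norm(A, 'fro')
assert np.isclose(frob, np.sqrt(np.sum(sigma**2)))
```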
12. For a matrix A ∈ Rm×n whose SVD is given by A = U [Σ_{r×r} 0; 0 0] V^T , the Moore-Penrose pseudoinverse is
given by A† = V [Σ_{r×r}^{−1} 0; 0 0] U^T . Prove the following:
(a) A^{−1} = A† when A is invertible
(b) (A†)† = A
(c) (A†)^T = (A^T)†
(d) A† = (A^T A)^{−1} A^T when A has full column rank and A† = A^T (AA^T)^{−1} when A has full row rank.
(e) (A^T A)† = A† (A^T)† and (AA^T)† = (A^T)† A†
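The identities (a)-(e) can be spot-checked numerically before proving them. A minimal sketch, assuming a random matrix (which has full column rank with probability one), builds A† from the SVD exactly as in the definition and tests (b) and (d):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))          # illustrative; full column rank a.s.

# Pseudoinverse assembled from the (thin) SVD, as defined above.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_dag = Vt.T @ np.diag(1.0 / s) @ U.T

assert np.allclose(A_dag, np.linalg.pinv(A))          # matches NumPy's pinv
assert np.allclose(np.linalg.pinv(A_dag), A)          # (b): (A†)† = A
assert np.allclose(A_dag, np.linalg.inv(A.T @ A) @ A.T)  # (d): full column rank
```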
13. (a) Argue why complex matrices A and A^H (for matrices over C, (·)^H denotes entry-wise conjugation of the
transpose of a matrix) must have eigenvalues that are conjugates of one another.
(b) If u and v are eigenvectors of complex matrices A and A^H , respectively, for two eigenvalues that are not
complex conjugates of one another, show that u and v must be orthogonal to each other.
(c) For a matrix A ∈ Rn×n , prove that if λ is an eigenvalue with algebraic multiplicity of one (also called
a simple eigenvalue), the left and right eigenvectors corresponding to λ cannot be orthogonal, irrespective of
whether A is diagonalizable or not (a left eigenvector is defined as v ̸= 0 such that v^∗ A = λv^∗ for some eigenvalue
λ).
14. Suppose we have an ordered basis, B = {v1 , v2 , . . . , vn }, for V such that ⟨v1 , v2 , . . . , vk ⟩ is A-invariant for
all k = 1, 2, . . . , n. What can you say about the structure of the matrix representation of a linear operator
φ : V → V, given by [φ]B ?
∗ Asst. Professor, Electrical Engineering, Office: EE 214D, e-mail: [email protected]
15. For a vector space V over C and a polynomial p ∈ C[s], consider a linear operator A : V → V and z ∈ C. Show
that z is an eigenvalue of p(A) if and only if z = p(λ) for some eigenvalue, λ, of A.
16. For a linear operator A : V → V on a finite dimensional vector space, a positive integer m, and a vector v ∈ V,
suppose A^{m−1}v ̸= 0, but A^m v = 0. Show that {v, Av, A^2 v, . . . , A^{m−1}v} is linearly independent.
17. Prove that A = uv^T ∈ Rn×n , for u, v ∈ Rn , is diagonalizable if and only if v^T u ̸= 0.
18. Using the principle of mathematical induction, show that a set of eigenvectors corresponding to distinct eigen-
values is linearly independent.
19. If A ∈ Cn×n has a set of n orthonormal eigenvectors, it must be a Hermitian matrix. Prove it or give a
counterexample if it is not true.
20. Which of the following are ideals and why (or why not)?
(a) {all polynomials in C[x] having the constant term equal to zero}
(b) {all polynomials in C[x] containing only even degree terms}
(c) {0, 2, 4} ⊆ Z6
(d) {all polynomials in Z[x] with even coefficients}
(e) R ⊆ R[x]
21. Explain why for the Jordan form representation of a nilpotent matrix, N ∈ Cn×n , the number of Jordan blocks
of size i × i or greater is equal to rank(N^{i−1}) − rank(N^i).
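This rank formula is easy to observe numerically. The sketch below assembles an illustrative nilpotent matrix from Jordan blocks of sizes 3, 2 and 1 (an arbitrary choice) and checks the count for each i:

```python
import numpy as np

def jordan_nilpotent(sizes):
    """Block-diagonal nilpotent matrix: one Jordan block (eigenvalue 0) per size."""
    n = sum(sizes)
    N = np.zeros((n, n))
    start = 0
    for s in sizes:
        for k in range(s - 1):
            N[start + k, start + k + 1] = 1.0   # superdiagonal ones in the block
        start += s
    return N

sizes = [3, 2, 1]                               # illustrative block sizes
N = jordan_nilpotent(sizes)

for i in range(1, max(sizes) + 1):
    blocks_ge_i = sum(1 for s in sizes if s >= i)
    r_prev = np.linalg.matrix_rank(np.linalg.matrix_power(N, i - 1))
    r_curr = np.linalg.matrix_rank(np.linalg.matrix_power(N, i))
    assert blocks_ge_i == r_prev - r_curr       # rank(N^{i-1}) - rank(N^i)
```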
22. For a non-zero vector v ∈ Cn , and a matrix A ∈ Cn×n , the sequence of vectors {v, Av, A^2 v, . . . , A^j v} is called
a Krylov sequence (named after the Russian mathematician Alexei Nikolaevich Krylov), and the subspace Kj
spanned by vectors in the sequence is called a Krylov subspace. Suppose a Krylov subspace for vector v, say
K_{n−1} , spans Cn , while the characteristic polynomial for A is given by χA (x) = Σ_{i=0}^{n} α_i x^i . Represent the
matrix A in terms of the ordered basis given by the corresponding Krylov sequence. What is the minimal
polynomial, µA (x), and why?
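The representation asked for can be previewed numerically: for a generic v the Krylov matrix K = [v Av . . . A^{n−1}v] is invertible, and K^{−1}AK is a companion matrix whose last column carries the (negated) characteristic-polynomial coefficients. A sketch with an arbitrary random A and v, purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
v = rng.standard_normal(n)

# Krylov matrix K = [v  Av  ...  A^{n-1}v]; invertible for generic v.
K = np.column_stack([np.linalg.matrix_power(A, j) @ v for j in range(n)])
assert np.linalg.matrix_rank(K) == n

# A in the Krylov basis: companion form.
C = np.linalg.inv(K) @ A @ K
# Columns 0..n-2 are shifted standard basis vectors (A maps A^j v to A^{j+1} v)...
assert np.allclose(C[:, :-1], np.eye(n)[:, 1:], atol=1e-6)
# ...and the last column holds -[alpha_0, ..., alpha_{n-1}] from the char. poly.
coeffs = np.poly(A)                     # [1, alpha_{n-1}, ..., alpha_0], monic
assert np.allclose(C[:, -1], -coeffs[1:][::-1], atol=1e-6)
```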
23. For A ∈ Rn×n , suppose {w1 , w2 , . . . , wn } is a set of n linearly independent eigenvectors and for w = Σ_{k=1}^{n} k wk ,
we have Aw = Σ_{k=1}^{n} (k^2 + k) wk . Then the minimal polynomial, µA (x), and characteristic polynomial, χA (x),
of A are identical. Prove or disprove this assertion.
24. Is Z[x] a PID? Justify your answer.
25. Show that for any matrix A ∈ Cn×n , we have lim_{k→∞} A^k = 0 if and only if the spectral radius (largest modulus
among all eigenvalues) of A is less than unity.
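The dichotomy is easy to observe numerically. The sketch below (an arbitrary random matrix, rescaled to prescribed spectral radii on either side of unity) contrasts the two regimes:

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4))                 # illustrative base matrix
rho = max(abs(np.linalg.eigvals(B)))            # spectral radius of B

# Spectral radius 0.9 < 1: high powers decay to zero.
A = 0.9 * B / rho
Ak = np.linalg.matrix_power(A, 400)
assert np.max(np.abs(Ak)) < 1e-10

# Spectral radius 1.1 > 1: high powers blow up.
A2 = 1.1 * B / rho
assert np.max(np.abs(np.linalg.matrix_power(A2, 400))) > 1e10
```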
26. Show that a matrix is nilpotent if and only if all its eigenvalues are zero. Can a non-zero diagonalizable matrix
be nilpotent? Justify your answer. Suppose for N ∈ Rn×n , we have trace(N^k) = 0 ∀k ∈ Z+ . Show that N
must be nilpotent.
27. Consider two diagonalizable matrices, A, B ∈ Rn×n . Show that the following statements are equivalent: 1.
AB = BA, and 2. There exists an invertible matrix S ∈ Rn×n such that S −1 AS and S −1 BS are both diagonal.
28. Evaluate cos(A), when A = [−π/2 π/2; π/2 −π/2].
31. Show that det(e^A) = e^{trace(A)}.
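A numerical spot-check of this identity, using an illustrative diagonalizable A = S D S^{−1} so that e^A = S e^D S^{−1} is available without a matrix-exponential routine:

```python
import numpy as np

rng = np.random.default_rng(4)
S = rng.standard_normal((4, 4))          # generically invertible
d = rng.standard_normal(4)               # eigenvalues of A
A = S @ np.diag(d) @ np.linalg.inv(S)    # diagonalizable by construction

# e^A through the same similarity transform.
expA = S @ np.diag(np.exp(d)) @ np.linalg.inv(S)

# det(e^A) = prod(exp(d)) = exp(sum(d)) = e^{trace(A)}
assert np.isclose(np.linalg.det(expA), np.exp(np.trace(A)))
```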
32. Show that whenever AB = BA, we have e^{A+B} = e^A e^B for A, B ∈ Rn×n . Provide a counterexample to this
assertion if AB ̸= BA.
36. (Graph theory) The distance between two nodes u, v ∈ V in an undirected graph G = (V, E), denoted dG (u, v),
is the length of the shortest path between them (i.e., the number of edges in that path). The diameter of
an undirected graph, G, denoted as dia(G) is the maximum distance between any pair of vertices in G. Show
that for a connected, undirected graph G, with n vertices, the adjacency matrix, A(G), satisfies the following
properties:
(a) If u, v are vertices of G, with dG (u, v) = m, then I_n , A(G), . . . , A(G)^m are linearly independent.
(b) If A(G) has k distinct eigenvalues and dia(G) = q, then k > q.
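The key observation behind both parts, that (A(G)^k)_{uv} counts walks of length k and therefore vanishes for k < dG (u, v) while being positive at k = dG (u, v), can be seen on a small example (a path graph, chosen purely for illustration):

```python
import numpy as np

# Path graph on 5 vertices: 0-1-2-3-4, so dG(0, 4) = 4 = dia(G).
n = 5
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0

m = 4  # distance between vertices 0 and 4

# (A^k)_{0,4} = 0 for every k < m: no walk shorter than the distance...
for k in range(m):
    assert np.linalg.matrix_power(A, k)[0, 4] == 0
# ...and (A^m)_{0,4} > 0, which is what forces I, A, ..., A^m to be
# linearly independent (part (a)) and hence k > q in part (b).
assert np.linalg.matrix_power(A, m)[0, 4] > 0
```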
37. Suppose the minimal polynomial of a matrix A ∈ Cn×n , say µA (x), admits a coprime factorization given by
µA (x) = (x − λ)^k p(x), while the algebraic multiplicity of the eigenvalue, λ, equals m. Show that dim(Ker((A −
λI)^k)) = m.
38. If A ∈ Cn×n is a diagonalizable matrix with characteristic polynomial χA (x) = (x − 1)^{k1} (x + 1)^{k2} x^{k3} , then the
rank of A can be increased either by adding the identity matrix to A or by subtracting it from A. Prove or disprove the assertion.
39. Suppose A ∈ Rn×n has a characteristic polynomial given by χA (x) = (x − λ1 )^{k1} (x − λ2 )^{k2} and rank(A − λ1 I) =
n − k1 . Then A must be diagonalizable. Prove it or give a counterexample if it is not true.
40. Prove that for an operator φ : V → V, with dim(V) = n < ∞, we have V = Ker(φ^n) ⊕ im(φ^n).
[I was advised] to read [Camille] Jordan’s ‘Cours d’analyse’; and I shall never forget the astonishment with which
I read that remarkable work, the first inspiration for so many mathematicians of my generation, and learnt for the
first time as I read it what mathematics really meant.
—‘A Mathematician’s Apology (1940)’ G. H. Hardy