Linear Algebra Assignments IITK
MTH102A
(1) Show that matrix multiplication is associative i.e. A(BC) = (AB)C whenever the
multiplication is defined.
(2) Suppose A and B are matrices of order m × n such that Ax̄ = B x̄ for all x̄ ∈ Rn .
Prove that A = B.
(3) Let A = (a_{ij}) be a matrix. The transpose of A, denoted by A^T, is defined to be A^T = (b_{ij}) where b_{ij} = a_{ji}.
(i) Show that (A + B)^T = A^T + B^T, whenever A + B is defined.
(ii) Show that (AB)^T = B^T A^T, whenever AB is defined.
(4) A square matrix A is said to be symmetric if A = A^T and skew-symmetric if A = −A^T.
Prove that a square matrix can be written as the sum of a symmetric and a skew-symmetric matrix.
(5) A square matrix A is said to be nilpotent if A^n = 0 for some natural number n.
(i) Give examples of non-zero nilpotent matrices,
(ii) Prove that if A is nilpotent then A + I is an invertible matrix, where I is the
identity matrix.
(6) The trace of a square matrix A, denoted by Tr(A), is defined to be the sum of all diagonal entries.
(i) Suppose A, B are two square matrices of the same order. Prove that Tr(AB) = Tr(BA).
(ii) Show that for an invertible matrix A, Tr(ABA^{-1}) = Tr(B).
(7) Apply Gauss elimination method to solve the system 2x+y+2z = 3, 3x−y+4z = 7
and 4x + 3y + 6z = 5.
(8) Find the row reduced echelon form of the matrix $\begin{pmatrix} 2 & 6 & -2 \\ 3 & -2 & 8 \end{pmatrix}$.
ASSIGNMENT 1
SOLUTIONS
MTH102A
(1) Show that matrix multiplication is associative i.e. A(BC) = (AB)C when-
ever the multiplication is defined.
Solution. Let A = (a_{ij})_{m×n}, B = (b_{ij})_{n×p}, C = (c_{ij})_{p×q} be matrices.
We first consider A(BC). Let R = (r_{ij})_{n×q} = BC and S = (s_{ij})_{m×q} = A(BC). Then
$$s_{ij} = \sum_{l=1}^{n} a_{il} r_{lj}, \qquad r_{lj} = \sum_{k=1}^{p} b_{lk} c_{kj},$$
so
$$s_{ij} = \sum_{l=1}^{n} a_{il}\Big(\sum_{k=1}^{p} b_{lk} c_{kj}\Big) = \sum_{l=1}^{n}\sum_{k=1}^{p} a_{il} b_{lk} c_{kj}.$$
Similarly for (AB)C, taking R = (r_{ij})_{m×p} = AB and S = (s_{ij})_{m×q} = (AB)C, we get
$$s_{ij} = \sum_{l=1}^{n}\sum_{k=1}^{p} a_{il} b_{lk} c_{kj}.$$
Therefore A(BC) = (AB)C.
(2) Suppose A and B are matrices of order m × n such that Ax̄ = B x̄ for all
x̄ ∈ Rn . Prove that A = B.
Solution. Let A = (a_{ij})_{m×n} and B = (b_{ij})_{m×n}. Take x̄ = e_j where e_j = (0, ..., 1, ..., 0) ∈ R^n (1 at the j-th position). Then Ae_j = Be_j, so the j-th column of A equals the j-th column of B, for j = 1 to n. Hence A = B.
(3) Let A = (a_{ij}) be a matrix. The transpose of A, denoted by A^T, is defined to be A^T = (b_{ij}) where b_{ij} = a_{ji}.
(i) Show that (A + B)^T = A^T + B^T, whenever A + B is defined.
(ii) Show that (AB)^T = B^T A^T, whenever AB is defined.
Solution. (i) Let A = (a_{ij})_{n×n}, A^T = (a'_{ij})_{n×n}, B = (b_{ij})_{n×n}, B^T = (b'_{ij})_{n×n}, A + B = (c_{ij}) and (A + B)^T = (d_{ij}). Then d_{ij} = c_{ji} = a_{ji} + b_{ji} = a'_{ij} + b'_{ij}. Hence (A + B)^T = A^T + B^T.
(ii) Let AB = (c_{ij}) and (AB)^T = (d_{ij}). Then
$$d_{ij} = c_{ji} = \sum_{k=1}^{n} a_{jk} b_{ki} = \sum_{k=1}^{n} a'_{kj} b'_{ik} = \sum_{k=1}^{n} b'_{ik} a'_{kj},$$
which is the (i, j)-th entry of B^T A^T. Hence (AB)^T = B^T A^T.
(4) A square matrix A is said to be symmetric if A = A^T and skew-symmetric if A = −A^T.
Prove that a square matrix can be written as the sum of a symmetric and a skew-symmetric matrix.
Solution. Write $A = \tfrac{1}{2}(A + A^T) + \tfrac{1}{2}(A - A^T)$.
Now $\big(\tfrac{1}{2}(A + A^T)\big)^T = \tfrac{1}{2}\big(A^T + (A^T)^T\big) = \tfrac{1}{2}(A + A^T)$, so the first summand is symmetric, and
$\big(\tfrac{1}{2}(A - A^T)\big)^T = \tfrac{1}{2}\big(A^T - (A^T)^T\big) = -\tfrac{1}{2}(A - A^T)$, so the second summand is skew-symmetric.
(5) A square matrix A is said to be nilpotent if A^n = 0 for some natural number n.
(i) Give examples of non-zero nilpotent matrices.
(ii) Prove that if A is nilpotent then A + I is an invertible matrix, where I is the identity matrix.
Solution. (i) Non-zero strictly upper triangular matrices (i.e. a_{ij} = 0 for all i ≥ j) are nilpotent.
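As a quick sanity check of (i) and (ii), here is a short Python sketch (not part of the original solution; it assumes numpy is available). It takes a strictly upper triangular 3 × 3 matrix A, verifies that A is nilpotent, and verifies that I − A + A² inverts A + I, which is the finite-sum inverse one gets for a nilpotent A.

```python
import numpy as np

# A strictly upper triangular (hence nilpotent) matrix: A^3 = 0.
A = np.array([[0., 1., 2.],
              [0., 0., 3.],
              [0., 0., 0.]])
I = np.eye(3)

print(np.allclose(np.linalg.matrix_power(A, 3), 0))   # True: A is nilpotent
# For nilpotent A with A^3 = 0, (A + I)^(-1) = I - A + A^2 (a finite sum).
inv_candidate = I - A + A @ A
print(np.allclose((A + I) @ inv_candidate, I))        # True: A + I is invertible
```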
(7) Apply Gauss elimination to solve the system 2x + y + 2z = 3, 3x − y + 4z = 7 and 4x + 3y + 6z = 5.
Solution.
$$\left(\begin{array}{ccc|c} 2 & 1 & 2 & 3 \\ 3 & -1 & 4 & 7 \\ 4 & 3 & 6 & 5 \end{array}\right)
\xrightarrow{R_2 \to R_2 - \frac{3}{2}R_1}
\left(\begin{array}{ccc|c} 2 & 1 & 2 & 3 \\ 0 & -5/2 & 1 & 5/2 \\ 4 & 3 & 6 & 5 \end{array}\right)
\xrightarrow{R_3 \to R_3 - 2R_1}
\left(\begin{array}{ccc|c} 2 & 1 & 2 & 3 \\ 0 & -5/2 & 1 & 5/2 \\ 0 & 1 & 2 & -1 \end{array}\right)$$
$$\xrightarrow{R_3 \to R_3 + \frac{2}{5}R_2}
\left(\begin{array}{ccc|c} 2 & 1 & 2 & 3 \\ 0 & -5/2 & 1 & 5/2 \\ 0 & 0 & 12/5 & 0 \end{array}\right)$$
We can thus obtain the solution to the given linear system by solving the equivalent
system
2x + y + 2z = 3
(−5/2)y + z = 5/2
(12/5)z = 0
The solution is x = 2, y = −1 and z = 0.
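The result can be cross-checked numerically; a small Python sketch (assuming numpy, not part of the original solution) solving the same system:

```python
import numpy as np

# Coefficient matrix and right-hand side of the system in problem (7).
A = np.array([[2., 1., 2.],
              [3., -1., 4.],
              [4., 3., 6.]])
b = np.array([3., 7., 5.])

x = np.linalg.solve(A, b)   # numerically equivalent to Gauss elimination
print(x)                    # expected: [ 2. -1.  0.]
```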
(8) Find the row reduced echelon form of the matrix $\begin{pmatrix} 2 & 6 & -2 \\ 3 & -2 & 8 \end{pmatrix}$.
Solution.
" # " # " #
2 6 −2 R1 → 21 R1 1 3 −1 R2 →R2 −3R1 1 3 −1
−−−−−→ −−−−−−−→
3 −2 8 3 −2 8 0 −11 11
" # " #
R2 → −1 R 2 1 3 −1 R →R −3R 1 0 2
−−−−11−−→ −−1
−−−1
−−→ 2
0 1 −1 0 1 −1
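The row reduction can be cross-checked symbolically; a short sketch using sympy (an assumed library, not part of the original solution):

```python
from sympy import Matrix

M = Matrix([[2, 6, -2],
            [3, -2, 8]])
rref_M, pivots = M.rref()   # row reduced echelon form and pivot columns
print(rref_M)               # Matrix([[1, 0, 2], [0, 1, -1]])
print(pivots)               # (0, 1)
```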
ASSIGNMENT 2
MTH102A
(1) Using the Gauss-Jordan elimination method find the inverse of $\begin{pmatrix} 1 & 2 & 2 \\ 2 & 1 & 2 \\ 2 & 2 & 1 \end{pmatrix}$.
(2) Let σ ∈ S_5 be given by
$$\sigma = \begin{pmatrix} 1 & 2 & 3 & 4 & 5 \\ 5 & 4 & 1 & 2 & 3 \end{pmatrix}.$$
(4) Let A be a square matrix of order n. Show that det(A) = 0 if and only if there exists a non-zero vector X = (x_1, x_2, ..., x_n) such that AX^T = 0.
(5) (Vandermonde Matrix) Find the determinant of the following matrix:
$$\begin{pmatrix}
1 & x_1 & x_1^2 & \cdots & x_1^{n-1} \\
1 & x_2 & x_2^2 & \cdots & x_2^{n-1} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & x_n & x_n^2 & \cdots & x_n^{n-1}
\end{pmatrix}$$
(6) Let A be an n × n real matrix. Show that det(adj(A)) = (det(A))^{n−1} and adj(adj(A)) = (det(A))^{n−2} · A.
(7) Using Cramer’s rule solve the following system:
x + 2y + 3z = 1
−x + 2z = 2
−2y + z = −2
ASSIGNMENT 2
MTH102A
(1) Using the Gauss-Jordan elimination method find the inverse of $\begin{pmatrix} 1 & 2 & 2 \\ 2 & 1 & 2 \\ 2 & 2 & 1 \end{pmatrix}$.
Solution.
$$\left(\begin{array}{ccc|ccc} 1 & 2 & 2 & 1 & 0 & 0 \\ 2 & 1 & 2 & 0 & 1 & 0 \\ 2 & 2 & 1 & 0 & 0 & 1 \end{array}\right)
\xrightarrow[R_3 \to R_3 - 2R_1]{R_2 \to R_2 - 2R_1}
\left(\begin{array}{ccc|ccc} 1 & 2 & 2 & 1 & 0 & 0 \\ 0 & -3 & -2 & -2 & 1 & 0 \\ 0 & -2 & -3 & -2 & 0 & 1 \end{array}\right)
\xrightarrow{R_3 \to R_3 - \frac{2}{3}R_2}
\left(\begin{array}{ccc|ccc} 1 & 2 & 2 & 1 & 0 & 0 \\ 0 & -3 & -2 & -2 & 1 & 0 \\ 0 & 0 & -5/3 & -2/3 & -2/3 & 1 \end{array}\right)$$
$$\xrightarrow{R_3 \to -\frac{3}{5}R_3}
\left(\begin{array}{ccc|ccc} 1 & 2 & 2 & 1 & 0 & 0 \\ 0 & -3 & -2 & -2 & 1 & 0 \\ 0 & 0 & 1 & 2/5 & 2/5 & -3/5 \end{array}\right)$$
Continuing the row reduction until the left block becomes the identity yields the inverse on the right.
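Since the printed row reduction is cut off here, a short sympy sketch (assumed library, not part of the original solution) computes the inverse exactly and confirms it:

```python
from sympy import Matrix

A = Matrix([[1, 2, 2],
            [2, 1, 2],
            [2, 2, 1]])
A_inv = A.inv()                      # exact inverse; same answer Gauss-Jordan produces
print(A_inv)                         # Matrix([[-3/5, 2/5, 2/5], [2/5, -3/5, 2/5], [2/5, 2/5, -3/5]])
print(A * A_inv == Matrix.eye(3))    # True
```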
Ans: We have S3 = {identity, (12), (23), (13), (123), (132)}
(7) Using Cramer's rule solve the system x + 2y + 3z = 1, −x + 2z = 2, −2y + z = −2.
Solution. Writing the system as $A(x, y, z)^T = (1, 2, -2)^T$ with $A = \begin{pmatrix} 1 & 2 & 3 \\ -1 & 0 & 2 \\ 0 & -2 & 1 \end{pmatrix}$, we have
$$A_1 = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 0 & 2 \\ -2 & -2 & 1 \end{pmatrix}, \quad
A_2 = \begin{pmatrix} 1 & 1 & 3 \\ -1 & 2 & 2 \\ 0 & -2 & 1 \end{pmatrix}, \quad
A_3 = \begin{pmatrix} 1 & 2 & 1 \\ -1 & 0 & 2 \\ 0 & -2 & -2 \end{pmatrix}.$$
Note that det(A) = 12, det(A_1) = −20, det(A_2) = 13 and det(A_3) = 2. So
$$x = \frac{\det(A_1)}{\det(A)} = \frac{-20}{12}, \qquad y = \frac{\det(A_2)}{\det(A)} = \frac{13}{12}, \qquad z = \frac{\det(A_3)}{\det(A)} = \frac{2}{12}.$$
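A hedged numerical check of this Cramer's rule computation, assuming numpy is available:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [-1., 0., 2.],
              [0., -2., 1.]])
b = np.array([1., 2., -2.])

det_A = np.linalg.det(A)
x = []
for i in range(3):
    Ai = A.copy()
    Ai[:, i] = b                          # replace the i-th column by b
    x.append(np.linalg.det(Ai) / det_A)
print(np.round(x, 4))                     # approx [-1.6667  1.0833  0.1667] = (-20/12, 13/12, 2/12)
print(np.allclose(A @ np.array(x), b))    # True
```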
ASSIGNMENT 3
MTH102A
(ii) {(1, 0, 0, 0), (1, 1, 0, 0), (1, 1, 1, 0), (3, 2, 1, 0)} in R4 as a vector space
over R ,
Extending {v, w} we have the set {(0, 1, 0, 0), (1, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)}, a basis of R^4.
Extending {u, w} we have the set {(1, 0, 0, 0), (1, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)}, a basis of R^4.
(8) Let A be a n × n matrix over R. Then A is invertible iff the row vectors are
linearly independent over R iff the column vectors are linearly independent
over R.
Solutions: We know that A is invertible iff its row reduced echelon form is the identity matrix, iff the system Ax = 0 has only the trivial solution x = 0. Let C_1, C_2, ..., C_n be the column vectors of A. So A is invertible iff b_1C_1 + b_2C_2 + ... + b_nC_n = 0 implies b_i = 0 for all i, that is, iff the column vectors of A are linearly independent over R.
A is invertible iff AT is invertible. So the row vectors of A are linearly
independent over R iff the column vectors of A are linearly independent over
R.
(9) Determine if the set T = {1, x2 − x + 5, 4x3 − x2 + 5x, 3x + 2} is a basis for
the vector space of polynomials in x of degree ≤ 4. Is this set a basis for the
vector space of polynomials in x of degree ≤ 3 ?
Solution. If a · 1 + b(x^2 − x + 5) + c(4x^3 − x^2 + 5x) + d(3x + 2) = 0 then, equating coefficients, we get a = b = c = d = 0. So the set is linearly independent. But x^4 ∉ Span(T), so T is not a basis for the vector space of polynomials in x of degree ≤ 4. However, since the vector space of polynomials in x of degree ≤ 3 has dimension 4, the linearly independent set T = {1, x^2 − x + 5, 4x^3 − x^2 + 5x, 3x + 2} is a basis for it.
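A small numerical sketch (assuming numpy; not part of the original solution) checks the linear independence of T by writing its elements in the monomial basis {1, x, x², x³}:

```python
import numpy as np

# Columns: 1, x^2 - x + 5, 4x^3 - x^2 + 5x, 3x + 2 in the basis {1, x, x^2, x^3}.
M = np.array([[1., 5., 0., 2.],
              [0., -1., 5., 3.],
              [0., 1., -1., 0.],
              [0., 0., 4., 0.]])

print(np.linalg.matrix_rank(M))          # 4: the four polynomials are linearly independent
print(abs(np.linalg.det(M)) > 1e-12)     # True: they form a basis of polynomials of degree <= 3
```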
ASSIGNMENT 4
MTH102A
(1) Let {w1 , w2 , ..., wn } be a basis of a finite dimensional vector space V . Let v be a
non zero vector in V . Show that there exists wi such that if we replace wi by v in
the basis it still remains a basis of V .
(2) Find the dimension of the following vector spaces :
(i) X is the set of all real upper triangular matrices,
(ii) Y is the set of all real symmetric matrices,
(iii) Z is the set of all real skew symmetric matrices,
(iv) W is the set of all real matrices with T r(A) = 0
(3) Let P_2(X, R) be the vector space of all polynomials in X of degree less than or equal to 2. Show that B = {X + 1, X^2 − X + 1, X^2 + X − 1} is a basis of P_2(X, R). Determine the coordinates of the vectors 2X − 1, 1 + X^2, X^2 + 5X − 1 with respect to B.
(4) Let W be a subspace of a finite dimensional vector space V
(i) Show that there is a subspace U of V such that V = W + U and W ∩ U = {0},
(ii) Show that there is no subspace U of V such that W ∩ U = {0} and dim(W ) +
dim(U ) > dim(V ).
(5) Decide which of the following maps are linear transformations:
(i) T : R3 → R3 defined by T (x, y, z) = (x + 2y, z, |x|),
(ii) Let Mn (R) be the set of all n × n real matrices and T : Mn (R) → Mn (R)
defined by
(a) T (A) = AT ,
(b) T (A) = I + A, where I is the identity matrix of order n,
(c) T (A) = BAB −1 , where B ∈ Mn (R) is an invertible matrix.
(6) Let T : C → C be defined by T(z) = z̄ (complex conjugation). Show that T is R-linear but not C-linear.
(7) Let T : R3 → R3 be a linear transformation such that T (1, 0, 0) = (1, 0, 0), T (1, 1, 0) =
(1, 1, 1), T (1, 1, 1) = (1, 1, 0). Find T (x, y, z), Ker(T ), R(T ) (Range of T ). Prove
that T 3 = T .
(8) Find all linear transformations from Rn to R.
(9) Let Pn (X, R) be the vector space of all polynomials in X of degree less or equal to
n. Let T be the differentiation transformation from Pn (X, R) to Pn (X, R). Find
Range(T ) and Ker(T ).
ASSIGNMENT 4
MTH102A
(1) Let {w1 , w2 , ..., wn } be a basis of a finite dimensional vector space V . Let v be a
non zero vector in V . Show that there exists wi such that if we replace wi by v in
the basis it still remains a basis of V .
Solution. Let $v = \sum_{i=1}^{n} a_i w_i$ for some a_1, ..., a_n ∈ F. Since v is non-zero, a_i ≠ 0 for some 1 ≤ i ≤ n. Assume a_1 ≠ 0. Write $w_1 = \frac{1}{a_1} v - \sum_{i=2}^{n} \frac{a_i}{a_1} w_i$. Replace w_1 by v. Clearly {v, w_2, ..., w_n} spans V. Now we show that this set is linearly independent. Let b_1, ..., b_n be such that $b_1 v + \sum_{j=2}^{n} b_j w_j = 0$. Then $b_1 \sum_{i=1}^{n} a_i w_i + \sum_{j=2}^{n} b_j w_j = 0$, which implies $b_1 a_1 w_1 + \sum_{j=2}^{n} (b_1 a_j + b_j) w_j = 0$. Since w_1, w_2, ..., w_n are linearly independent, b_1 a_1 = 0 and b_1 a_j + b_j = 0 for 2 ≤ j ≤ n. Since a_1 ≠ 0, we get b_1 = 0 and hence b_j = 0 for all 1 ≤ j ≤ n.
(2) Solution. Let E_{ij} be the matrix with (i, j)-th entry 1 and all other entries zero, let F_{ij} be the matrix with (i, j)-th and (j, i)-th entries 1 and all other entries zero, and for i ≠ j let D_{ij} be the matrix with (i, j)-th entry 1, (j, i)-th entry −1 and all other entries zero.
(i) The set {E_{ij} : i ≤ j} forms a basis for X. Hence dim(X) = n + (n^2 − n)/2 = n(n + 1)/2.
(ii) The set {F_{ij} : i ≤ j} is a basis for Y. Hence dim(Y) = n(n + 1)/2.
(iii) For a real skew-symmetric matrix all diagonal entries are zero. The set {D_{ij} : i < j} is a basis for Z. Hence dim(Z) = (n^2 − n)/2 = n(n − 1)/2.
(iv) Let A be a matrix with trace zero. Then $\sum_{i=1}^{n} a_{ii} = 0$, so a_{11} = −(a_{22} + ... + a_{nn}). The set {E_{ij} : i ≠ j} ∪ {E_{ii} − E_{i+1,i+1} : 1 ≤ i ≤ n − 1} is a basis of W. Hence dim(W) = n^2 − 1.
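The dimension counts can be sanity-checked for a small n; the sketch below (assuming numpy, with E(i, j) a hypothetical helper for the matrix units E_{ij}) computes the rank of each spanning family for n = 3:

```python
import numpy as np
from itertools import product

n = 3

def E(i, j):
    M = np.zeros((n, n)); M[i, j] = 1.0; return M

upper  = [E(i, j) for i, j in product(range(n), repeat=2) if i <= j]
sym    = [E(i, j) + E(j, i) for i, j in product(range(n), repeat=2) if i <= j]
skew   = [E(i, j) - E(j, i) for i, j in product(range(n), repeat=2) if i < j]
trace0 = [E(i, j) for i, j in product(range(n), repeat=2) if i != j] + \
         [E(i, i) - E(i + 1, i + 1) for i in range(n - 1)]

for name, fam in [("upper", upper), ("sym", sym), ("skew", skew), ("trace0", trace0)]:
    rank = np.linalg.matrix_rank(np.array([M.flatten() for M in fam]))
    print(name, rank)
# upper 6, sym 6, skew 3, trace0 8  (= n(n+1)/2, n(n+1)/2, n(n-1)/2, n^2 - 1 for n = 3)
```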
(3) Let P_2(X, R) be the vector space of all polynomials in X of degree less than or equal to 2. Show that B = {X + 1, X^2 − X + 1, X^2 + X − 1} is a basis of P_2(X, R). Determine the coordinates of the vectors 2X − 1, 1 + X^2, X^2 + 5X − 1 with respect to B.
Solution. First we show that the set B is linearly independent. Let a_0, a_1, a_2 ∈ R be such that a_0(X + 1) + a_1(X^2 − X + 1) + a_2(X^2 + X − 1) = 0. Then (a_0 + a_1 − a_2) + (a_0 − a_1 + a_2)X + (a_1 + a_2)X^2 = 0, and equating coefficients gives a_0 = a_1 = a_2 = 0.
Let p(X) = a_0 + a_1X + a_2X^2 be any element of P_2(X, R). If p(X) = b_0(X + 1) + b_1(X^2 − X + 1) + b_2(X^2 + X − 1), then comparing coefficients gives
$$b_0 = \frac{a_0 + a_1}{2}, \qquad b_1 = \frac{a_0 - a_1 + 2a_2}{4}, \qquad b_2 = \frac{a_1 - a_0 + 2a_2}{4}.$$
Hence B spans P_2(X, R).
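The coordinates asked for in the problem can be read off by solving a linear system; a sympy sketch (assumed library, not part of the original solution), with the basis B written column-wise in the monomial basis {1, X, X²}:

```python
from sympy import Matrix

# Columns: X+1, X^2-X+1, X^2+X-1 in the monomial basis {1, X, X^2}.
B = Matrix([[1, 1, -1],
            [1, -1, 1],
            [0, 1, 1]])

targets = {"2X - 1": Matrix([-1, 2, 0]),
           "1 + X^2": Matrix([1, 0, 1]),
           "X^2 + 5X - 1": Matrix([-1, 5, 1])}

for name, p in targets.items():
    print(name, (B.inv() * p).T)
# Expected coordinates w.r.t. B: (1/2, -3/4, 3/4), (1/2, 3/4, 1/4), (2, -1, 2)
```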
(4) Solution.
(i) Let dim(V ) = n, since V is finite dimensional, W is also finite dimensional. Let
dim(W ) = k and let B = {w1 , ..., wk } be a basis for W . In case k = n we take
U = {0}. If k < n we extend B to a basis B1 = {w1 , ..., wk , vk+1 , ..., vn } of V . Let
U be the subspace of V generated by {vk+1 , ..., vn }.
Let v ∈ V . Then there exist scalars a1 , ..., an ∈ F such that v = a1 w1 + .... +
ak wk +ak+1 vk+1 +...+an vn = (a1 w1 +....+ak wk )+(ak+1 vk+1 +...+an vn ) ∈ W +U .
Now we show that W ∩ U = {0}. Let v ∈ W ∩ U . Then there exist scalars
a1 , ..., ak and bk+1 , ..., bn such that a1 w1 + ... + ak wk = v = bk+1 vk+1 + .... + bn vn .
We have a1 w1 + ... + ak wk − bk+1 vk+1 − .... − bn vn = 0. Since B1 is a basis,
a1 = ... = ak = bk+1 = ... = bn = 0. Hence W ∩ U = {0}.
(ii) Let W ∩ U = {0}. Then dim(W + U ) = dim(W ) + dim(U ) − dim(W ∩ U ) =
dim(W ) + dim(U ). So if dim(W ) + dim(U ) > dim(V ) we have dim(W + U ) >
dim(V ), which is a contradiction as W + U is a subspace of V .
(5) Solution.
(i) Not a linear transformation.
Take a = −1. Then T (a(1, 0, 1)) = T (−1, 0, −1) = (−1, −1, 1) 6= aT ((1, 0, 1)) =
−1(1, 1, 1) = (−1, −1, −1).
(ii) (a) Linear Transformation.
Let A, B ∈ Mn(R) and a ∈ R. Then T(A + aB) = (A + aB)^T = A^T + aB^T = T(A) + aT(B).
(b) Not a linear transformation.
Let O be the zero matrix. Then T (O) = I 6= O.
(c) Linear transformation.
Let P, Q ∈ Mn (R) and a ∈ R. Then T (P + aQ) = B(P + aQ)B −1 =
BP B −1 + aBQB −1 = T (P ) + aT (Q).
(6) Let T : C → C be defined by T(z) = z̄ (complex conjugation). Show that T is R-linear but not C-linear.
(7) Let T : R3 → R3 be a linear transformation such that T (1, 0, 0) = (1, 0, 0), T (1, 1, 0) =
(1, 1, 1), T (1, 1, 1) = (1, 1, 0). Find T (x, y, z), Ker(T ), R(T ) (Range of T ). Prove
that T 3 = T .
(9) Let Pn (X, R) be the vector space of all polynomials in X of degree less or equal to
n. Let T be the differentiation transformation from Pn (X, R) to Pn (X, R). Find
Range(T ) and Ker(T ).
Solution. If f (X) = a0 + a1 X + a2 X 2 + · · · + an X n ∈ Pn (X, R) then T (f ) =
a1 + 2a2 X + · · · + nan X n−1 . So Range(T ) = Span{1, X, · · · X n−1 }.
Ker(T) = {f : T(f) = f′ = 0}. So Ker(T) = {f(X) : f(X) = c for some c ∈ R} = Span{1}, the subspace of constant polynomials (which can be identified with R).
ASSIGNMENT 5
MTH102A
(1) Show that there does not exist a linear map from R5 to R2 whose kernel is
{(x1 , x2 , x3 , x4 , x5 ) : x1 = 3x2 and x3 = x4 = x5 }.
Solutions: If φ : R5 → R2 is any linear map, then the rank-nullity theorem
tells us that
5 = dim(Ker(φ)) + dim(Im(φ)).
Since Im(φ) ⊂ R2 , its dimension is at most 2, so that dim(Ker(φ)) ≥ 3. The
subspace in the question is
Span{(3, 1, 0, 0, 0), (0, 0, 1, 1, 1)},
which is 2-dimensional. So it cannot possibly be the kernel of a linear map φ :
R5 → R2 .
(2) Find a basis for the kernel and the basis for the image of the linear transformation
T : P2(R) → P2(R) given by T(p) = p′ + p′′, where P2(R) is the vector space of polynomials in x of degree less than or equal to 2.
Solution: Note that T(ax^2 + bx + c) = 2ax + 2a + b. Now Ker(T) = {ax^2 + bx + c : 2ax + 2a + b = 0}, that is 2a = 0 and 2a + b = 0, so a = b = 0. So Ker(T) is the set of all constant polynomials, which can be identified with R; a basis is {1}. For the image note that T(1) = 0, T(x) = 1, T(x^2) = 2x + 2, so Range(T) = Span{1, x}, with basis {1, x}.
(3) Find the matrix of the differentiation map on the vector space of polynomials in
x of degree less than or equal to n with respect to the standard basis and verify
the Rank-Nullity theorem.
Solution: The standard basis in this case is B = {1, x, x2 , · · · , xn }. Let D
denotes the differentiation map. Then D(1) = 0, D(x) = 1, · · · , D(xn ) = nxn−1 .
So the matrix with respect to B is
$$[D]_B = \begin{pmatrix}
0 & 1 & 0 & \cdots & 0 \\
0 & 0 & 2 & \cdots & 0 \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
0 & 0 & 0 & \cdots & n \\
0 & 0 & 0 & \cdots & 0
\end{pmatrix}$$
So Range(D) = Span{1, x, ..., x^{n−1}} and Ker(D) = Span{1}. Since {1, x, ..., x^{n−1}} is linearly independent, rank(D) = n and Nullity(D) = 1. So rank(D) + Nullity(D) = n + 1, the dimension of the space, verifying the rank-nullity theorem.
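The count can be verified numerically; a numpy sketch (an assumption, not part of the original solution) building [D]_B for, say, n = 4:

```python
import numpy as np

n = 4                                   # polynomials of degree <= n
D = np.zeros((n + 1, n + 1))
for k in range(1, n + 1):
    D[k - 1, k] = k                     # D(x^k) = k x^(k-1) in the basis {1, x, ..., x^n}

rank = np.linalg.matrix_rank(D)
nullity = (n + 1) - rank
print(rank, nullity, rank + nullity)    # n, 1, n + 1  (here: 4 1 5)
```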
(4) Determine the quotient vector space M3 (R)/W , where M3 (R) is the vector space
of all 3 × 3 real matrices and W is the subspace of symmetric matrices, that is
W = {A ∈ M3 (R) : A = At }.
Solution: Let U = {A ∈ M3(R) : A^T = −A} and define T : M3(R) → U by A ↦ A − A^T. Then T is a linear map since A + B ↦ (A + B) − (A + B)^T = (A − A^T) + (B − B^T) and aA ↦ aA − (aA)^T = a(A − A^T). Moreover T is onto: if B ∈ U then B^T = −B, so T(½B) = ½B − ½B^T = B. We also have Ker(T) = W. So M3(R)/W ≅ U.
(5) Find the matrix of the linear transformation T : R4 → R4 , with respect to
the standard basis of R4 such that Ker(T ) = Span{(2, 1, 1, 2), (1, 2, 1, 1)} and
Range(T ) = Span{(1, 0, 1, 0), (0, 1, 1, 1)}.
Solution: Note that the standard basis of R4 is
B = {(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)}.
Since {(2, 1, 1, 2), (1, 2, 1, 1)} is LI we extend it to a basis of R4 . So we take
{(1, 0, 0, 0), (2, 1, 1, 2), (1, 2, 1, 1), (0, 0, 0, 1)} to be a basis of R4 . Now we define:
T (1, 0, 0, 0) = (1, 0, 1, 0),
T (2, 1, 1, 2) = (0, 0, 0, 0),
T (1, 2, 1, 1) = (0, 0, 0, 0),
T (0, 0, 0, 1) = (0, 1, 1, 1).
Now we have to compute the value of T on the vectors (0, 1, 0, 0) and (0, 0, 1, 0). We have (0, 1, 0, 0) = (1, 2, 1, 1) − (2, 1, 1, 2) + (1, 0, 0, 0) + (0, 0, 0, 1), and since T is linear,
T(0, 1, 0, 0) = T(1, 2, 1, 1) − T(2, 1, 1, 2) + T(1, 0, 0, 0) + T(0, 0, 0, 1)
             = T(1, 0, 0, 0) + T(0, 0, 0, 1)
             = (1, 0, 1, 0) + (0, 1, 1, 1)
             = (1, 1, 2, 1).
Again (0, 0, 1, 0) = (1, 2, 1, 1) − (1, 0, 0, 0) − (0, 0, 0, 1) − 2(0, 1, 0, 0), so that
T(0, 0, 1, 0) = T(1, 2, 1, 1) − T(1, 0, 0, 0) − T(0, 0, 0, 1) − 2T(0, 1, 0, 0)
             = −(1, 0, 1, 0) − (0, 1, 1, 1) − 2(1, 1, 2, 1)
             = (−3, −3, −6, −3).
Summarizing the above, we have obtained:
T(1, 0, 0, 0) = (1, 0, 1, 0)
T(0, 1, 0, 0) = (1, 1, 2, 1)
T(0, 0, 1, 0) = (−3, −3, −6, −3)
T(0, 0, 0, 1) = (0, 1, 1, 1),
and the matrix of T with respect to the standard basis has these vectors as its columns.
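With these values, the kernel and range conditions can be checked numerically; a numpy sketch (not part of the original solution):

```python
import numpy as np

# Columns are T(e1), T(e2), T(e3), T(e4) found above.
M = np.array([[1., 1., -3., 0.],
              [0., 1., -3., 1.],
              [1., 2., -6., 1.],
              [0., 1., -3., 1.]])

print(M @ np.array([2., 1., 1., 2.]))   # [0. 0. 0. 0.]: (2,1,1,2) is in the kernel
print(M @ np.array([1., 2., 1., 1.]))   # [0. 0. 0. 0.]: (1,2,1,1) is in the kernel
print(np.linalg.matrix_rank(M))         # 2: the range is 2-dimensional, spanned by (1,0,1,0), (0,1,1,1)
```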
Ax = y = 0, which means x ∈ Null(A). Therefore Null(A^T A) ⊆ Null(A), so we have shown that Null(A^T A) = Null(A). By the rank-nullity theorem, Nullity(A^T A) + rank(A^T A) equals the number of columns of A^T A, which is the number of columns of A. So rank(A^T A) = rank(A).
(8) Let V be a n dimensional vector space and W be a m dimensional vector space.
Let L(V, W ) be the vector space of all linear maps from V to W . Find a basis for
L(V, W ).
ASSIGNMENT 6
MTH102A
(1) Let A and B be square matrices of same order. Prove that characteristic polyno-
mials of AB and BA are same. Do AB and BA have same minimal polynomial
?
(2) Let A be an n × n matrix. Show that A and AT have same eigen values. Do they
have the same eigen vectors ?
(3) Find the characteristic and minimal polynomial of the following matrix and decide
if this matrix is diagonalizable.
$$A = \begin{pmatrix} 5 & -6 & -6 \\ -1 & 4 & 2 \\ 3 & -6 & -4 \end{pmatrix}$$
(4) Find the inverse of the matrix $\begin{pmatrix} -1 & 2 & 0 \\ 1 & 1 & 0 \\ 2 & -1 & 2 \end{pmatrix}$ using the Cayley-Hamilton theorem.
(5) Diagonalize $A = \begin{pmatrix} 1 & 2 & 0 \\ 2 & 1 & 0 \\ 0 & 0 & -3 \end{pmatrix}$ and compute A^{2019}.
(6) Let W be the subspace of R4 spanned by {u1 = (1, 1, 1, 1), u2 = (2, 4, 1, 5), u3 =
(2, 0, 4, 0)}. Using the standard Euclidean inner product on R4 find an orthogonal
basis for W .
(7) Consider P2(R) together with the inner product $\langle p(x), q(x)\rangle = \int_0^1 p(x)q(x)\, dx$. Find an orthogonal basis for P2(R).
(8) Is the following matrix orthogonally diagonalizable? If yes, then find P such that PAP^T is diagonal.
$$A = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}$$
(9) Find a singular value decomposition of the matrix $A = \begin{pmatrix} -2 & 2 \\ -1 & 1 \\ 2 & -2 \end{pmatrix}$.
(10) Let M_{n×n} be the vector space of all real n × n matrices. Show that ⟨A, B⟩ = Tr(A^T B) is an inner product on M_{n×n}. Show that the orthogonal complement of the subspace of symmetric matrices is the subspace of skew-symmetric matrices, i.e., {A ∈ M_{n×n} | A is symmetric}^⊥ = {A ∈ M_{n×n} | A is skew-symmetric}.
ASSIGNMENT 6
MTH102A
(1) Let A and B be square matrices of same order. Prove that characteristic poly-
nomials of AB and BA are same. Do AB and BA have same minimal polynomial ?
Solution: If one of them is invertible, say A, then A^{-1}(AB)A = BA. So AB and BA, being similar, have the same characteristic polynomial. Now suppose neither of them is invertible.
Define two matrices C and D of order 2n × 2n as follows:
$$C = \begin{pmatrix} xI_n & A \\ B & I_n \end{pmatrix}, \qquad D = \begin{pmatrix} I_n & 0 \\ -B & xI_n \end{pmatrix},$$
where I_n is the identity matrix of order n and x is an indeterminate. Now check that
$$CD = \begin{pmatrix} xI_n - AB & xA \\ 0 & xI_n \end{pmatrix}, \qquad DC = \begin{pmatrix} xI_n & A \\ 0 & xI_n - BA \end{pmatrix},$$
so from det(CD) = det(DC) we get
$$x^n \det(xI_n - AB) = x^n \det(xI_n - BA), \quad \text{and hence} \quad \det(xI_n - AB) = \det(xI_n - BA).$$
So the characteristic polynomials of AB and BA are the same.
For the minimal polynomials, let $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ and $B = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$. Then AB = A whereas BA is the zero matrix. Since A^2 = 0 and A ≠ 0, the minimal polynomial of AB is x^2 whereas the minimal polynomial of BA is x.
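A symbolic check of this example, assuming sympy is available (not part of the original solution):

```python
from sympy import Matrix, symbols

x = symbols('x')
A = Matrix([[0, 1], [0, 0]])
B = Matrix([[0, 0], [0, 1]])
AB, BA = A * B, B * A

print(AB.charpoly(x).as_expr(), BA.charpoly(x).as_expr())   # both x**2
print(AB == Matrix.zeros(2), AB**2 == Matrix.zeros(2))      # False, True: min poly of AB is x^2
print(BA == Matrix.zeros(2))                                # True: min poly of BA is x
```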
(2) Let A be an n × n matrix. Show that A and AT have same eigen values. Do they
have the same eigen vectors ?
Solution: det(A^T − λI) = det(A^T − (λI)^T) = det((A − λI)^T) = det(A − λI). So A and A^T have the same characteristic polynomial and therefore the same eigenvalues.
They need not have the same eigenvectors: let $A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$. Then 1 is an eigenvalue of both A and A^T, but the corresponding eigenvectors are $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $\begin{pmatrix} 0 \\ 1 \end{pmatrix}$ respectively.
(3) Find the characteristic and minimal polynomial of the following matrix and decide
if this matrix is diagonalizable.
$$A = \begin{pmatrix} 5 & -6 & -6 \\ -1 & 4 & 2 \\ 3 & -6 & -4 \end{pmatrix}$$
Solution: The characteristic polynomial is f_A(x) = det(xI − A). Here
$$xI - A = \begin{pmatrix} x-5 & 6 & 6 \\ 1 & x-4 & -2 \\ -3 & 6 & x+4 \end{pmatrix},$$
so f_A(x) = (x − 1)(x − 2)^2. The minimal polynomial is by definition the monic polynomial m(x) of smallest degree such that m(A) = 0. We know that m(x) divides f_A(x) and that they have the same roots, so the possibilities for m(x) are (x − 1)(x − 2) and (x − 1)(x − 2)^2. Since (A − I)(A − 2I) = 0, the minimal polynomial is (x − 1)(x − 2). Since the minimal polynomial is a product of distinct linear factors, the matrix A is diagonalizable.
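A sympy sketch (assumed library, not part of the original solution) confirming the characteristic polynomial, the relation (A − I)(A − 2I) = 0 and diagonalizability:

```python
from sympy import Matrix, symbols, factor

x = symbols('x')
A = Matrix([[5, -6, -6],
            [-1, 4, 2],
            [3, -6, -4]])
I = Matrix.eye(3)

print(factor(A.charpoly(x).as_expr()))             # (x - 1)*(x - 2)**2
print((A - I) * (A - 2 * I) == Matrix.zeros(3))    # True, so the minimal polynomial is (x-1)(x-2)
print(A.is_diagonalizable())                       # True
```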
(4) Find the inverse of the matrix $A = \begin{pmatrix} -1 & 2 & 0 \\ 1 & 1 & 0 \\ 2 & -1 & 2 \end{pmatrix}$ using the Cayley-Hamilton theorem.
Solution: The characteristic polynomial p_A(λ) is
$$p_A(\lambda) = \det(A - \lambda I) = \det\begin{pmatrix} -1-\lambda & 2 & 0 \\ 1 & 1-\lambda & 0 \\ 2 & -1 & 2-\lambda \end{pmatrix} = (-1-\lambda)(1-\lambda)(2-\lambda) - 2(2-\lambda) = (\lambda^2 - 3)(2-\lambda) = -\lambda^3 + 2\lambda^2 + 3\lambda - 6.$$
By the Cayley-Hamilton theorem, −A^3 + 2A^2 + 3A − 6I = 0, so A(−A^2 + 2A + 3I) = 6I and hence $A^{-1} = \frac{1}{6}(-A^2 + 2A + 3I)$.
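A quick numerical check of this Cayley-Hamilton formula for the inverse, assuming numpy is available:

```python
import numpy as np

A = np.array([[-1., 2., 0.],
              [1., 1., 0.],
              [2., -1., 2.]])
I = np.eye(3)

# Cayley-Hamilton: -A^3 + 2A^2 + 3A - 6I = 0, so A^(-1) = (-A^2 + 2A + 3I) / 6.
A_inv = (-A @ A + 2 * A + 3 * I) / 6
print(np.allclose(A @ A_inv, I))              # True
print(np.allclose(A_inv, np.linalg.inv(A)))   # True
```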
(5) Diagonalize $A = \begin{pmatrix} 1 & 2 & 0 \\ 2 & 1 & 0 \\ 0 & 0 & -3 \end{pmatrix}$ and compute A^{2019}.
Solution: The characteristic polynomial is
$$\det(A - \lambda I) = \det\begin{pmatrix} 1-\lambda & 2 & 0 \\ 2 & 1-\lambda & 0 \\ 0 & 0 & -3-\lambda \end{pmatrix} = \big((1-\lambda)^2 - 4\big)(-3-\lambda) = -(\lambda - 3)(\lambda + 1)(\lambda + 3),$$
so the eigenvalues are λ = −3, −1 and 3.
For λ = −3 we solve (A + 3I)x = 0, which row reduces to
$$\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}.$$
Here we can clearly see that all solutions are of the form t(0, 0, 1)^T for some scalar t. Thus v_1 = (0, 0, 1)^T is an eigenvector associated with the eigenvalue λ = −3 of the matrix A.
Following the exact same procedure, we see that the other two eigenvectors are v_2 = (−1, 1, 0)^T and v_3 = (1, 1, 0)^T, with respect to the eigenvalues −1 and 3 respectively.
Let
$$P = \begin{pmatrix} 0 & -1 & 1 \\ 0 & 1 & 1 \\ 1 & 0 & 0 \end{pmatrix}. \quad \text{Then} \quad P^{-1}AP = \begin{pmatrix} -3 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 3 \end{pmatrix}, \qquad P^{-1} = \begin{pmatrix} 0 & 0 & 1 \\ -\tfrac{1}{2} & \tfrac{1}{2} & 0 \\ \tfrac{1}{2} & \tfrac{1}{2} & 0 \end{pmatrix}.$$
Then $A = P\,\mathrm{diag}(-3, -1, 3)\,P^{-1}$, and hence
$$A^{2019} = P \begin{pmatrix} -3^{2019} & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 3^{2019} \end{pmatrix} P^{-1}.$$
Multiplying we get the answer.
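An exact check of the diagonalization and of the A^2019 formula, assuming sympy is available (not part of the original solution):

```python
from sympy import Matrix

A = Matrix([[1, 2, 0],
            [2, 1, 0],
            [0, 0, -3]])
P = Matrix([[0, -1, 1],
            [0, 1, 1],
            [1, 0, 0]])
D = Matrix.diag(-3, -1, 3)

print(P.inv() * A * P == D)       # True: A = P D P^(-1)
k = 2019
Ak = P * Matrix.diag((-3)**k, (-1)**k, 3**k) * P.inv()
print(Ak == A**k)                 # True (exact integer arithmetic)
```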
(6) Let W be the subspace of R4 spanned by {u1 = (1, 1, 1, 1), u2 = (2, 4, 1, 5), u3 =
(2, 0, 4, 0)}. Using the standard Euclidean inner product on R4 find an orthogonal
basis for W .
Solution: It is an inductive process, so first let’s define:
v1 := u1 = (1, 1, 1, 1).
Then, by the Gram-Schmidt orthogonalization process,
$$v_2 := u_2 - \frac{\langle u_2, v_1\rangle}{\langle v_1, v_1\rangle} v_1 = u_2 - \frac{2+4+1+5}{4} v_1 = (2, 4, 1, 5) - 3(1, 1, 1, 1) = (-1, 1, -2, 2),$$
and finally
$$v_3 := u_3 - \frac{\langle u_3, v_2\rangle}{\langle v_2, v_2\rangle} v_2 - \frac{\langle u_3, v_1\rangle}{\langle v_1, v_1\rangle} v_1 = u_3 - \frac{-10}{10} v_2 - \frac{6}{4} v_1 = (2, 0, 4, 0) + (-1, 1, -2, 2) - \frac{3}{2}(1, 1, 1, 1) = \Big(-\frac{1}{2}, -\frac{1}{2}, \frac{1}{2}, \frac{1}{2}\Big).$$
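The same Gram-Schmidt computation as a short numpy sketch (an assumption, not part of the original solution):

```python
import numpy as np

u = [np.array([1., 1., 1., 1.]),
     np.array([2., 4., 1., 5.]),
     np.array([2., 0., 4., 0.])]

v = []
for ui in u:                         # classical Gram-Schmidt, without normalization
    w = ui - sum((ui @ vj) / (vj @ vj) * vj for vj in v)
    v.append(w)

for vi in v:
    print(vi)
# [1. 1. 1. 1.], [-1.  1. -2.  2.], [-0.5 -0.5  0.5  0.5]
```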
(7) Consider P2(R) together with the inner product
$$\langle p(x), q(x)\rangle = \int_0^1 p(x)q(x)\, dx.$$
Find an orthogonal basis for P2 (R).
Solution: The standard basis for P2 (R) is B = {1, x, x2 }.
Using the Gram-Schmidt process:
Let v_1 = 1 and let
$$v_2 = x - \frac{\langle x, v_1\rangle}{\langle v_1, v_1\rangle} v_1 = x - \frac{\langle x, 1\rangle}{\langle 1, 1\rangle}\cdot 1.$$
Since $\langle p(x), q(x)\rangle = \int_0^1 p(x)q(x)\, dx$,
$$\langle x, 1\rangle = \int_0^1 x\, dx = \frac{1}{2}, \qquad \langle 1, 1\rangle = \int_0^1 1\, dx = 1,$$
so $v_2 = x - \frac{1}{2}$. Finally
$$v_3 = x^2 - \frac{\langle x^2, v_1\rangle}{\langle v_1, v_1\rangle} v_1 - \frac{\langle x^2, v_2\rangle}{\langle v_2, v_2\rangle} v_2 = x^2 - x + \frac{1}{6}.$$
So the set {1, x − 1/2, x^2 − x + 1/6} is an orthogonal basis for P2(R).
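A sympy sketch (assumed library, not part of the original solution) verifying that the three polynomials are pairwise orthogonal for this inner product:

```python
from sympy import symbols, integrate, Rational

x = symbols('x')
basis = [1, x - Rational(1, 2), x**2 - x + Rational(1, 6)]

for i in range(3):
    for j in range(i + 1, 3):
        # Inner product <p, q> = integral of p*q over [0, 1]; each pair gives 0.
        print(integrate(basis[i] * basis[j], (x, 0, 1)))
```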
(8) Is the following matrix orthogonally diagonalizable? If yes, then find P such that PAP^T is diagonal.
$$A = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}$$
Solutions: Since the matrix A is symmetric, we know that it can be orthog-
onally diagonalized. We first find its eigenvalues by solving the characteristic
equation:
$$0 = \det(A - \lambda I) = \det\begin{pmatrix} 1-\lambda & 1 & 1 \\ 1 & 1-\lambda & 1 \\ 1 & 1 & 1-\lambda \end{pmatrix} = -(\lambda - 3)\lambda^2 \implies \lambda_1 = 0,\ \lambda_2 = 0,\ \lambda_3 = 3.$$
For λ = 0 we row reduce the augmented system:
$$\left(\begin{array}{ccc|c} 1 & 1 & 1 & 0 \\ 1 & 1 & 1 & 0 \\ 1 & 1 & 1 & 0 \end{array}\right) \implies \left(\begin{array}{ccc|c} 1 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right) \implies x = \begin{pmatrix} s \\ t \\ -s-t \end{pmatrix} = s\begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix} + t\begin{pmatrix} 0 \\ 1 \\ -1 \end{pmatrix}.$$
So (1, 0, −1)^T and (0, 1, −1)^T are eigenvectors with respect to the eigenvalue 0. By orthonormalizing them, we obtain
$$\frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}, \qquad \frac{1}{\sqrt{6}}\begin{pmatrix} -1 \\ 2 \\ -1 \end{pmatrix}.$$
We finally find an eigenvector corresponding to λ = 3:
$$\left(\begin{array}{ccc|c} -2 & 1 & 1 & 0 \\ 1 & -2 & 1 & 0 \\ 1 & 1 & -2 & 0 \end{array}\right) \implies \left(\begin{array}{ccc|c} 0 & -3 & 3 & 0 \\ 1 & -2 & 1 & 0 \\ 0 & 3 & -3 & 0 \end{array}\right) \implies \left(\begin{array}{ccc|c} 0 & -1 & 1 & 0 \\ 1 & -1 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right) \implies x = s\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}.$$
By normalizing it, we obtain $\frac{1}{\sqrt{3}}(1, 1, 1)^T$. Hence A is orthogonally diagonalized by the orthogonal matrix
$$P = \begin{pmatrix} 1/\sqrt{2} & -1/\sqrt{6} & 1/\sqrt{3} \\ 0 & 2/\sqrt{6} & 1/\sqrt{3} \\ -1/\sqrt{2} & -1/\sqrt{6} & 1/\sqrt{3} \end{pmatrix}.$$
Furthermore,
$$P^T A P = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 3 \end{pmatrix}.$$
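Numerically, numpy's eigh (assuming numpy is available; not part of the original solution) produces an orthogonal matrix with the same property, up to the ordering and signs of the eigenvectors:

```python
import numpy as np

A = np.ones((3, 3))
eigvals, P = np.linalg.eigh(A)          # for symmetric A, eigh returns orthonormal eigenvectors
print(np.round(eigvals, 10))            # approximately [0. 0. 3.]
print(np.allclose(P.T @ A @ P, np.diag(eigvals)))   # True: P^T A P is diagonal
print(np.allclose(P @ P.T, np.eye(3)))  # True: P is orthogonal
```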
(9) Find a singular value decomposition of the matrix $A = \begin{pmatrix} -2 & 2 \\ -1 & 1 \\ 2 & -2 \end{pmatrix}$.
Solution: Recall that the singular values of A are the square roots of the nonzero eigenvalues of A^T A (or AA^T). In this case
$$A^T A = \begin{pmatrix} 9 & -9 \\ -9 & 9 \end{pmatrix},$$
and the eigenvalues of A^T A are 0 and 18, so the only singular value is √18.
To find the matrix V, we find eigenvectors of A^T A and normalize them. For the eigenvalue λ = 18 a normalized eigenvector is $\frac{1}{\sqrt{2}}\begin{pmatrix} -1 \\ 1 \end{pmatrix}$; for λ = 0 it is $\frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 1 \end{pmatrix}$, and so
$$V = \frac{1}{\sqrt{2}}\begin{pmatrix} -1 & 1 \\ 1 & 1 \end{pmatrix}.$$
Σ is the 3 × 2 matrix whose diagonal is composed of the singular values:
$$\Sigma = \begin{pmatrix} \sqrt{18} & 0 \\ 0 & 0 \\ 0 & 0 \end{pmatrix}.$$
Finally AV = UΣ and the columns of U are eigenvectors of AA^T; solving this system of equations we get
$$U = \frac{1}{3}\begin{pmatrix} 2 & \tfrac{3}{\sqrt{2}} & -\tfrac{1}{\sqrt{2}} \\ 1 & 0 & \sqrt{8} \\ -2 & \tfrac{3}{\sqrt{2}} & \tfrac{1}{\sqrt{2}} \end{pmatrix}.$$
We have A = UΣV^T.
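A numerical cross-check with numpy (an assumption, not part of the original solution; numpy may choose signs differently but reconstructs the same A):

```python
import numpy as np

A = np.array([[-2., 2.],
              [-1., 1.],
              [2., -2.]])

U, s, Vt = np.linalg.svd(A)
print(s)                                            # approximately [sqrt(18), 0]
print(np.allclose(U[:, :2] @ np.diag(s) @ Vt, A))   # True: A = U Sigma V^T
```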
(10) Let M_{n×n} be the vector space of all real n × n matrices. Show that ⟨A, B⟩ = Tr(A^T B) is an inner product on M_{n×n}. Show that the orthogonal complement of the subspace of symmetric matrices is the subspace of skew-symmetric matrices, i.e., {A ∈ M_{n×n} | A is symmetric}^⊥ = {A ∈ M_{n×n} | A is skew-symmetric}.
Solution: Showing that ⟨A, B⟩ = Tr(A^T B) is an inner product is easy.
Let A be symmetric and B be skew-symmetric. First we prove that ⟨A, B⟩ = 0. Indeed,
⟨A, B⟩ = Tr(A^T B) = Tr(AB) = Tr(BA) = Tr(−B^T A) = −⟨B, A⟩ = −⟨A, B⟩, so ⟨A, B⟩ = 0.
Note that $\langle A, B\rangle = \mathrm{Tr}(A^T B) = \sum_{i,j} a_{ij} b_{ij}$. Let E_{ij} be the zero matrix except for a 1 in the (i, j)-th position. If B is orthogonal to every symmetric matrix, then ⟨E_{ii}, B⟩ = b_{ii} = 0 and ⟨E_{ij} + E_{ji}, B⟩ = b_{ij} + b_{ji} = 0 for all i, j, so B^T = −B, i.e., B is skew-symmetric.