Matrices and Determinants

1. DEFINITION OF A MATRIX
A set of mn numbers (real or complex) arranged in the form of a rectangular array having
'm' rows & 'n' columns is called an m × n matrix. [to be read as m by n matrix].
It is usually written as
    [ a11  a12  ...  a1n ]
A = [ a21  a22  ...  a2n ]
    [ ...  ...       ... ]
    [ am1  am2  ...  amn ]
It is represented as A = [aij]m×n.
The numbers a11, a12, … etc. are called the elements of the matrix. The element aij belongs to the ith row and jth column.
3 2 7 
e.g. 
A = 5 − 4 6  is a 3 × 3 matrix.
4 8 − 12

2. SPECIAL TYPES OF MATRICES


(i) Square Matrix
An m × n matrix for which m = n (i.e. the number of rows is equal to the number of columns) is called a square matrix of order n. The elements aij of a square matrix A = [aij]n×n for which i = j, i.e. the elements a11, a22, …, ann, are called the diagonal elements.
The matrix
A = [ 0  1  2  3 ]
    [ 2  3  1  0 ]
    [ 5  0  1  1 ]   is a square matrix of order 4.
    [ 0  0  1  2 ]
The elements 0, 3, 1, 2 are the diagonal elements of A.
(ii) Null Matrix or Zero Matrix
The m × n matrix whose all elements are zero is called a null matrix of order
m × n. It is usually denoted by O.

(iii) Unit Matrix or Identity Matrix


A square matrix each of whose principal diagonal elements is 1 and each of whose non-diagonal elements is zero is called a unit matrix or an identity matrix and is denoted by I. In will denote a unit matrix of order n.
e.g. I3 = [ 1  0  0 ]
          [ 0  1  0 ]   &   I2 = [ 1  0 ]
          [ 0  0  1 ]            [ 0  1 ]

(iv) Scalar Matrix

A diagonal matrix whose diagonal elements are all equal is called a scalar matrix.
e.g. A = [ k  0  0 ]
         [ 0  k  0 ]
         [ 0  0  k ]

(v) Diagonal Matrix


A square matrix A = [aij]n×n is called a diagonal matrix if aij = 0 for all i ≠ j, i.e.,
A = [ a11   0    0   ...   0  ]
    [  0   a22   0   ...   0  ]
    [ ...                 ... ]   (also represented as diag(a11, a22, …, ann)).
    [  0    0    0   ...  ann ]

(vi) Upper Triangular Matrix


A square matrix A = [aij]n×n is called an upper triangular matrix if aij = 0 whenever i > j, e.g.
A = [ a11  a12  a13  ...  a1n ]
    [  0   a22  a23  ...  a2n ]
    [ ...                 ... ]
    [  0    0    0   ...  ann ]

(vii) Lower Triangular Matrix


A square matrix A = [aij]n×n is called a lower triangular matrix if aij = 0 whenever i < j, i.e.,
A = [ a11   0    0   ...   0  ]
    [ a21  a22   0   ...   0  ]
    [ ...                 ... ]
    [ an1  an2  an3  ...  ann ]

(viii) Row Matrix


Any 1 × n matrix which has only one row & n columns is called a row matrix.
e.g. X = [ 2  7  −8  5 ]1×4 is a row matrix.
(ix) Column Matrix
Any m × 1 matrix which has only one column & m rows is called a column matrix.
 1
e.g. Y = 3 is a column matrix.
5
3× 1

3. EQUALITY OF TWO MATRICES


Two matrices A = [aij] and B = [bij] are said to be equal if
(i) they are of the same order
and (ii) the elements in the corresponding places of the two matrices are the same, i.e. aij = bij for each pair of subscripts i and j. If two matrices A and B are equal, we write A = B.
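Both conditions translate directly into code. A minimal sketch (plain Python; the function name is illustrative):

```python
def matrices_equal(A, B):
    """Two matrices are equal iff (i) they have the same order and
    (ii) they agree entrywise: a_ij == b_ij for every i, j."""
    # (i) same order: same number of rows and same row lengths
    if len(A) != len(B) or any(len(ra) != len(rb) for ra, rb in zip(A, B)):
        return False
    # (ii) corresponding elements are equal
    return all(ra == rb for ra, rb in zip(A, B))

print(matrices_equal([[1, 2], [3, 4]], [[1, 2], [3, 4]]))  # True
print(matrices_equal([[1, 2]], [[1], [2]]))                # False: different order
```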

4. ADDITION OF MATRICES

Let A and B be two matrices of the same order m × n. Then their sum is defined to be the
matrix of order m × n obtained by adding the corresponding elements of A and B.
e.g. If A = [ a11  a12 ]  and  B = [ b11  b12 ] ,  then  A + B = [ a11 + b11  a12 + b12 ] .
            [ a21  a22 ]           [ b21  b22 ]                  [ a21 + b21  a22 + b22 ]

5. PROPERTIES OF MATRIX ADDITION

(i) Matrix addition is commutative


If A and B be two m × n matrices, then A + B = B + A.

(ii) Matrix addition is associative


If A, B, C be three matrices each of the order m × n, then (A + B) + C = A + (B + C)

(iii) Existence of additive identity


If O be the m × n null matrix, then A + O = A = O + A for every m × n matrix A. O is
called additive identity.

(iv) Existence of the additive inverse


Let A = [aij]m × n. Then the negative of the matrix A is defined as the matrix
[−aij]m × n and is denoted by −A, called the additive inverse of A,
as A + (–A) = (–A) + A = O.
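Entrywise addition and the four properties above are easy to check in code. A sketch (plain Python; the helper names are illustrative):

```python
def mat_add(A, B):
    """Sum of two matrices of the same order: add corresponding elements."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_neg(A):
    """Additive inverse -A: negate every element."""
    return [[-a for a in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
O = [[0, 0], [0, 0]]                     # the 2 x 2 null matrix

print(mat_add(A, B))                     # [[6, 8], [10, 12]]
print(mat_add(A, B) == mat_add(B, A))    # commutativity: True
print(mat_add(A, O) == A)                # additive identity: True
print(mat_add(A, mat_neg(A)) == O)       # additive inverse: True
```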

6. MULTIPLICATION OF TWO MATRICES

Let A = [aij]m×n and B = [bjk]n×p be two matrices such that the number of columns in A is equal to the number of rows in B. Then the m × p matrix C = [cik]m×p, where
cik = Σ (j = 1 to n) aij bjk = ai1 b1k + ai2 b2k + … + ain bnk,
is called the product of the matrices A and B in that order, and we write C = AB.
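The definition is exactly a triple loop: one index over rows of A, one over columns of B, and the summation index j. A sketch (plain Python; the function name is illustrative):

```python
def mat_mul(A, B):
    """Product C = AB, with c_ik = sum over j of a_ij * b_jk.
    Requires: number of columns of A == number of rows of B."""
    m, n, p = len(A), len(B), len(B[0])
    assert len(A[0]) == n, "columns of A must equal rows of B"
    C = [[0] * p for _ in range(m)]
    for i in range(m):
        for k in range(p):
            C[i][k] = sum(A[i][j] * B[j][k] for j in range(n))
    return C

# A 2 x 3 matrix times a 3 x 2 matrix gives a 2 x 2 matrix.
A = [[1, 2, 3],
     [4, 5, 6]]
B = [[7,  8],
     [9, 10],
     [11, 12]]
print(mat_mul(A, B))   # [[58, 64], [139, 154]]
```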

7. PROPERTIES OF MATRIX MULTIPLICATION

(i) Matrix multiplication is associative


i.e. (AB)C = A(BC), if A, B, C are m × n, n × p, p × q matrices respectively.

(ii) Multiplication of matrices is distributive over addition of matrices.


i.e., A(B + C) = AB + AC

(iii) Existence of multiplicative identity of square matrices.


If A is a square matrix of order n and In is the identity matrix of order n,
then A In = In A = A.

(iv) Whenever AB and BA both exist it is not necessary that AB = BA.


1 0  0 1 
e.g. A=   , B = 1 0 
 0 − 1  

Page 3
1 0  0 1  0 1
then AB =     = − 1 0 
 0 − 1  1 0   
0 1   1 0   0 − 1
BA =     = 1
 1 0   0 − 1  0 
Thus AB ≠ BA

(v) The product of two matrices can be a zero matrix while neither of them is a zero
matrix.
0 1  1 0  0 0 
e.g. If A =   and B = 0 0  then AB = 0 0  while neither A nor B is a
0 0     
null matrix.

(vi) In the case of matrix multiplication if AB = 0, then it doesn't necessarily imply that
A = 0 or B = 0 or BA = 0.
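Both cautions can be verified with the matrices from the examples above. A sketch, reusing an illustrative `mat_mul` helper:

```python
def mat_mul(A, B):
    """c_ik = sum over j of a_ij * b_jk (zip(*B) iterates the columns of B)."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

# (iv) Non-commutativity: AB and BA both exist but differ.
A = [[1, 0], [0, -1]]
B = [[0, 1], [1, 0]]
print(mat_mul(A, B))   # [[0, 1], [-1, 0]]
print(mat_mul(B, A))   # [[0, -1], [1, 0]]

# (v)/(vi) Zero product with non-zero factors: AB = O, yet A != O, B != O,
# and BA is not zero either.
A = [[0, 1], [0, 0]]
B = [[1, 0], [0, 0]]
print(mat_mul(A, B))   # [[0, 0], [0, 0]]
print(mat_mul(B, A))   # [[0, 1], [0, 0]]
```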

8. TRACE OF A MATRIX

Let A be a square matrix of order n. The sum of the diagonal elements of A is called the
trace of A.
∴ trace(A) = Σ (i = 1 to n) aii = a11 + a22 + … + ann.
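The trace of the order-4 square matrix from section 2 can be computed with a one-line helper (plain Python; the function name is illustrative):

```python
def trace(A):
    """Sum of the diagonal elements a_11 + a_22 + ... + a_nn of a square matrix."""
    assert all(len(row) == len(A) for row in A), "trace is defined for square matrices"
    return sum(A[i][i] for i in range(len(A)))

A = [[0, 1, 2, 3],
     [2, 3, 1, 0],
     [5, 0, 1, 1],
     [0, 0, 1, 2]]
print(trace(A))   # 0 + 3 + 1 + 2 = 6
```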

9. TRANSPOSE OF A MATRIX

Let A = [aij]m×n. Then the n × m matrix obtained from A by changing its rows into columns and columns into rows is called the transpose of A and is denoted by A′ or by AT.
e.g. If A = [ 1  2  3 ] ,  then  AT = [ 1  4  7 ]
            [ 4  5  6 ]               [ 2  5  8 ]
            [ 7  8  9 ]               [ 3  6  9 ]

10. TRANSPOSED CONJUGATE OF A MATRIX


The transpose of the conjugate of a matrix A is called the transposed conjugate of A and is denoted by A*.
e.g. If A = [ 1+2i  2−3i  3+4i ] ,  then  A* = (A̅)′ = [ 1−2i  4+5i    8  ]
            [ 4−5i  5+6i  6−7i ]                       [ 2+3i  5−6i  7−8i ]
            [  8    7+8i    7  ]                       [ 3−4i  6+7i    7  ]

11. PROPERTIES OF TRANSPOSE AND CONJUGATE TRANSPOSE OF A


MATRIX
(i) (A′)′ = A, (A*)* = A
(ii) (A + B)′ = A′ + B′, (A + B)* = A* + B*
(iii) (kA)′ = kA′, (kA)* = k̄A*, k being a scalar (k̄ denoting the conjugate of k).
(iv) (AB)′ = B′A′, (AB)* = B*A*
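Python's complex numbers make A* a two-step one-liner: conjugate every entry, then transpose. A sketch (function name is illustrative; `j` is Python's imaginary unit):

```python
def conj_transpose(A):
    """A*: conjugate every entry, then transpose (zip(*A) gives the columns)."""
    return [[z.conjugate() for z in row] for row in zip(*A)]

A = [[1 + 2j, 2 - 3j],
     [4 - 5j, 5 + 6j]]
print(conj_transpose(A))   # [[(1-2j), (4+5j)], [(2+3j), (5-6j)]]
```

Property (i), (A*)* = A, can be checked by applying the function twice.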

12. SOME MORE SPECIAL TYPE OF MATRICES
(i) Symmetric Matrix
A matrix A = [aij] n×n is symmetric if A = A′. Note that aij = aji for such a matrix,
∀ 1 ≤ i, j ≤ n.

(ii) Skew Symmetric Matrix


A matrix A = [aij]n×n is skew symmetric if A = −A′. Note that aij = −aji for such a matrix, ∀ 1 ≤ i, j ≤ n. If i = j, then aii = −aii ⇒ aii = 0. Thus in a skew symmetric matrix the diagonal entries are zeros.
Note that every square matrix can be written (uniquely) as the sum of a symmetric and a skew symmetric matrix, i.e.,
A = (1/2)(A + A′) + (1/2)(A − A′),
where (1/2)(A + A′) is symmetric and (1/2)(A − A′) is skew symmetric.
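The symmetric/skew decomposition above can be computed entrywise. A sketch (plain Python; helper names are illustrative):

```python
def transpose(A):
    return [list(row) for row in zip(*A)]

def sym_skew_parts(A):
    """Split a square A into its symmetric part (A + A')/2
    and its skew symmetric part (A - A')/2; their sum is A."""
    At = transpose(A)
    n = len(A)
    S = [[(A[i][j] + At[i][j]) / 2 for j in range(n)] for i in range(n)]
    K = [[(A[i][j] - At[i][j]) / 2 for j in range(n)] for i in range(n)]
    return S, K

A = [[1, 2], [4, 3]]
S, K = sym_skew_parts(A)
print(S)   # [[1.0, 3.0], [3.0, 3.0]]   symmetric: S = S'
print(K)   # [[0.0, -1.0], [1.0, 0.0]]  skew symmetric: zero diagonal, K = -K'
```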

(iii) Hermitian Matrix
A matrix A = [aij]n×n is hermitian if A* = A. Note that aij = a̅ji for such a matrix. Thus in a hermitian matrix aii = a̅ii ⇒ the diagonal entries of a hermitian matrix are real.

(iv) Skew Hermitian Matrix


A matrix A = [aij]n×n is skew hermitian if A* = −A.
Note that aij = −a̅ji. Thus in a skew hermitian matrix aii = −a̅ii ⇒ the diagonal entries of a skew hermitian matrix are either zero or purely imaginary.
Note that every square matrix can be written (uniquely) as the sum of a hermitian and a skew hermitian matrix, i.e.,
A = (1/2)(A + A*) + (1/2)(A − A*),
where (1/2)(A + A*) is hermitian and (1/2)(A − A*) is skew hermitian.

(v) Orthogonal Matrix


A matrix A = [aij]n×n is orthogonal if AA′ = In. Thus in a 3 × 3 orthogonal matrix the rows (columns) form an orthogonal system of unit vectors, and vice versa. For example, (1/√2)(î + ĵ), (1/√2)(î − ĵ) and k̂ form an orthogonal system of unit vectors. The corresponding matrix is
A = [ 1/√2   1/√2   0 ]
    [ 1/√2  −1/√2   0 ]
    [  0      0     1 ]
Note that AA′ = I3. Thus A is an orthogonal matrix.
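The check AA′ = I3 for this matrix can be done numerically. A sketch (plain Python; since 1/√2 is a float, the comparison uses a tolerance):

```python
import math

def transpose(A):
    return [list(r) for r in zip(*A)]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

s = 1 / math.sqrt(2)
A = [[s,  s, 0],
     [s, -s, 0],
     [0,  0, 1]]

P = mat_mul(A, transpose(A))
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# Compare entrywise with a tolerance, as floating-point arithmetic is inexact.
ok = all(abs(P[i][j] - I3[i][j]) < 1e-12 for i in range(3) for j in range(3))
print(ok)   # True: A A' = I3, so A is orthogonal
```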

(vi) Unitary Matrix


A matrix A = [aij]n×n is unitary if AA* = In. For example the matrix
(1/√3) [ 1    1+i ]
       [ 1−i  −1  ]
is unitary.

DETERMINANT
13. The equations a1x + b1y = 0 and a2x + b2y = 0 in x and y have a unique solution if a1b2 − a2b1 ≠ 0. We write a1b2 − a2b1 as
| a1  b1 |
| a2  b2 |
and call it a determinant of order 2.
Similarly the equations a1x + b1y + c1z = 0, a2x + b2y + c2z = 0 and a3x + b3y + c3z = 0 have a unique solution if a1(b2c3 − b3c2) + b1(a3c2 − a2c3) + c1(a2b3 − a3b2) ≠ 0, i.e.,
| a1  b1  c1 |
| a2  b2  c2 | ≠ 0.
| a3  b3  c3 |
The numbers ai, bi, ci (i = 1, 2, 3) are called the elements of the determinant.

The determinant obtained by deleting the ith row and jth column is called the minor of the element at the ith row and jth column. We shall denote it by Mij. The cofactor of this element is (−1)^(i+j) (minor), i.e. Cij = (−1)^(i+j) Mij.
Let A = [aij]3×3 be a matrix; then the corresponding determinant is
    | a11  a12  a13 |
D = | a21  a22  a23 |
    | a31  a32  a33 |
It is easy to see that D = a11C11 + a12C12 + a13C13 (we say that we have expanded the determinant D along the first row). In fact the value of D can be obtained by expanding it along any row or along any column. Further, note that if the elements of a row (column) are multiplied by the cofactors of another row (column) and then added, the result is zero.
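The cofactor expansion along the first row gives a direct (if inefficient) recursive algorithm. A sketch (plain Python; fine for the small matrices in these notes):

```python
def det(A):
    """Determinant by cofactor expansion along the first row:
    det A = sum over j of a_1j * C_1j, where C_1j = (-1)**(1+j) * M_1j."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor M_1j: delete row 1 and column j+1.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                      # 1*4 - 2*3 = -2
print(det([[3, 2, 7], [5, -4, 6], [4, 8, -12]]))  # 560
```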

14. PROPERTIES OF DETERMINANTS


(i) The determinant remains unaltered if its rows are changed into columns and the
columns into rows.
| a1  b1  c1 |   | a1  a2  a3 |
| a2  b2  c2 | = | b1  b2  b3 |
| a3  b3  c3 |   | c1  c2  c3 |
Thus any property true for rows will also be true for columns.

(ii) If all the elements of a row (or column) are zero, then the determinant is zero.
e.g., | 0  b1  c1 |        | a1  b1  c1 |
      | 0  b2  c2 | = 0,   |  0   0   0 | = 0
      | 0  b3  c3 |        | a3  b3  c3 |

(iii) If any two rows or any two columns of a determinant are identical, then the
determinant is zero.
| a1  a1  c1 |
| a2  a2  c2 | = 0
| a3  a3  c3 |

Page 6
(iv) The interchange of any two rows (columns) of the determinant results in a change of its sign, i.e.,
| a1  b1  c1 |     | b1  a1  c1 |
| a2  b2  c2 | = − | b2  a2  c2 |
| a3  b3  c3 |     | b3  a3  c3 |

(v) If all the elements of a row (column) of a determinant are multiplied by a non-zero
constant, then the determinant gets multiplied by that constant.
e.g., | a1  kb1  c1 |       | a1  b1  c1 |
      | a2  kb2  c2 | = k · | a2  b2  c2 |
      | a3  kb3  c3 |       | a3  b3  c3 |
and
    | a1  b1  c1 |   | ka1  kb1  kc1 |
k · | a2  b2  c2 | = | a2   b2   c2  |
    | a3  b3  c3 |   | a3   b3   c3  |

(vi) A determinant remains unaltered under a column operation of the form Ci → Ci + αCj + βCk (j, k ≠ i) or a row operation of the form Ri → Ri + αRj + βRk (j, k ≠ i).
e.g. | a1  b1  c1 |   | a1  b1 + 2a1 + 3c1  c1 |
     | a2  b2  c2 | = | a2  b2 + 2a2 + 3c2  c2 |   obtained after C2 → C2 + 2C1 + 3C3.
     | a3  b3  c3 |   | a3  b3 + 2a3 + 3c3  c3 |

(vii) If each element of a row (column) of a determinant is a sum of two terms, then the determinant can be written as the sum of two determinants in the following way:
| a1  b1  c1 + d1 |   | a1  b1  c1 |   | a1  b1  d1 |
| a2  b2  c2 + d2 | = | a2  b2  c2 | + | a2  b2  d2 |
| a3  b3  c3 + d3 |   | a3  b3  c3 |   | a3  b3  d3 |
In general,
 n  | f(r)  g(r)  h(r) |   | Σf(r)  Σg(r)  Σh(r) |
 Σ  |  a2    b2    c2  | = |  a2     b2     c2   |   (each sum running from r = 1 to n).
r=1 |  a3    b3    c3  |   |  a3     b3     c3   |

(viii) Product of two determinants


| a1  b1  c1 |   | l1  l2  l3 |
| a2  b2  c2 | × | m1  m2  m3 |
| a3  b3  c3 |   | n1  n2  n3 |

  | a1l1 + b1m1 + c1n1   a1l2 + b1m2 + c1n2   a1l3 + b1m3 + c1n3 |
= | a2l1 + b2m1 + c2n1   a2l2 + b2m2 + c2n2   a2l3 + b2m3 + c2n3 |
  | a3l1 + b3m1 + c3n1   a3l2 + b3m2 + c3n2   a3l3 + b3m3 + c3n3 |
(row by column multiplication)

  | a1l1 + b1l2 + c1l3   a1m1 + b1m2 + c1m3   a1n1 + b1n2 + c1n3 |
= | a2l1 + b2l2 + c2l3   a2m1 + b2m2 + c2m3   a2n1 + b2n2 + c2n3 |
  | a3l1 + b3l2 + c3l3   a3m1 + b3m2 + c3m3   a3n1 + b3n2 + c3n3 |
(row by row multiplication)

We can also multiply determinants column by row or column by column.

(ix) Limit of a determinant


Let
       | f(x)  g(x)  h(x) |
Δ(x) = | l(x)  m(x)  n(x) | ,
       | u(x)  v(x)  w(x) |
then
             | lim f(x)  lim g(x)  lim h(x) |
lim Δ(x)  =  | lim l(x)  lim m(x)  lim n(x) |   (all limits taken as x → a),
x→a          | lim u(x)  lim v(x)  lim w(x) |
provided each of the nine limiting values exists finitely.

(x) Differentiation of a determinant


Let
       | f(x)  g(x)  h(x) |
Δ(x) = | l(x)  m(x)  n(x) | ,
       | u(x)  v(x)  w(x) |
then
        | f′(x)  g′(x)  h′(x) |   | f(x)   g(x)   h(x)  |   | f(x)   g(x)   h(x)  |
Δ′(x) = | l(x)   m(x)   n(x)  | + | l′(x)  m′(x)  n′(x) | + | l(x)   m(x)   n(x)  |
        | u(x)   v(x)   w(x)  |   | u(x)   v(x)   w(x)  |   | u′(x)  v′(x)  w′(x) |

(xi) Integration of a determinant


Let
       | f(x)  g(x)  h(x) |
Δ(x) = |  a     b     c   | ,  where a, b, c, l, m and n are constants,
       |  l     m     n   |
then
 b           | ∫f(x)dx  ∫g(x)dx  ∫h(x)dx |
 ∫ Δ(x)dx =  |    a        b        c    |   (each integral taken from a to b).
 a           |    l        m        n    |
Note that if more than one row (column) of Δ(x) is variable, then in order to find the integral of Δ(x) from a to b we either evaluate the determinant Δ(x) first, or, using the properties of determinants, write Δ(x) in a form suitable for integration.

15. SPECIAL DETERMINANTS


(i) Symmetric determinant
The elements situated at equal distances from the diagonal are equal both in magnitude and sign, i.e. the (i, j)th element = the (j, i)th element.
| a  h  g |
| h  b  f | = abc + 2fgh − af² − bg² − ch²
| g  f  c |

(ii) Skew symmetric determinant


All the diagonal elements are zero and the elements situated at equal distances from the diagonal are equal in magnitude but opposite in sign. The value of a skew symmetric determinant of odd order is zero, e.g.,
|  0   b  −c |
| −b   0   a | = 0
|  c  −a   0 |
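The odd-order claim can be spot-checked numerically: whatever values a, b, c take, the determinant above comes out zero. A sketch (plain Python; `det3` is an illustrative helper using cofactor expansion along the first row):

```python
def det3(M):
    """Determinant of a 3 x 3 matrix, expanded along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# A skew symmetric determinant of odd order is zero for any a, b, c.
a, b, c = 2, 5, -7
M = [[ 0,  b, -c],
     [-b,  0,  a],
     [ c, -a,  0]]
print(det3(M))   # 0
```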

(iii) Circulant determinant


The elements of the rows (or columns) are in cyclic arrangement.
| a  b  c |
| b  c  a | = −(a³ + b³ + c³ − 3abc)
| c  a  b |

(iv) | 1   1   1  |
     | a   b   c  | = (a − b)(b − c)(c − a)
     | a²  b²  c² |

(v)  | 1   1   1  |
     | a   b   c  | = (a − b)(b − c)(c − a)(a + b + c)
     | a³  b³  c³ |

(vi) | 1   1   1  |
     | a²  b²  c² | = (a − b)(b − c)(c − a)(ab + bc + ca)
     | a³  b³  c³ |

16. ADJOINT OF A SQUARE MATRIX

Let A = [aij]n×n be any n × n matrix. The transpose B′ of the matrix B = [Aij]n×n, where Aij denotes the cofactor of the element aij in the determinant |A|, is called the adjoint of the matrix A and is denoted by the symbol Adj A.
Thus the adjoint of a matrix A is the transpose of the matrix formed by the cofactors of A, i.e. if
    [ a11  a12  ...  a1n ]                [ A11  A21  ...  An1 ]
A = [ a21  a22  ...  a2n ]   then Adj A = [ A12  A22  ...  An2 ]
    [ ...            ... ]                [ ...            ... ]
    [ an1  an2  ...  ann ]                [ A1n  A2n  ...  Ann ]
It is easy to see that A(Adj A) = (Adj A)A = |A| In.
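The construction — cofactor matrix, then transpose — can be written down directly. A sketch (plain Python; `det` and `adj` are illustrative helpers):

```python
def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([r[:j] + r[j + 1:] for r in A[1:]])
               for j in range(n))

def adj(A):
    """Adjoint: transpose of the cofactor matrix, so (Adj A)_ij = C_ji."""
    n = len(A)
    C = [[(-1) ** (i + j) * det([r[:j] + r[j + 1:]
                                 for k, r in enumerate(A) if k != i])
          for j in range(n)] for i in range(n)]
    return [list(col) for col in zip(*C)]   # transpose of the cofactor matrix

A = [[1, 2], [3, 4]]
print(adj(A))     # [[4, -2], [-3, 1]]
print(det(A))     # -2; multiplying A by Adj A gives [[-2, 0], [0, -2]] = |A| I2
```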

17. INVERSE OF A SQUARE MATRIX


Let A be any n-rowed square matrix. Then a matrix B, if it exists, such that AB = BA = In, is called the inverse of A. The inverse of A is usually denoted by A⁻¹ (if it exists). We have |A| In = A(Adj A) ⇒ |A| A⁻¹ = Adj A. Thus the necessary and sufficient condition for a square matrix A to possess an inverse is that |A| ≠ 0, and then
A⁻¹ = Adj(A) / |A|.
A square matrix A is called non-singular if |A| ≠ 0. Hence a square matrix A is invertible if and only if A is non-singular.
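For a 2 × 2 matrix the formula A⁻¹ = Adj(A)/|A| is short enough to code exactly, using rational arithmetic to avoid rounding. A sketch (plain Python; the function names are illustrative):

```python
from fractions import Fraction

def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def inverse2(A):
    """Inverse of a 2 x 2 integer matrix via A^{-1} = Adj(A) / |A|."""
    d = det2(A)
    if d == 0:
        raise ValueError("singular matrix: |A| = 0, no inverse exists")
    # For a 2 x 2 matrix, Adj A = [[a22, -a12], [-a21, a11]].
    return [[Fraction(A[1][1], d), Fraction(-A[0][1], d)],
            [Fraction(-A[1][0], d), Fraction(A[0][0], d)]]

A = [[1, 2], [3, 4]]
print(inverse2(A))   # entries -2, 1, 3/2, -1/2, i.e. Adj(A)/(-2)
```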

18. SYSTEM OF LINEAR SIMULTANEOUS EQUATIONS


A system of three linear equations in three unknowns is a1x + b1y + c1z = d1, a2x + b2y + c2z = d2, a3x + b3y + c3z = d3.
• If the system of equations has no solution, it is called inconsistent.
• If the system of equations has at least one solution, it is called consistent.
• If (d1, d2, d3) = (0, 0, 0), then the system is called homogeneous, otherwise non-homogeneous.

Matrices and Determinants
1. If E(θ) = [  cos θ  sin θ ] , then E(α) E(β) is equal to
             [ −sin θ  cos θ ]
(a) E(0)        (b) E(αβ)
(c) E(α + β)    (d) E(α − β)
2. If A = [ cos²α         cos α sin α ]  and  B = [ cos²β         cos β sin β ]
          [ cos α sin α   sin²α       ]           [ cos β sin β   sin²β       ]
are two matrices such that the matrix AB is the null matrix, then α − β is
(a) 0                           (b) a multiple of π
(c) an odd multiple of π/2      (d) none of these
1 3 1 1 
3. The matrix A satisfying the equation   A = 0 −1 , is
 0 1  
 1 4  1 − 4 
(a)   (b)  
− 1 0 0 − 1 
1 4 
(c)   (d) none of these
0 −1
1 0  0 1  cos θ sin θ 
4. If I =   , J =  − 1 0 and B = − sin θ cos θ , then B equals
0 1     
(a) I cosθ + J sinθ (b) I sinθ + J cosθ
(c) I cosθ − J sinθ (d) −I cosθ + J sinθ

5. Let A be a square matrix of order n & k be a scalar. Then |kA| equals
(a) k|A|        (b) |k| |A|
(c) kⁿ|A|       (d) none of these
6. If A is an invertible matrix, then det(A⁻¹) is equal to
(a) det A       (b) 1/det A
(c) 1           (d) 2
7. The order of the matrix
               [ a  h  g ] [ x ]
   [ x  y  z ] [ h  b  f ] [ y ]   is
               [ g  f  c ] [ z ]
(a) 3 × 1   (b) 1 × 1
(c) 1 × 3   (d) 3 × 3
8. If A and B are two matrices such that AB = B and BA = A, then A² + B² is equal to
(a) 2AB (b) 2BA
(c) A + B (d) AB
1 1
 and n ∈ N, then A is equal to
n
9. If A = 
1 1
n
(a) 2 A (b) 2n − 1A
(c) nA (d) none of these

10. If a, b and c are complex numbers, then
    |  0  −b  −c |
z = |  b   0  −a |   is
    |  c   a   0 |
(a) purely real         (b) purely imaginary
(c) 0                   (d) none of these
11. If a, b, c are distinct non-zero real numbers and
| 1  a²  a³ |
| 1  b²  b³ | = 0, then
| 1  c²  c³ |
(a) 1/a + 1/b + 1/c = 0     (b) a + b + c = 0
(c) a³ + b³ + c³ = 3abc     (d) none of these
12. Let Δ = | 1 + x1y1  1 + x1y2  1 + x1y3 |
            | 1 + x2y1  1 + x2y2  1 + x2y3 | , then Δ is equal to
            | 1 + x3y1  1 + x3y2  1 + x3y3 |
(a) x1x2x3 + y1y2y3                        (b) x1x2x3y1y2y3
(c) x2x3y2y3 + x3x1y3y1 + x1x2y1y2         (d) none of these
13. If | a    a + b     a + b + c     |
       | 2a   3a + 2b   4a + 3b + 2c  | = 64, then 'a' is equal to
       | 3a   6a + 3b   10a + 6b + 3c |
(a) 2   (b) 3
(c) 4   (d) 6
14. Let Δ = | a    a + b     a + b + c     |
            | 3a   4a + 3b   5a + 4b + 3c  | , where a = i, b = w and c = w² (w being an
            | 6a   9a + 6b   11a + 9b + 6c |
imaginary cube root of unity); then Δ equals
(a) w    (b) −w²
(c) i    (d) −i
15. Let {A1, A2, A3, …, Ak} be the set of 3 × 3 matrices that can be made with the distinct nonzero real numbers a1, a2, a3, …, a9. Then k =
(a) 9⁹   (b) 9!
(c) 9    (d) 1
16. If the system of equations
ax + by + c = 0
bx + cy + a = 0
cx + ay + b = 0
in x and y has a solution, then the system of equations
(b + c)x + (c + a)y + (a + b)z = 0
(c + a)x + (a + b)y + (b + c)z = 0
(a + b)x + (b + c)y + (c + a)z = 0
has
(a) only one solution (b) no solution
(c) infinite number of solutions (d) none of these

17. The equations 2x + y = 5, x + 3y = 5, x − 2y = 0 have
(a) no solution (b) one solution
(c) two solutions (d) infinitely many solutions
18. If A is a square matrix, then adj (AT) – (Adj A)T is equal to
(a) 2 (adj A) (b) 2|A|I
(c) null matrix (d) Unit matrix
19. If A is a 3 × 4 matrix and B is a matrix such that A′B and BA′ are both defined, then B is of
the type
(a) 3 × 4 (b) 3 × 3
(c) 4 × 4 (d) 4 × 3
20. If A and B are square matrices of order 3 such that |A| = –1, |B| = 3, then |3 AB| equals
(a) –9 (b) –81
(c) –27 (d) 81
0 1 
21. If A =   , I is the unit matrix of order 2 and a, b are arbitrary constants, then
0 0 
(aI + bA)2 is equal to
(a) a2I + abA (b) a2I + 2abA
2 2
(c) a I + b A (d) none of these
22. Let A = [ x + λ    x       x    ]
            [   x    x + λ     x    ] , then A⁻¹ exists if
            [   x      x     x + λ  ]
(a) x ≠ 0                  (b) λ ≠ 0
(c) 3x + λ ≠ 0, λ ≠ 0      (d) x ≠ 0, λ ≠ 0
23. If A = [ α  2 ]  and |A³| = 125, then α =
           [ 2  α ]
(a) ± 3   (b) ± 2
(c) ± 5   (d) 0
24. The value of λ for which the system of equations 2x – y – z = 12, x – 2y + z = –4,
x + y + λz = 4 has no solution is
(a) 3 (b) –3
(c) 2 (d) –2
25. If AB = A and BA = B, where A and B are square matrices, then
(a) B2 = B and A2 = A (b) B2 = A and A2 = B
(c) AB = BA (d) none of these

26. If A = ,B= ,

then 3A - 4B is equal to :

(a) (b)

(c) (d)

27. The number of all possible matrices of order 3 x 3 with each entry 0 or 1 is equal to:
(a) 18 (b) 81
(c) 512 (d) none of these
28. If A is an orthogonal matrix, then A⁻¹ is equal to:
(a) A (b) AT
(c) A² (d) none of these
29. If A and B are two invertible matrices, then the inverse of AB is equal to:
(a) AB (b) BA
(c) A⁻¹B⁻¹ (d) B⁻¹A⁻¹
30. If A and B are two square matrices such that AB = 0, then:
(a) det A = 0 or det B = 0 (b) det B = 0
(c) B = A⁻¹ (d) det A = 0
31. If A = , i = √−1, then Aⁿ is equal to:
(a) A for n = 4 (b) −A for n = 6
(c) −I for n = 5 (d) I for n = 8
32. If In is the identity matrix of order n, then (In)⁻¹:
(a) does not exist (b) equals In
(c) equals 0 (d) equals nIn
33. If a matrix A is symmetric as well as skew-symmetric, then:
(a) A is a diagonal matrix (b) A is a null matrix
(c) A is a unit matrix (d) A is a triangular matrix
34. The matrix is of order:
(a) 3 × 2 (b) 2 × 3
(c) 6 × 9 (d) 2 × 1
35. If A, B are symmetric matrices of the same order, then the matrix AB - BA is:
(a) 0 (b) symmetric
(c) I (d) skew-symmetric
36. If A is a skew-symmetric matrix and n is a positive integer, then Aⁿ is:
(a) a symmetric matrix (b) a skew-symmetric matrix
(c) a diagonal matrix (d) none of these
37. If A and B are matrices of the same order, then (A + B)² = A² + 2AB + B² is possible if:
(a) AB = I (b) BA = I
(c) AB = BA (d) none of these
38. If I = = , then:
(a) a = 1, b = 1 (b) a = cos 2θ, b = sin 2θ
(c) a = sin 2θ, b = cos 2θ (d) none of the above
39. If A is a square matrix of order n × n, then adj(adj A) is equal to:
(a) |A|ⁿ⁻¹ A (b) |A|ⁿ A
(c) |A|ⁿ⁻² A (d) none of these

40. The value of x for which [1 1 x] = 0 is equal to:

(a) 2 (b) - 2
(c) 3 (d) – 3
41. If A = , B= , and 4A – 3B + C = 0, then C is equal to:

(a) (b)

(c) (d) none of these

42. The rank of the matrix is equal to:

(a) 1 (b) 2
(c) 3 (d) 4
43. If A and B are two matrices such that A + B and AB are both defined, then:
(a) A and B are two matrices not necessarily of same order
(b) A and B are square matrices of same order
(c) number of columns of A = number of rows of B.
(d) none of these .

44. If A = ,B= , then:

(a) AB, BA exist and are equal


(b) AB, BA exist but are not equal
(c) AB exists and BA does not exist
(d) AB does not exist and BA exists
45. If a matrix has 13 elements, then the possible dimensions (orders) it can have are:
(a) 1 × 13, 13 × 1 (b) 1 × 26, 26 × 1
(c) 2 × 13, 13 × 2 (d) none of these
46. The value of cos θ + sin θ is equal to:

(a) (b)

(c) (d) none of these


47. If A and B are two matrices such that AB = B and BA = A, then A² + B² is equal to:
(a) 2AB (b) 2BA
(c) A + B (d) AB
48. If A = , then A⁴ is equal to:

(a) (b)

(c) (d)

49. If A = , then the value of Aⁿ is equal to:

(a) (b)

(c) (d) none of these

50. If A + B = and A – 2B = , then A is equal to:

(a) (b)

(c) (d) none of these

51. If A = , then the value of X where A + X is a unit matrix, is

(a) (b)

(c) (d) none of these

52. If the matrix is singular, then λ is equal to:

(a) -2 (b) 4
(c) 2 (d) - 4
53. If A = , then A² is equal to:

(a) (b)

(c) (d)

54. If A = , then A² is equal to:

(a) A (b) –A
(c) Null matrix (d) I

55. The inverse of the matrix A = is equal to:

(a) A (b) A’

(c) (d)

Objective

1. c 2. c 3. c 4. a 5. c
6. b 7. b 8. c 9. b 10. b
11. a 12. d 13. c 14. c 15. a
16. c 17. b 18. c 19. a 20. a
21. b 22. c 23. a 24. d 25. a
ANSWERS
1. A 2. C 3. B 4. D 5. A 6. C 7. B
8. B 9. A 10. D 11. D 12. C 13. B 14. C
15. B 16. B 17. B 18. B 19. B 20. A 21. A
22. C 23. A 24. D 25. C 26. A 27. B 28. B
29. D
30. A

