Applications of Matrices Multiplication
1. Introduction
This article aims to promote several geometric aspects of linear algebra. The geometric motivation often leads to simple proofs, in addition to increasing students' interest in the subject and providing a solid basis. Students with a confident grasp of these ideas will encounter little difficulty in extending them to more abstract linear spaces. Many geometrical operations can be rephrased in the language of vectors and matrices. These include projections, reflections and rotations, operations which have numerous applications in engineering, physics, chemistry and economics, and therefore their matrix representations should be included in a basic linear algebra course.
There are basically two approaches to teaching linear algebra. The abstract one deals with formal definitions of vector spaces, linear transformations and so on. In contrast to the abstract vector spaces, the analytic approach deals mainly with the vector space $\mathbb{R}^n$ and develops the basic concepts and proofs in these spaces.
However, when the analytic approach deals with the definition of linear transformations, it uses the notion of the matrix representation of a transformation with respect to an arbitrary basis, and thus it actually goes back to the abstract setting. To clarify this issue, let us consider, for example, the calculation of a rotation $T$ of a vector $x$ in $\mathbb{R}^3$. In this way one starts
2010 Mathematics Subject Classification. Primary 15B10, 97H60; Secondary 15A04, 15A15.
Key words and phrases. Rotation matrix, matrices multiplication, determinant, orthogonal matrices, equiangular rotations.
(1) $Ax = x_1 a_1 + x_2 a_2 + \cdots + x_n a_n$.
where $b_1^T, \ldots, b_n^T$ are the row vectors of the transpose $B^T$. This type of multiplication will be used in Section 4.
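As a quick illustration of formula (1), the following sketch (which assumes Python with NumPy; the matrix and vector are arbitrary choices, not taken from the text) checks that the product $Ax$ coincides with the linear combination of the columns of $A$ with coefficients $x_1, \ldots, x_n$.

    import numpy as np

    A = np.array([[1.0, 2.0, 0.0],
                  [0.0, 1.0, 3.0],
                  [4.0, 0.0, 1.0]])
    x = np.array([2.0, -1.0, 5.0])

    Ax = A @ x                                              # ordinary matrix-vector product
    combo = sum(x[i] * A[:, i] for i in range(A.shape[1]))  # x1*a1 + x2*a2 + x3*a3

    print(np.allclose(Ax, combo))   # True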
Let $u, v, \ldots, z$ be $n$ vectors in $\mathbb{R}^n$ with components $u_i, v_i, \ldots, z_i$, respectively.
We now use formula (1), the linearity of the determinant in each of its columns, and the fact that the determinant of a matrix with two equal columns is zero; expanding and discarding the vanishing terms leaves a sum over the permutations $p$ of $\{1, \ldots, n\}$:

$\det([Au, Av, \ldots, Az]) = \det\!\left(\left[\sum_{i=1}^{n} u_i a_i,\ \sum_{i=1}^{n} v_i a_i,\ \ldots,\ \sum_{i=1}^{n} z_i a_i\right]\right) = \sum_{p} \left(u_{p_1} v_{p_2} \cdots z_{p_n}\right) \det([a_{p_1}, a_{p_2}, \ldots, a_{p_n}]).$
Since the determinant changes sign when two columns are interchanged, we have

$\det([a_{p_1}, a_{p_2}, \ldots, a_{p_n}]) = \sigma(p)\, \det(A),$

where $\sigma(p)$ denotes the sign of the permutation $p$.
Therefore,
$\det([Au, Av, \ldots, Az]) = \det(A) \sum_{p} \sigma(p)\, u_{p_1} v_{p_2} \cdots z_{p_n}$
and hence $\det(AB) = \det(A)\det(B)$, since the last sum is precisely the determinant of the matrix $B = [u, v, \ldots, z]$. In particular, applying this product rule to $R(\alpha)R(\alpha)^T = I$ gives $\det(R(\alpha))^2 = 1$, that is, $\det(R(\alpha)) = \pm 1$. Since $\lim_{\alpha \to 0} R(\alpha) = I$ and the determinant depends continuously on the entries of the matrix, we conclude that $\det(R(\alpha)) = 1$.
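The product rule just obtained can be checked numerically in $\mathbb{R}^3$: one expects $\det([Au, Av, Az]) = \det(A)\,\det([u, v, z])$. The sketch below (assuming Python with NumPy and using random data, which is an illustration choice rather than part of the text) verifies this.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    u, v, z = rng.standard_normal(3), rng.standard_normal(3), rng.standard_normal(3)

    lhs = np.linalg.det(np.column_stack([A @ u, A @ v, A @ z]))   # det([Au, Av, Az])
    rhs = np.linalg.det(A) * np.linalg.det(np.column_stack([u, v, z]))

    print(np.isclose(lhs, rhs))   # True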
We turn now to rotations in $\mathbb{R}^4$. It turns out that rotations in $\mathbb{R}^4$ can be defined in a way similar to rotations in $\mathbb{R}^2$ and $\mathbb{R}^3$. We say that a linear transformation $R$ is a rotation if there exists a two-dimensional subspace $\Pi$ of $\mathbb{R}^4$ such that the angle between the vectors $Rx$ and $x$ is constant for all $x \in \Pi$, and the angle between the vectors $Ry$ and $y$ is constant for all $y \in \Pi^{\perp}$, the orthogonal complement of $\Pi$.
The calculations are done in a manner similar to the case of $\mathbb{R}^3$. Pick $a, b \in \Pi$ and $c, d \in \Pi^{\perp}$ such that the set $\{a, b, c, d\}$ is an orthonormal basis. Let $R = R(\alpha, \beta)$ be the rotation matrix with rotation angle $\alpha$ in the plane $\Pi$ and rotation angle $\beta$ in the orthogonal complement $\Pi^{\perp}$. Then

$Ra = \cos\alpha\, a + \sin\alpha\, b, \qquad Rb = -\sin\alpha\, a + \cos\alpha\, b,$
$Rc = \cos\beta\, c + \sin\beta\, d, \qquad Rd = -\sin\beta\, c + \cos\beta\, d.$
Set $P = [a, b, c, d]$ and $Q = [Ra, Rb, Rc, Rd]$. Since $RP = Q$ and the matrix $P$ is orthogonal, $R = QP^{T}$, and hence
(10) $R = (Ra)a^T + (Rb)b^T + (Rc)c^T + (Rd)d^T = \cos\alpha \left(aa^T + bb^T\right) + \sin\alpha \left(ba^T - ab^T\right) + \cos\beta \left(cc^T + dd^T\right) + \sin\beta \left(dc^T - cd^T\right).$
The matrix $R(\alpha, \beta)$ is orthogonal, and by letting $\alpha$ and $\beta$ tend to zero and using the continuity of the determinant as before, we get that its determinant is one.
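The following minimal sketch (Python with NumPy; the orthonormal basis $\{a, b, c, d\}$ is produced here by a QR factorization of a random matrix, an implementation choice that is not part of the derivation) assembles the matrix of formula (10) and confirms that $R(\alpha, \beta)$ is orthogonal with determinant one.

    import numpy as np

    rng = np.random.default_rng(1)
    P, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # orthonormal columns a, b, c, d
    a, b, c, d = P.T
    alpha, beta = 0.7, 1.9

    outer = np.outer
    R = (np.cos(alpha) * (outer(a, a) + outer(b, b))
         + np.sin(alpha) * (outer(b, a) - outer(a, b))
         + np.cos(beta) * (outer(c, c) + outer(d, d))
         + np.sin(beta) * (outer(d, c) - outer(c, d)))

    print(np.allclose(R.T @ R, np.eye(4)))     # True: R is orthogonal
    print(np.isclose(np.linalg.det(R), 1.0))   # True: det(R) = 1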
We are now in a position to extend the geometric definition of rotations to $\mathbb{R}^n$ for an arbitrary positive integer $n$. For $n = 2p$ we say that a linear transformation $R$ is a rotation if there exist $p$ mutually orthogonal planes $\Pi_k$ such that the angle between the vectors $Rx$ and $x$ is constant for all $x \in \Pi_k$, $k = 1, \ldots, p$. For $n = 2p + 1$ we require that there are $p$ mutually orthogonal planes $\Pi_k$ and, in addition, a line $L$ orthogonal to $\Pi_k$, $k = 1, \ldots, p$, such that $R$ behaves on the planes $\Pi_k$ as in the even-dimensional case and $Ry = y$ for all $y \in L$. The extension of formulas (8) and (10) to arbitrary dimension is straightforward. In $\mathbb{R}^{2p}$ there is an orthonormal basis $\{a_1, b_1, \ldots, a_p, b_p\}$, with each pair $\{a_k, b_k\}$ spanning $\Pi_k$, such that
(11) $R = \sum_{k=1}^{p} \left[\cos\alpha_k \left(a_k a_k^T + b_k b_k^T\right) + \sin\alpha_k \left(b_k a_k^T - a_k b_k^T\right)\right].$
In $\mathbb{R}^{2p+1}$, with $c$ a unit vector spanning the line $L$, the corresponding formula is

(12) $R = \sum_{k=1}^{p} \left[\cos\alpha_k \left(a_k a_k^T + b_k b_k^T\right) + \sin\alpha_k \left(b_k a_k^T - a_k b_k^T\right)\right] + cc^T.$

Similarly to the rotation formulas (8) and (10), one can check that the matrices (11) and (12) are orthogonal with determinant one.
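A similar sketch (again assuming Python with NumPy and random data) builds a rotation of $\mathbb{R}^{2p+1}$ from formula (12) with $p = 2$ and checks that it is orthogonal, has determinant one, and fixes the line $L$ spanned by $c$.

    import numpy as np

    rng = np.random.default_rng(2)
    p = 2
    n = 2 * p + 1
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # orthonormal columns
    alphas = rng.uniform(0, 2 * np.pi, size=p)
    c = Q[:, -1]                                       # unit vector spanning the line L

    R = np.outer(c, c)                                 # the cc^T term of (12)
    for k in range(p):
        a, b = Q[:, 2 * k], Q[:, 2 * k + 1]            # pair spanning the k-th plane
        R += np.cos(alphas[k]) * (np.outer(a, a) + np.outer(b, b))
        R += np.sin(alphas[k]) * (np.outer(b, a) - np.outer(a, b))

    print(np.allclose(R.T @ R, np.eye(n)))     # True: R is orthogonal
    print(np.isclose(np.linalg.det(R), 1.0))   # True: det(R) = 1
    print(np.allclose(R @ c, c))               # True: R fixes the line L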
Formulas (11) and (12) were derived in [5], but in a different way. The advantage of the derivation given here is that it is constructive, in addition to being suitable for an elementary linear algebra course.
Formulas (11) and (12) can also be written in vector form:

$Rx = \sum_{k=1}^{p} \left(\cos\alpha_k\, y_k + \sin\alpha_k\, z_k\right)$

and

$Rx = \sum_{k=1}^{p} \left(\cos\alpha_k\, y_k + \sin\alpha_k\, z_k\right) + (c^T x)\, c,$

where $y_k = (a_k^T x)\, a_k + (b_k^T x)\, b_k$ is the orthogonal projection of $x$ onto the plane $\Pi_k$ and $z_k = (a_k^T x)\, b_k - (b_k^T x)\, a_k$.
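Finally, a short sketch (Python with NumPy, random data, orthonormal basis built by a QR factorization as an implementation choice) checks that the vector form above agrees with the matrix form (11) in $\mathbb{R}^{2p}$, with $y_k$ and $z_k$ computed from the projections onto the planes $\Pi_k$ as written above.

    import numpy as np

    rng = np.random.default_rng(3)
    p = 2
    Q, _ = np.linalg.qr(rng.standard_normal((2 * p, 2 * p)))
    alphas = rng.uniform(0, 2 * np.pi, size=p)
    x = rng.standard_normal(2 * p)

    R = np.zeros((2 * p, 2 * p))
    Rx_vec = np.zeros(2 * p)
    for k in range(p):
        a, b = Q[:, 2 * k], Q[:, 2 * k + 1]
        R += np.cos(alphas[k]) * (np.outer(a, a) + np.outer(b, b))
        R += np.sin(alphas[k]) * (np.outer(b, a) - np.outer(a, b))
        y_k = (a @ x) * a + (b @ x) * b            # projection of x onto the k-th plane
        z_k = (a @ x) * b - (b @ x) * a            # y_k rotated by pi/2 within that plane
        Rx_vec += np.cos(alphas[k]) * y_k + np.sin(alphas[k]) * z_k

    print(np.allclose(R @ x, Rx_vec))   # True: vector form agrees with (11)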
5. Concluding Remarks