Linear Algebra (MA-102)
Lecture - 11
Inner product spaces
Gram-Schmidt orthogonalization process
G. Arunkumar
May 5, 2022
Notation: $e_i$ denotes the vector in $\mathbb{R}^n$ with 1 in the $i$-th position and 0 elsewhere ($n$ being clear from the context).
"
t a
v=(a,b)
b)
do
.
'
For $u = (a_1, \ldots, a_n)$ and $v = (b_1, \ldots, b_n)$ in $\mathbb{R}^n$, the dot product is
$$\langle u, v \rangle := a_1 b_1 + \cdots + a_n b_n.$$
A vector space $V$ equipped with an inner product $\langle \cdot, \cdot \rangle$ is called an inner product space. We write this as: $(V, \langle \cdot, \cdot \rangle)$ is an inner product space.
The dot product is a function $\mathbb{R}^n \times \mathbb{R}^n \to \mathbb{R}$.
The dot product has the following properties for every $u, v, w \in \mathbb{R}^n$ and $c \in \mathbb{R}$:
1. $\langle u, v \rangle = \langle v, u \rangle$ (symmetry)
2. $\langle u, v + w \rangle = \langle u, v \rangle + \langle u, w \rangle$ (additivity)
3. $\langle cu, v \rangle = c\,\langle u, v \rangle$ (homogeneity)
4. $\langle v, v \rangle \ge 0$, with $\langle v, v \rangle = 0 \iff v = 0$ (positive definiteness)
5. If $\theta$ is the angle between $u$ and $v$, then $\cos\theta = \dfrac{\langle u, v \rangle}{\langle u, u \rangle^{1/2} \langle v, v \rangle^{1/2}}$.
6. Thus $\theta = \pi/2$ iff $\langle u, v \rangle = 0$.
The notion of dot product helps us to analyse many geometric concepts.
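As a quick numerical illustration of the formulas above (a minimal sketch using NumPy; the sample vectors are chosen arbitrarily for demonstration):

import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 0.0, 1.0])

# <u, v> = a1*b1 + ... + an*bn
ip = np.dot(u, v)                # 4.0
print(ip == np.dot(v, u))        # True (symmetry)

# cos(theta) = <u, v> / (<u, u>^(1/2) <v, v>^(1/2))
cos_theta = ip / (np.sqrt(np.dot(u, u)) * np.sqrt(np.dot(v, v)))
print(np.arccos(cos_theta))      # about 0.93 radians, so u and v are not orthogonal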
The Pythagoras Theorem
Definition
Let $V$ be an inner product space.
(1) Let $v \in V$. The length (or norm) of $v$ is $\|v\| = \sqrt{\langle v, v \rangle}$, and $v$ is a unit vector if $\|v\| = 1$.
(2) Elements $v, w$ of $V$ are said to be orthogonal (or perpendicular) if $\langle v, w \rangle = 0$. We write this as $v \perp w$.
Remark: If $c \in \mathbb{R}$ and $v \in V$, then $\|cv\| = \sqrt{\langle cv, cv \rangle} = \sqrt{c^2 \langle v, v \rangle} = |c|\,\|v\|$.
Theorem
(Pythagoras Theorem) If $v \perp w$, then $\|v + w\|^2 = \|v\|^2 + \|w\|^2$.
Proof: Since $\langle v, w \rangle = 0$, we have
$$\|v + w\|^2 = \langle v + w, v + w \rangle = \langle v, v \rangle + 2\langle v, w \rangle + \langle w, w \rangle = \langle v, v \rangle + \langle w, w \rangle = \|v\|^2 + \|w\|^2.$$
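A quick numerical check of the Pythagoras theorem (a sketch; the orthogonal pair below is arbitrary):

import numpy as np

v = np.array([3.0, 0.0])
w = np.array([0.0, 4.0])

print(np.dot(v, w))                                      # 0.0, so v is orthogonal to w
lhs = np.linalg.norm(v + w) ** 2                         # ||v + w||^2 = 25.0
rhs = np.linalg.norm(v) ** 2 + np.linalg.norm(w) ** 2    # ||v||^2 + ||w||^2 = 25.0
print(np.isclose(lhs, rhs))                              # True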
The projection of $v$ along a nonzero vector $w$ is
$$p_w(v) = \frac{\langle w, v \rangle}{\langle w, w \rangle}\, w.$$
$p_w(v)$ is the point on $\mathbb{R}w$ closest to $v$.
Proposition
$$p_w(v) = \frac{\langle w, v \rangle}{\langle w, w \rangle}\, w = \frac{\langle w, v \rangle}{\|w\|^2}\, w = \Big\langle \frac{w}{\|w\|}, v \Big\rangle \frac{w}{\|w\|} = p_{\frac{w}{\|w\|}}(v).$$
(3) We have
$$\|v\|^2 = \langle v, v \rangle = \big\langle p_w(v) + (v - p_w(v)),\; p_w(v) + (v - p_w(v)) \big\rangle = \|p_w(v)\|^2 + \|v - p_w(v)\|^2 \ge \|p_w(v)\|^2,$$
where the middle equality holds because $p_w(v) \perp (v - p_w(v))$, so the cross terms vanish.
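The projection formula is easy to compute directly; the sketch below (NumPy, with arbitrary sample vectors and a helper name proj of my own choosing) also checks the two facts just used: $v - p_w(v)$ is orthogonal to $w$, and $\|p_w(v)\| \le \|v\|$.

import numpy as np

def proj(w, v):
    """Projection of v along w: p_w(v) = (<w, v> / <w, w>) * w."""
    return (np.dot(w, v) / np.dot(w, w)) * w

v = np.array([3.0, 1.0, 2.0])
w = np.array([1.0, 1.0, 0.0])

p = proj(w, v)
print(p)                                        # [2. 2. 0.]
print(np.dot(v - p, w))                         # 0.0: (v - p_w(v)) is orthogonal to w
print(np.linalg.norm(p) <= np.linalg.norm(v))   # True, as in (3)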
Theorem
(Cauchy-Schwarz (C-S) inequality) For $v, w \in V$,
$$|\langle w, v \rangle| \le \|v\|\,\|w\|.$$
Proof:
The result is clear if $w = 0$, so we may assume that $w \ne 0$.
Case (i): $w$ is a unit vector. In this case the LHS of the C-S inequality is $\|p_w(v)\|$ and the RHS is $\|v\|$, so the result follows from part (3) above.
Case (ii): $w$ is not a unit vector. Set $u = \frac{w}{\|w\|}$. We have $|\langle w, v \rangle| = \|w\|\,|\langle u, v \rangle|$ and $\|v\|\,\|w\| = \|w\|\,(\|v\|\,\|u\|)$. The result now follows from Case (i).
Consequently, for nonzero $v, w \in V$,
$$-1 \le \frac{\langle v, w \rangle}{\|v\|\,\|w\|} \le 1.$$
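A quick numerical sanity check of the C-S inequality and of the resulting bound on the cosine (a sketch; the random test vectors are arbitrary):

import numpy as np

rng = np.random.default_rng(0)
for _ in range(5):
    v = rng.normal(size=4)
    w = rng.normal(size=4)
    lhs = abs(np.dot(w, v))
    rhs = np.linalg.norm(v) * np.linalg.norm(w)
    print(lhs <= rhs + 1e-12)                                  # True: |<w, v>| <= ||v|| ||w||
    print(-1.0 - 1e-12 <= np.dot(v, w) / rhs <= 1.0 + 1e-12)   # True: the cosine lies in [-1, 1]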
Lemma
Define the distance $d(u, v) := \|u - v\|$. Let $u, v, w \in V$. Then
1. $d(u, v) \ge 0$, with equality iff $u = v$.
2. $d(u, v) = d(v, u)$.
3. $d(u, v) \le d(u, w) + d(w, v)$.
Proof. Exercise.
Triangle inequality
Theorem
(Triangle Inequality) For $v, w \in V$,
$$\|v + w\| \le \|v\| + \|w\|.$$
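With the distance $d(u, v) = \|u - v\|$ from the lemma above, property 3 of the lemma is exactly the triangle inequality applied to $u - w$ and $w - v$. A small numerical check (sketch; the helper name dist and the sample points are my own):

import numpy as np

def dist(u, v):
    """Distance induced by the norm: d(u, v) = ||u - v||."""
    return np.linalg.norm(u - v)

u = np.array([1.0, 0.0])
v = np.array([4.0, 4.0])
w = np.array([0.0, 3.0])

print(dist(u, v))                               # 5.0
print(dist(u, v) == dist(v, u))                 # True (symmetry)
print(dist(u, v) <= dist(u, w) + dist(w, v))    # True (triangle inequality)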
Definition
Let $V$ be an inner product space and let $S \subseteq V$ be a nonempty subset.
(1) We say that $S$ is an orthogonal set if $\langle x, y \rangle = 0$ for all $x, y \in S$ with $x \ne y$.
(2) An orthogonal set $S$ is said to be an orthonormal set if $\|v\| = 1$ for all $v \in S$.
(3) An orthogonal (resp. orthonormal) subset $S$ of $V$ is said to be an orthogonal (resp. orthonormal) basis of $V$ if $L(S) = V$, where $L(S)$ denotes the linear span of $S$.
Proposition
Let $S = \{u_1, \ldots, u_n\}$ be an orthogonal set of nonzero vectors in an inner product space $V$. Then $S$ is linearly independent.
Proof.
Suppose $c_1, c_2, \ldots, c_n$ are scalars with
$$c_1 u_1 + c_2 u_2 + \cdots + c_n u_n = 0.$$
Taking the inner product of both sides with $u_i$ and using orthogonality gives $c_i \langle u_i, u_i \rangle = 0$; since $u_i \ne 0$, we have $\langle u_i, u_i \rangle > 0$, so $c_i = 0$ for every $i$. Hence $S$ is linearly independent.
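To see the proposition in a concrete case (a sketch; the orthogonal set in $\mathbb{R}^3$ below is arbitrary):

import numpy as np

# An orthogonal set of nonzero vectors in R^3
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
u3 = np.array([0.0, 0.0, 2.0])

# Pairwise inner products vanish ...
print(np.dot(u1, u2), np.dot(u1, u3), np.dot(u2, u3))           # 0.0 0.0 0.0
# ... and the vectors are linearly independent: the matrix with these columns has full rank
print(np.linalg.matrix_rank(np.column_stack([u1, u2, u3])))     # 3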
Theorem
Let $S := \{v_1, \ldots, v_n\}$ be a linearly independent subset of an inner product space $V$. Then there exists an orthogonal basis $\{w_1, \ldots, w_n\}$ of $L(S)$, and hence $\{\frac{w_1}{\|w_1\|}, \ldots, \frac{w_n}{\|w_n\|}\}$ is an orthonormal basis of $L(S)$. In particular, every finite-dimensional inner product space has an orthonormal basis.
Proof:
We define the $w_i$ inductively.
Set $w_1 := v_1$.
Suppose $w_1, \ldots, w_m$ are defined. To define $w_{m+1}$, take $v_{m+1}$ and subtract from it its projections along $w_1, \ldots, w_m$:
$$w_{m+1} := v_{m+1} - \sum_{i=1}^{m} p_{w_i}(v_{m+1}) = v_{m+1} - \sum_{i=1}^{m} \frac{\langle w_i, v_{m+1} \rangle}{\langle w_i, w_i \rangle}\, w_i.$$
(Recall that $p_w(v) = \frac{\langle w, v \rangle}{\langle w, w \rangle}\, w$.)
Example: with $w_1 = v_1$, the next vector is
$$w_2 = v_2 - \frac{\langle v_2, v_1 \rangle}{\langle v_1, v_1 \rangle}\, v_1.$$
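The inductive construction in the proof translates directly into an algorithm. Below is a minimal sketch of the Gram-Schmidt process in NumPy (the function name gram_schmidt and the sample vectors are my own; the input is assumed to be linearly independent):

import numpy as np

def gram_schmidt(vectors):
    """Return an orthogonal basis (w_i) and an orthonormal basis (w_i / ||w_i||)
    of the span of the given linearly independent vectors."""
    ws = []
    for v in vectors:
        w = np.array(v, dtype=float)
        # subtract, one at a time, the projection along each previously constructed w_i
        for wi in ws:
            w = w - (np.dot(wi, w) / np.dot(wi, wi)) * wi
        ws.append(w)
    orthonormal = [w / np.linalg.norm(w) for w in ws]
    return ws, orthonormal

v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])
ortho, unit = gram_schmidt([v1, v2])
print(ortho[0], ortho[1])           # [1. 1. 0.] [ 0.5 -0.5  1. ]
print(np.dot(ortho[0], ortho[1]))   # 0.0: the w_i are pairwise orthogonal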
Definition
Let $W$ be a subspace of $V$. Define
$$W^{\perp} = \{\, u \in V \mid u \perp w \text{ for all } w \in W \,\}.$$
Theorem
Every $v \in V$ can be written uniquely as $v = x + y$, where $x \in W$ and $y \in W^{\perp}$. Moreover, if $V$ is finite dimensional, then $\dim V = \dim W + \dim W^{\perp}$.
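A computational sketch of this decomposition (assuming an orthonormal basis of $W$ is available, e.g. from Gram-Schmidt; the subspace and the vector below are arbitrary examples):

import numpy as np

# W = span{q1, q2}, where {q1, q2} is an orthonormal basis of W in R^3
q1 = np.array([1.0, 0.0, 0.0])
q2 = np.array([0.0, 1.0, 0.0])

v = np.array([2.0, 3.0, 5.0])

# x = projection of v onto W, and y = v - x lies in W-perp
x = np.dot(q1, v) * q1 + np.dot(q2, v) * q2
y = v - x
print(x, y)                             # [2. 3. 0.] [0. 0. 5.]
print(np.dot(y, q1), np.dot(y, q2))     # 0.0 0.0: y is orthogonal to W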