
17. Inner product spaces


Definition 17.1. Let V be a real vector space. An inner product on V is a function

⟨ , ⟩ : V × V → ℝ,

which is
• symmetric, that is,
⟨u, v⟩ = ⟨v, u⟩.
• bilinear, that is, linear in each factor (by symmetry it suffices to check linearity in the first factor):
⟨λu, v⟩ = λ⟨u, v⟩,
for all scalars λ, and
⟨u1 + u2, v⟩ = ⟨u1, v⟩ + ⟨u2, v⟩,
for all vectors u1, u2 and v.
• positive, that is,
⟨v, v⟩ ≥ 0.
• non-degenerate, that is, if
⟨u, v⟩ = 0
for every v ∈ V then u = 0.
We say that V is a real inner product space. The associated quadratic form is the function

Q : V → ℝ,

defined by

Q(v) = ⟨v, v⟩.
Example 17.2. Let A ∈ Mn,n(ℝ) be a real matrix. We can define a function

⟨ , ⟩ : ℝn × ℝn → ℝ,

by the rule

⟨u, v⟩ = uᵗ A v.

The basic rules of matrix multiplication imply that this function is bilinear. Note that the entries of A are given by

aij = eiᵗ A ej = ⟨ei, ej⟩.

In particular, it is symmetric if and only if A is symmetric, that is, Aᵗ = A. It is non-degenerate if and only if A is invertible, that is, A has rank n. Positivity is a little harder to characterise.
Perhaps the canonical example is to take A = In. In this case, if u = (r1, r2, ..., rn) and v = (s1, s2, ..., sn), then uᵗ In v = ∑ ri si. Note that if we take u = v then we get ∑ ri². The square root of this is the Euclidean length of u.
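To make Example 17.2 concrete, here is a minimal Python sketch (the use of NumPy and the function name are my own choices, not part of the notes) that computes ⟨u, v⟩ = uᵗ A v for the canonical case A = In:

```python
import numpy as np

def inner(u, v, A):
    """Inner product <u, v> = u^t A v determined by the matrix A."""
    return u @ A @ v

# The canonical example: A = I_n gives the usual dot product.
A = np.eye(3)
u = np.array([1.0, -1.0, 1.0])
v = np.array([1.0, 0.0, 1.0])

print(inner(u, v, A))            # 2.0, the sum of r_i * s_i
print(np.sqrt(inner(u, u, A)))   # Euclidean length of u, sqrt(3)

# Symmetry of the form corresponds to A^t = A, and
# non-degeneracy to A being invertible (rank n).
assert np.allclose(A, A.T)
assert np.linalg.matrix_rank(A) == 3
```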
Definition 17.3. Let V be a real vector space. A norm on V is a function

‖·‖ : V → ℝ,

which has the following properties:
• homogeneous, that is,
‖kv‖ = |k|‖v‖,
for all vectors v and scalars k.
• positive, that is,
‖v‖ ≥ 0.
• non-degenerate, that is, if
‖v‖ = 0
then v = 0.
• satisfies the triangle inequality, that is,
‖u + v‖ ≤ ‖u‖ + ‖v‖.
Lemma 17.4. Let V be a real inner product space.
Then

‖·‖ : V → ℝ,

defined by

‖v‖ = √⟨v, v⟩,

is a norm on V.
Proof. It is clear that the norm is homogeneous and positive. For non-degeneracy, suppose that u ∈ V is non-zero. By non-degeneracy of the inner product there is a vector v such that

⟨u, v⟩ ≠ 0.

Consider

0 ≤ ⟨u + tv, u + tv⟩ = ⟨u, u⟩ + 2t⟨u, v⟩ + t²⟨v, v⟩.

If ⟨v, v⟩ = 0 then this reads 0 ≤ ⟨u, u⟩ + 2t⟨u, v⟩ for every t; choosing t so that t⟨u, v⟩ < 0 gives ⟨u, u⟩ ≥ −2t⟨u, v⟩ > 0. Otherwise put

t = −⟨u, v⟩/⟨v, v⟩.

Then

⟨u, u⟩ + 2t⟨u, v⟩ + t²⟨v, v⟩ = ⟨u, u⟩ + t⟨u, v⟩ = ⟨u, u⟩ − ⟨u, v⟩²/⟨v, v⟩.

Once again

⟨u, u⟩ ≥ ⟨u, v⟩²/⟨v, v⟩ > 0.

Thus the norm is non-degenerate.
Now suppose that u and v ∈ V. Then

⟨u + v, u + v⟩ = ⟨u, u⟩ + 2⟨u, v⟩ + ⟨v, v⟩
             ≤ ‖u‖² + 2‖u‖·‖v‖ + ‖v‖²
             = (‖u‖ + ‖v‖)²,

where the middle step uses ⟨u, v⟩ ≤ ‖u‖·‖v‖, which follows from the inequality just established. Taking square roots gives the triangle inequality. □
Note that one can recover the inner product from the norm, using the formula

2⟨u, v⟩ = Q(u + v) − Q(u) − Q(v),

where Q is the associated quadratic form. Note the annoying appearance of the factor of 2.
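Indeed, expanding by bilinearity and symmetry gives

Q(u + v) = ⟨u + v, u + v⟩ = ⟨u, u⟩ + 2⟨u, v⟩ + ⟨v, v⟩ = Q(u) + 2⟨u, v⟩ + Q(v),

and rearranging yields the formula above.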
Notice also that on the way we proved:
Lemma 17.5 (Cauchy-Schwarz-Bunjakowski). Let V be a real inner product space.
If u and v ∈ V then

⟨u, v⟩ ≤ ‖u‖·‖v‖.
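As a quick numerical sanity check (not part of the notes), one can test the inequality on random vectors with the standard dot product in Python:

```python
import numpy as np

# Random vectors in R^4, with a fixed seed for reproducibility.
rng = np.random.default_rng(0)
u = rng.standard_normal(4)
v = rng.standard_normal(4)

# Cauchy-Schwarz: <u, v> <= ||u|| * ||v||.
print(u @ v <= np.linalg.norm(u) * np.linalg.norm(v))  # True
```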
Definition 17.6. Let V be a real vector space with an inner product.
We say that two vectors v and w are orthogonal if

⟨v, w⟩ = 0.

We say that a basis v1, v2, ..., vn is an orthogonal basis if the vectors v1, v2, ..., vn are pairwise orthogonal. If in addition the vectors vi have length one, we say that v1, v2, ..., vn is an orthonormal basis.
Lemma 17.7. Let V be a real inner product space.
(1) If the vectors v1, v2, ..., vm are non-zero and pairwise orthogonal then they are independent. In particular if m = dim V then v1, v2, ..., vm are an orthogonal basis of V.
(2) If v1, v2, ..., vn are an orthonormal basis of V and v ∈ V then

v = ∑ ri vi,    where ri = ⟨v, vi⟩.
Proof. We first prove (1). Suppose that

r1 v1 + r2 v2 + · · · + rm vm = 0.

Taking the inner product of both sides with vj gives

0 = ⟨r1 v1 + r2 v2 + · · · + rm vm, vj⟩ = ∑ ri ⟨vi, vj⟩ = rj ⟨vj, vj⟩.

As vj is non-zero,

⟨vj, vj⟩ ≠ 0,

and it follows that rj = 0. This is (1).
The proof of (2) is similar. □
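A quick numerical illustration of (2) in Python, with the standard dot product on ℝ² (the orthonormal basis and variable names below are my own example):

```python
import numpy as np

# An orthonormal basis of R^2 for the standard dot product.
b1 = np.array([1.0, 1.0]) / np.sqrt(2)
b2 = np.array([1.0, -1.0]) / np.sqrt(2)

v = np.array([3.0, 5.0])

# By Lemma 17.7 (2), the coordinates of v are r_i = <v, v_i>.
r1, r2 = v @ b1, v @ b2
print(np.allclose(v, r1 * b1 + r2 * b2))  # True
```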
So how does one find an orthonormal basis?
Algorithm 17.8 (Gram-Schmidt). Let v1, v2, ..., vn be independent vectors in a real inner product space V.
(1) Let 0 ≤ k ≤ n be the largest index such that v1, v2, ..., vk are orthonormal (k = 0 at the start).
(2) If k = n then stop.
(3) Otherwise let

uk+1 = vk+1 − r1 v1 − r2 v2 − · · · − rk vk,

where ri = ⟨vk+1, vi⟩. Replace vk+1 by

uk+1/‖uk+1‖,

and return to (1).
In practice the algorithm works as follows. First we replace v1 by

v1/‖v1‖,

so that v1 has unit length. Then we consider v2. We have to subtract a multiple of v1 to ensure that the result is orthogonal to v1. So consider a vector of the form

u = v2 + λv1,

where λ is chosen to make u orthogonal to v1. We have

0 = ⟨u, v1⟩ = ⟨v2, v1⟩ + λ⟨v1, v1⟩,

so that (as v1 now has unit length)

λ = −⟨v2, v1⟩.

Then we rescale to get a vector of unit length. At the next stage, we can choose λ and µ so that

v3 + λv1 + µv2

is orthogonal to v1 and v2. The key thing is that since v1 and v2 are orthogonal, our choices of λ and µ are independent of each other.
For example, consider the vectors

v1 = (1, −1, 1), v2 = (1, 0, 1) and v3 = (1, 1, 2),

in ℝ³ with the usual inner product. The first step is to replace v1 by

v1 = (1/√3)(1, −1, 1).

Now let

u = (1, 0, 1) − (2/3)(1, −1, 1) = (1/3)(1, 2, 1).

Then we replace u by a vector parallel to u of unit length,

v2 = (1/√6)(1, 2, 1).

Finally we put

u = (1, 1, 2) − (2/3)(1, −1, 1) − (5/6)(1, 2, 1) = (1/2)(−1, 0, 1),

and replace u by a vector of unit length,

v3 = (1/√2)(−1, 0, 1).

Thus

v1 = (1/√3)(1, −1, 1), v2 = (1/√6)(1, 2, 1) and v3 = (1/√2)(−1, 0, 1)

is the orthonormal basis produced by Gram-Schmidt.
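Here is a short Python sketch of the procedure (a direct translation of Algorithm 17.8 using NumPy and the standard dot product; the function name is my own), which reproduces the basis above:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalise a list of independent vectors (Algorithm 17.8)."""
    basis = []
    for v in vectors:
        # Subtract the components along the vectors already processed,
        # using r_i = <v, b_i> since each b_i has unit length.
        u = v - sum((v @ b) * b for b in basis)
        basis.append(u / np.linalg.norm(u))  # rescale to unit length
    return basis

v1, v2, v3 = (np.array(v, dtype=float)
              for v in [(1, -1, 1), (1, 0, 1), (1, 1, 2)])
for b in gram_schmidt([v1, v2, v3]):
    print(b)
# (1,-1,1)/sqrt(3), (1,2,1)/sqrt(6), (-1,0,1)/sqrt(2)
```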
One very useful property of inner products is that we get canonically defined complementary linear subspaces:
Lemma 17.9. Let V be a finite dimensional real inner product space.
If U ⊂ V is a linear subspace, then let

U⊥ = { w ∈ V | ⟨w, u⟩ = 0, ∀u ∈ U },

the set of all vectors orthogonal to every element of U. Then
(1) U⊥ is a linear subspace of V.
(2) U ∩ U⊥ = {0}.
(3) U and U⊥ span V.
In particular V is isomorphic to U ⊕ U⊥.
Proof. We first prove (1). Suppose that w1 and w2 ∈ U⊥. Pick u ∈ U.
Then

⟨w1 + w2, u⟩ = ⟨w1, u⟩ + ⟨w2, u⟩ = 0 + 0 = 0.

It follows that w1 + w2 ∈ U⊥ and so U⊥ is closed under addition. Now suppose that w ∈ U⊥ and λ is a scalar. Then

⟨λw, u⟩ = λ⟨w, u⟩ = λ·0 = 0.

Thus λw ∈ U⊥ and so U⊥ is closed under scalar multiplication. Thus U⊥ is a linear subspace of V. This is (1).
Suppose that w ∈ U ∩ U⊥. Then w is orthogonal to itself, that is,

⟨w, w⟩ = 0.

But then w = 0, as the norm is non-degenerate. This is (2).
Suppose that v ∈ V. If v ∈ U there is nothing to prove. Otherwise pick an orthonormal basis u1, u2, ..., uk of U. Then the vectors v, u1, u2, ..., uk are independent. By Gram-Schmidt we may find scalars r1, r2, ..., rk such that

w = v − ∑ ri ui

is orthogonal to u1, u2, ..., uk (in fact ri = ⟨v, ui⟩). But then w is orthogonal to U, that is, w ∈ U⊥. Let u = ∑ ri ui ∈ U. Then v = u + w. This is (3). □
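A small Python illustration of the decomposition in (3) (the subspace, its orthonormal basis, and the variable names are my own example, with the standard dot product):

```python
import numpy as np

# U = span of the orthonormal basis {u1} inside R^3.
u1 = np.array([1.0, -1.0, 1.0]) / np.sqrt(3)

v = np.array([1.0, 1.0, 2.0])

# Project v onto U: u = <v, u1> u1, and set w = v - u.
u = (v @ u1) * u1
w = v - u

print(np.allclose(v, u + w))  # True: v = u + w with u in U
print(np.isclose(w @ u1, 0))  # True: w lies in the orthogonal complement
```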