Linear Algebra Chapter 5.1
OTTO BRETSCHER
http://www.prenhall.com/bretscher
Chapter 5
Orthogonality and Least Squares
Chia-Hui Chang
Email: chia@csie.ncu.edu.tw
National Central University, Taiwan
The length of a vector $\vec{v}$ in $\mathbb{R}^n$ is $\|\vec{v}\| = \sqrt{\vec{v} \cdot \vec{v}}$.
c. A vector $\vec{u}$ in $\mathbb{R}^n$ is called a unit vector if its length is 1 (i.e., $\|\vec{u}\| = 1$, or $\vec{u} \cdot \vec{u} = 1$).
Explanation:
If $\vec{v}$ is a nonzero vector in $\mathbb{R}^n$, then
$$\vec{u} = \frac{1}{\|\vec{v}\|}\vec{v}$$
is a unit vector.
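As a quick numeric check of this normalization, here is a minimal Python sketch (the sample vector is an arbitrary choice):

```python
import math

def norm(v):
    # ||v|| = sqrt(v . v)
    return math.sqrt(sum(x * x for x in v))

def normalize(v):
    # u = (1/||v||) v is a unit vector whenever v is nonzero
    n = norm(v)
    return [x / n for x in v]

v = [3.0, 4.0]        # arbitrary nonzero vector
u = normalize(v)      # unit vector in the direction of v
```

Dividing by the length rescales the vector to length 1 without changing its direction.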
Vectors $\vec{v}_1, \dots, \vec{v}_m$ in $\mathbb{R}^n$ are called orthonormal if
$$\vec{v}_i \cdot \vec{v}_j = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{if } i \neq j. \end{cases}$$
Example 1.
The vectors $\vec{e}_1, \vec{e}_2, \dots, \vec{e}_n$ in $\mathbb{R}^n$ are orthonormal.
Example 2.
The vectors
$$\begin{bmatrix} \cos\theta \\ \sin\theta \end{bmatrix}, \quad \begin{bmatrix} -\sin\theta \\ \cos\theta \end{bmatrix}$$
are orthonormal.
Example 3.
The vectors
$$\vec{v}_1 = \begin{bmatrix} 1/2 \\ 1/2 \\ 1/2 \\ 1/2 \end{bmatrix}, \quad \vec{v}_2 = \begin{bmatrix} 1/2 \\ 1/2 \\ -1/2 \\ -1/2 \end{bmatrix}, \quad \vec{v}_3 = \begin{bmatrix} 1/2 \\ -1/2 \\ 1/2 \\ -1/2 \end{bmatrix}$$
in $\mathbb{R}^4$ are orthonormal.
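The orthonormality conditions can be verified numerically; a Python sketch follows (the sign pattern of the three vectors is an assumption, chosen so that all pairwise dot products vanish):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Assumed sign pattern making the three vectors in R^4 orthonormal
v1 = [0.5, 0.5, 0.5, 0.5]
v2 = [0.5, 0.5, -0.5, -0.5]
v3 = [0.5, -0.5, 0.5, -0.5]

vectors = [v1, v2, v3]
# v_i . v_j should be 1 when i == j and 0 otherwise
gram = [[dot(vi, vj) for vj in vectors] for vi in vectors]
```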
Fact 5.1.3
a. Orthonormal vectors are linearly independent.
b. Orthonormal vectors $\vec{v}_1, \dots, \vec{v}_n$ in $\mathbb{R}^n$ form a basis of $\mathbb{R}^n$.
Proof
a. Consider a relation
$$c_1\vec{v}_1 + c_2\vec{v}_2 + \dots + c_i\vec{v}_i + \dots + c_m\vec{v}_m = \vec{0}.$$
Let us form the dot product of each side of this equation with $\vec{v}_i$:
$$(c_1\vec{v}_1 + c_2\vec{v}_2 + \dots + c_i\vec{v}_i + \dots + c_m\vec{v}_m) \cdot \vec{v}_i = \vec{0} \cdot \vec{v}_i = 0.$$
Because the dot product is distributive and $\vec{v}_j \cdot \vec{v}_i = 0$ for $j \neq i$, this reduces to
$$c_i(\vec{v}_i \cdot \vec{v}_i) = c_i = 0.$$
Therefore, $c_i = 0$ for all $i = 1, \dots, m$.
b. Any $n$ linearly independent vectors in $\mathbb{R}^n$ form a basis of $\mathbb{R}^n$.
Orthogonal projections
See Figure 5.
The orthogonal projection of a vector $\vec{x}$ onto a one-dimensional subspace $V$ with basis $\vec{v}_1$ (a unit vector) is computed by:
$$\text{proj}_V\,\vec{x} = \vec{w} = (\vec{v}_1 \cdot \vec{x})\vec{v}_1.$$
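A minimal sketch of this one-dimensional projection in Python (the particular unit vector and $\vec{x}$ are illustrative choices):

```python
import math

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

def proj_line(x, v1):
    # proj_V(x) = (v1 . x) v1, where v1 is a unit vector spanning V
    c = dot(v1, x)
    return [c * vi for vi in v1]

v1 = [1 / math.sqrt(2), 1 / math.sqrt(2)]    # unit vector (illustrative)
x = [2.0, 0.0]
w = proj_line(x, v1)                         # projection of x onto V
residual = [xi - wi for xi, wi in zip(x, w)] # x - w is perpendicular to v1
```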
Now consider a subspace $V$ with arbitrary dimension $m$. Suppose we have an orthonormal basis $\vec{v}_1, \vec{v}_2, \dots, \vec{v}_m$ of $V$. Find $\vec{w}$ in $V$ such that $\vec{x} - \vec{w}$ is in $V^\perp$. Let
$$\vec{w} = c_1\vec{v}_1 + c_2\vec{v}_2 + \dots + c_m\vec{v}_m.$$
It is required that
$$\vec{x} - \vec{w} = \vec{x} - c_1\vec{v}_1 - c_2\vec{v}_2 - \dots - c_m\vec{v}_m$$
be perpendicular to $V$; i.e., for each $i$:
$$\vec{v}_i \cdot (\vec{x} - \vec{w}) = \vec{v}_i \cdot (\vec{x} - c_1\vec{v}_1 - c_2\vec{v}_2 - \dots - c_m\vec{v}_m)$$
$$= \vec{v}_i \cdot \vec{x} - c_1(\vec{v}_i \cdot \vec{v}_1) - \dots - c_i(\vec{v}_i \cdot \vec{v}_i) - \dots - c_m(\vec{v}_i \cdot \vec{v}_m)$$
$$= \vec{v}_i \cdot \vec{x} - c_i = 0.$$
The equation holds if $c_i = \vec{v}_i \cdot \vec{x}$.
Therefore, there is a unique $\vec{w}$ in $V$ such that $\vec{x} - \vec{w}$ is in $V^\perp$, namely,
$$\vec{w} = (\vec{v}_1 \cdot \vec{x})\vec{v}_1 + (\vec{v}_2 \cdot \vec{x})\vec{v}_2 + \dots + (\vec{v}_m \cdot \vec{x})\vec{v}_m.$$
Fact 5.1.6 Orthogonal projection
Consider a subspace $V$ of $\mathbb{R}^n$ with orthonormal basis $\vec{v}_1, \vec{v}_2, \dots, \vec{v}_m$. For any vector $\vec{x}$ in $\mathbb{R}^n$, there is a unique vector $\vec{w}$ in $V$ such that $\vec{x} - \vec{w}$ is in $V^\perp$. This vector $\vec{w}$ is called the orthogonal projection of $\vec{x}$ onto $V$, denoted by $\text{proj}_V\,\vec{x}$. We have the formula
$$\text{proj}_V\,\vec{x} = (\vec{v}_1 \cdot \vec{x})\vec{v}_1 + \dots + (\vec{v}_m \cdot \vec{x})\vec{v}_m.$$
The transformation $T(\vec{x}) = \text{proj}_V\,\vec{x}$ from $\mathbb{R}^n$ to $\mathbb{R}^n$ is linear.
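Both the projection formula and its linearity can be checked numerically. A sketch, assuming the xy-plane in R^3 as an illustrative choice of $V$:

```python
def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

def proj(x, basis):
    # proj_V(x) = (v1 . x) v1 + ... + (vm . x) vm for an orthonormal basis of V
    w = [0.0] * len(x)
    for v in basis:
        c = dot(v, x)
        w = [wi + c * vi for wi, vi in zip(w, v)]
    return w

# Orthonormal basis of the xy-plane in R^3 (illustrative subspace V)
basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
x = [3.0, -2.0, 5.0]
y = [1.0, 4.0, -1.0]

# Linearity: proj(x + y) equals proj(x) + proj(y)
lhs = proj([a + b for a, b in zip(x, y)], basis)
rhs = [a + b for a, b in zip(proj(x, basis), proj(y, basis))]
```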
Example 4.
Consider the subspace $V = \operatorname{im}(A)$ of $\mathbb{R}^4$, where
$$A = \begin{bmatrix} 1 & 1 \\ 1 & -1 \\ 1 & -1 \\ 1 & 1 \end{bmatrix}.$$
Find $\text{proj}_V\,\vec{x}$ for
$$\vec{x} = \begin{bmatrix} 1 \\ 3 \\ 1 \\ 7 \end{bmatrix}.$$
Solution
The two columns of $A$ form a basis of $V$. Since they happen to be orthogonal, we can construct an orthonormal basis of $V$ merely by dividing these two vectors by their length (2 for both vectors):
$$\vec{v}_1 = \begin{bmatrix} 1/2 \\ 1/2 \\ 1/2 \\ 1/2 \end{bmatrix}, \quad \vec{v}_2 = \begin{bmatrix} 1/2 \\ -1/2 \\ -1/2 \\ 1/2 \end{bmatrix}$$
Then,
$$\text{proj}_V\,\vec{x} = (\vec{v}_1 \cdot \vec{x})\vec{v}_1 + (\vec{v}_2 \cdot \vec{x})\vec{v}_2 = 6\vec{v}_1 + 2\vec{v}_2 = \begin{bmatrix} 4 \\ 2 \\ 2 \\ 4 \end{bmatrix}.$$
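The arithmetic of this example can be verified in Python; in this sketch the sign pattern of $\vec{v}_2$ is an assumption, chosen to be consistent with the dot products 6 and 2 in the text:

```python
def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

# Orthonormal basis of V = im(A); the sign pattern of v2 is assumed
v1 = [0.5, 0.5, 0.5, 0.5]
v2 = [0.5, -0.5, -0.5, 0.5]
x = [1.0, 3.0, 1.0, 7.0]

c1, c2 = dot(v1, x), dot(v2, x)               # coefficients v1 . x and v2 . x
w = [c1 * a + c2 * b for a, b in zip(v1, v2)]  # proj_V(x)
residual = [xi - wi for xi, wi in zip(x, w)]   # x - w, lies in V-perp
```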
Example 5.
Express
$$\vec{x} = \begin{bmatrix} 2 \\ 1 \\ 3 \end{bmatrix}$$
as a linear combination of
$$\vec{v}_1 = \frac{1}{3}\begin{bmatrix} 2 \\ 2 \\ 1 \end{bmatrix}, \quad \vec{v}_2 = \frac{1}{3}\begin{bmatrix} -2 \\ 1 \\ 2 \end{bmatrix}, \quad \vec{v}_3 = \frac{1}{3}\begin{bmatrix} 1 \\ -2 \\ 2 \end{bmatrix}.$$
Solution
Since $\vec{v}_1, \vec{v}_2, \vec{v}_3$ is an orthonormal basis of $\mathbb{R}^3$, we have
$$\vec{x} = (\vec{v}_1 \cdot \vec{x})\vec{v}_1 + (\vec{v}_2 \cdot \vec{x})\vec{v}_2 + (\vec{v}_3 \cdot \vec{x})\vec{v}_3 = 3\vec{v}_1 + \vec{v}_2 + 2\vec{v}_3.$$
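This decomposition can be checked numerically. In the sketch below, the orthonormal basis of R^3 (entries in thirds) is an assumption, chosen to be consistent with the coefficients 3, 1, 2 in the text:

```python
def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

# Assumed orthonormal basis of R^3 (entries in thirds) and target vector
v1 = [2/3, 2/3, 1/3]
v2 = [-2/3, 1/3, 2/3]
v3 = [1/3, -2/3, 2/3]
x = [2.0, 1.0, 3.0]

coeffs = [dot(v, x) for v in (v1, v2, v3)]   # expected: 3, 1, 2
# Rebuild x from the coefficients: sum of c_i * v_i, entry by entry
recon = [coeffs[0] * v1[i] + coeffs[1] * v2[i] + coeffs[2] * v3[i]
         for i in range(3)]
```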
Example 7.
Find the angle between the vectors
$$\vec{x} = \begin{bmatrix} 1 \\ 0 \\ 0 \\ 0 \end{bmatrix} \quad \text{and} \quad \vec{y} = \begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix}.$$
Solution
$$\cos\theta = \frac{\vec{x} \cdot \vec{y}}{\|\vec{x}\|\,\|\vec{y}\|} = \frac{1}{1 \cdot 2} = \frac{1}{2}, \qquad \theta = \frac{\pi}{3}.$$
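The same cosine formula can be packaged as a small routine; a sketch applied to an illustrative pair of vectors whose angle works out to $\pi/3$:

```python
import math

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

def norm(v):
    return math.sqrt(dot(v, v))

def angle(x, y):
    # theta = arccos( (x . y) / (||x|| ||y||) )
    return math.acos(dot(x, y) / (norm(x) * norm(y)))

theta = angle([1.0, 0.0, 0.0, 0.0], [1.0, 1.0, 1.0, 1.0])  # cos(theta) = 1/2
```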
Correlation
Consider two characteristics of a population, with deviation vectors $\vec{x}$ and $\vec{y}$. There is a positive correlation between the two characteristics if (and only if) $\vec{x} \cdot \vec{y} > 0$.
Definition 5.1.12 Correlation coefficient
The correlation coefficient $r$ between two characteristics of a population is the cosine of the angle between the deviation vectors $\vec{x}$ and $\vec{y}$ for the two characteristics:
$$r = \cos\theta = \frac{\vec{x} \cdot \vec{y}}{\|\vec{x}\|\,\|\vec{y}\|}.$$
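A minimal sketch of the correlation coefficient in Python, computing deviation vectors from hypothetical sample data (the data values are made up for illustration):

```python
import math

def correlation(xs, ys):
    # Build deviation vectors by subtracting each mean, then take the
    # cosine of the angle between them: r = (x . y) / (||x|| ||y||)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    dx = [v - mx for v in xs]
    dy = [v - my for v in ys]
    num = sum(a * b for a, b in zip(dx, dy))
    den = math.sqrt(sum(a * a for a in dx)) * math.sqrt(sum(b * b for b in dy))
    return num / den

# Hypothetical data: perfectly correlated, and perfectly anti-correlated
r_pos = correlation([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
r_neg = correlation([1.0, 2.0, 3.0], [3.0, 2.0, 1.0])
```

As with the angle formula above, $r$ always lies between $-1$ and $1$.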