Chapter 3
Vector Spaces
Section 3.1
Euclidean n-Space
Euclid of Alexandria (about 320-260 B.C.)
Euclid of Alexandria is a famous mathematician of antiquity, best known for his treatise on mathematics, The Elements, a textbook on plane geometry that summarized the works of the Golden Age of Greek Mathematics. However, little is known of Euclid's life except that he taught at Alexandria in Egypt.
Geometric vectors (Discussion 3.1.1)
u = v, u ≠ w and u ≠ x
[Figure: arrows u, v, w, x illustrating equal and unequal vectors]
Geometric vectors (Discussion 3.1.1)
[Figure: parallelogram law for the sum u + v]
Note that u + v = v + u.
Geometric vectors (Discussion 3.1.1)
[Figure: the negative −u and the difference u − v]
Note that u − v = u + (−v).
Geometric vectors (Discussion 3.1.1)
[Figure: scalar multiples (1/2)u, 2u and (−1.5)u]
If c is positive, the vector cu has the same direction as u and its length is c times the length of u.
If c is negative, the vector cu has the reverse direction of u and its length is |c| times the length of u.
0u = 0 is the zero vector.
(−1)u = −u is the negative of u.
Coordinate systems: xy-plane (Discussion 3.1.2.1)
[Figure: u + v in the xy-plane, with components u1, u2, v1, v2 marked from the origin (0, 0)]
So u + v = (u1 + v1, u2 + v2).
Coordinate systems: xy-plane (Discussion 3.1.2.1)
[Figure: cu in the xy-plane, with components cu1, cu2]
So cu = (cu1, cu2).
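The two componentwise formulas can be checked with a short Python sketch (the helper names add and scale are ours, not from the slides):

```python
# Componentwise vector addition and scalar multiplication in the xy-plane,
# matching u + v = (u1 + v1, u2 + v2) and cu = (cu1, cu2).
def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def scale(c, u):
    return tuple(c * a for a in u)

u, v = (1, 2), (3, -1)
print(add(u, v))               # (4, 1)
print(scale(2, u))             # (2, 4)
print(add(u, v) == add(v, u))  # True: u + v = v + u
```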
Coordinate systems: xyz-space (Discussion 3.1.2.2)
[Figure: the xyz-coordinate system]
Lines in ℝ² (Example 3.1.8.3 (a))
A line in ℝ² can be expressed implicitly as
{ (x, y) | ax + by = c }
where a, b, c are real constants and a, b are not both zero.
Explicitly, the line can also be expressed as
{ ( (c − bt)/a , t ) | t ∈ ℝ } if a ≠ 0;
or { ( t , (c − at)/b ) | t ∈ ℝ } if b ≠ 0.
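As a sanity check, sample points of the explicit form ((c − bt)/a, t) can be substituted back into the implicit equation; the particular values a = 2, b = 3, c = 6 below are ours, chosen with a ≠ 0:

```python
# Substitute the parametrization ((c - b*t)/a, t) back into a*x + b*y = c.
a, b, c = 2.0, 3.0, 6.0
for t in (-1.0, 0.0, 0.5, 4.0):
    x, y = (c - b * t) / a, t
    assert abs(a * x + b * y - c) < 1e-12
print("all sample points satisfy a*x + b*y = c")
```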
Planes in ℝ³ (Example 3.1.8.3 (b))
A plane in ℝ³ given implicitly by { (x, y, z) | ax + by + cz = d } can also be expressed explicitly as
{ ( s , (d − as − ct)/b , t ) | s, t ∈ ℝ } if b ≠ 0;
or { ( s , t , (d − as − bt)/c ) | s, t ∈ ℝ } if c ≠ 0.
Lines in ℝ³ (Example 3.1.8.3 (c))
A line in ℝ³ through the point (a0, b0, c0) in the direction of the vector (a, b, c) can be expressed as
{ (a0, b0, c0) + t(a, b, c) | t ∈ ℝ }.
[Figure: the line through (a0, b0, c0) with direction vector (a, b, c)]
Section 3.2
Linear Combinations and
Linear Spans
Linear combinations (Definition 3.2.1)
Let u1, u2, ..., uk be vectors in ℝⁿ. For any real numbers c1, c2, ..., ck, the vector
c1u1 + c2u2 + ··· + ckuk
is called a linear combination of u1, u2, ..., uk.
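A linear combination is just a componentwise weighted sum; a minimal Python sketch (the helper name is ours):

```python
# c1*u1 + c2*u2 + ... + ck*uk, computed componentwise.
def linear_combination(coeffs, vectors):
    n = len(vectors[0])
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vectors)) for i in range(n))

u1, u2 = (2, 1, -1), (1, 0, 3)
print(linear_combination((1, 2), (u1, u2)))  # (4, 1, 5)
```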
Examples (Example 3.2.2.1 (a))

    [ 2   1   3 | 3 ]   Gaussian       [ 2    1     3   |  3  ]
    [ 1  −1   0 | 3 ]   ----------->   [ 0  −3/2  −3/2  | 3/2 ]
    [ 3   2   5 | 4 ]   Elimination    [ 0    0     0   |  0  ]

    [ 2   1   3 | 1 ]   Gaussian       [ 2    1     3   |  1  ]
    [ 1  −1   0 | 2 ]   ----------->   [ 0  −3/2  −3/2  | 3/2 ]
    [ 3   2   5 | 4 ]   Elimination    [ 0    0     0   |  3  ]

The first system is consistent, while the second has the row ( 0 0 0 | 3 ) and is inconsistent.
Let V = { (2a + b, a, 3b − a) | a, b ∈ ℝ } ⊆ ℝ³.
For any a, b ∈ ℝ,
(2a + b, a, 3b − a) = a(2, 1, −1) + b(1, 0, 3).
So V = span{ (2, 1, −1), (1, 0, 3) }.
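The identity in this example can be probed numerically on a few sample values of a and b:

```python
# Check (2a + b, a, 3b - a) = a*(2, 1, -1) + b*(1, 0, 3) on sample values.
ok = True
for a in (-2, 0, 1, 3):
    for b in (-1, 0, 2):
        lhs = (2 * a + b, a, 3 * b - a)
        rhs = tuple(a * p + b * q for p, q in zip((2, 1, -1), (1, 0, 3)))
        ok = ok and lhs == rhs
print(ok)  # True
```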
Examples (Example 3.2.4.4)
    [ 1 1 0 | x ]   Gaussian       [ 1 1 0 | x         ]
    [ 0 1 1 | y ]   ----------->   [ 0 1 1 | y         ]
    [ 1 0 1 | z ]   Elimination    [ 0 0 2 | z − x + y ]

Since every column on the left has a pivot, the system is consistent for all (x, y, z).
Show that
span{ (1, 1, 1), (1, 2, 0), (2, 1, 3), (2, 3, 1) } ≠ ℝ³.

    [ 1 1 2 2 | x ]   Gaussian       [ 1 1  2 2 | x          ]
    [ 1 2 1 3 | y ]   ----------->   [ 0 1 −1 1 | y − x      ]
    [ 1 0 3 1 | z ]   Elimination    [ 0 0  0 0 | z − 2x + y ]

Whenever z − 2x + y ≠ 0 the system is inconsistent, so such (x, y, z) do not lie in the span.
Let A be the matrix whose i-th column is ui :

        [ a11  a21  ···  ak1 ]
    A = [ a12  a22  ···  ak2 ]
        [  ⁞    ⁞         ⁞  ]
        [ a1n  a2n  ···  akn ]
    [ 1 1 0 ]   Gaussian       [ 1 1 0 ]
    [ 0 1 1 ]   ----------->   [ 0 1 1 ]
    [ 1 0 1 ]   Elimination    [ 0 0 2 ]

So span{ (1, 0, 1), (1, 1, 0), (0, 1, 1) } = ℝ³.
    [ 1 1 2 2 ]   Gaussian       [ 1 1  2 2 ]
    [ 1 2 1 3 ]   ----------->   [ 0 1 −1 1 ]
    [ 1 0 3 1 ]   Elimination    [ 0 0  0 0 ]

So span{ (1, 1, 1), (1, 2, 0), (2, 1, 3), (2, 3, 1) } ≠ ℝ³.
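Both spanning questions above reduce to a rank computation; a NumPy sketch, where matrix_rank stands in for counting nonzero rows after Gaussian elimination:

```python
import numpy as np

# A set of vectors spans R^3 exactly when the matrix having them as rows
# has rank 3 (no zero row survives elimination).
A = np.array([(1, 0, 1), (1, 1, 0), (0, 1, 1)], dtype=float)
B = np.array([(1, 1, 1), (1, 2, 0), (2, 1, 3), (2, 3, 1)], dtype=float)
rank_A = np.linalg.matrix_rank(A)
rank_B = np.linalg.matrix_rank(B)
print(rank_A, rank_B)  # 3 2
```

Since rank_B < 3, the second set spans only a plane in ℝ³.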
When span(S) = ℝⁿ (Theorem 3.2.7 & Example 3.2.8)
Let S = { u1, u2, ..., uk } ⊆ ℝⁿ. Then:
1. 0 ∈ span(S).
2. For any v1, v2, ..., vr ∈ span(S) and c1, c2, ..., cr ∈ ℝ,
   c1v1 + c2v2 + ··· + crvr ∈ span(S).
Proof:
1. 0 = 0u1 + 0u2 + ··· + 0uk ∈ span(S).
2. Write v1 = a11u1 + a12u2 + ··· + a1kuk ,
         v2 = a21u1 + a22u2 + ··· + a2kuk ,
         ⁞
         vr = ar1u1 + ar2u2 + ··· + arkuk .
   Then c1v1 + c2v2 + ··· + crvr
      = (c1a11 + c2a21 + ··· + crar1)u1 + ··· + (c1a1k + c2a2k + ··· + crark)uk ∈ span(S).
Some basic results (Theorem 3.2.9)
Similarly,
u2 is a linear combination of v1, v2
⇔ the linear system
      a + 2b = 1
      2a − b = 1
      3a + b = 2
   is consistent;
and
u3 is a linear combination of v1, v2
⇔ the linear system
      a + 2b = −1
      2a − b = 2
      3a + b = 1
   is consistent.
Examples (Example 3.2.11.1)
a + 2b = 1 a + 2b = 1 a + 2b = −1
2a − b = 0 2a − b = 1 2a − b = 2
3a + b = 1 3a + b = 2 3a + b = 1
The row operations required to solve the three systems are the same, so we can work them out together:

    [ 1  2 | 1 1 −1 ]   Gaussian       [ 1  2 |  1  1 −1 ]
    [ 2 −1 | 0 1  2 ]   ----------->   [ 0 −5 | −2 −1  4 ]
    [ 3  1 | 1 2  1 ]   Elimination    [ 0  0 |  0  0  0 ]
The three systems are consistent, i.e. all ui are linear
combinations of v1 and v2 .
So span{ u1, u2, u3 } ⊆ span{ v1, v2 }.
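These consistency checks can be replayed in NumPy, with v1 = (1, 2, 3) and v2 = (2, −1, 1) read off from the coefficient columns of the three systems and the ui from their right-hand sides; a zero least-squares residual certifies that each system is consistent:

```python
import numpy as np

# u_i lies in span{v1, v2} iff [v1 v2] x = u_i has a solution; lstsq gives
# the best-fit x, and V @ x == u_i certifies consistency.
V = np.array([(1, 2, 3), (2, -1, 1)], dtype=float).T  # columns v1, v2
in_span = []
for u in ((1, 0, 1), (1, 1, 2), (-1, 2, 1)):
    x, *_ = np.linalg.lstsq(V, np.array(u, dtype=float), rcond=None)
    in_span.append(bool(np.allclose(V @ x, u)))
print(in_span)  # [True, True, True]
```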
Examples (Example 3.2.11.1)

    [ 1 1 −1 | 1  2 ]   Gaussian       [ 1 1 −1 | 1  2 ]
    [ 0 1  2 | 2 −1 ]   ----------->   [ 0 1  2 | 2 −1 ]
    [ 1 2  1 | 3  1 ]   Elimination    [ 0 0  0 | 0  0 ]

The systems are consistent, so v1 and v2 are linear combinations of u1, u2, u3, i.e. span{ v1, v2 } ⊆ span{ u1, u2, u3 }.
Examples (Example 3.2.11.1)
    [ 1 −1 −1  1  0  2 ]   Gaussian       [ 1 −1 −1  1  0  2 ]
    [ 1  1  1  0  1  1 ]   ----------->   [ 0  2  2 −1  1 −1 ]
    [ 1 −1  1  0 −1 −1 ]   Elimination    [ 0  0  2 −1 −1 −3 ]
    [ 1  1 −1  1  2  4 ]                  [ 0  0  0  0  0  0 ]

    [ 1  0  2  1 −1 −1 ]   Gaussian       [ 1  0  2  1 −1 −1 ]
    [ 0  1  1  1  1  1 ]   ----------->   [ 0  1  1  1  1  1 ]
    [ 0 −1 −1  1 −1  1 ]   Elimination    [ 0  0  0  2  0  2 ]
    [ 1  2  4  1  1 −1 ]                  [ 0  0  0  0  0  0 ]
In ℝ², if u = (u1, u2), then
span{ u } = { (cu1, cu2) | c ∈ ℝ } = { (x, y) | u2x − u1y = 0 },
a line through the origin.
[Figure: span{ u } is a line through the origin; span{ u, v } is a plane through the origin]
Geometrical interpretation (Discussion 3.2.14.2)
Let L be a line in ℝ² or ℝ³.
Pick a point x on L
(where x is regarded as a vector joining the origin to the point)
and a nonzero vector u such that span{ u } is a line L0
through the origin and parallel to L.
Explicitly,
L = { x + w | w ∈ L0 }
  = { x + w | w ∈ span{ u } }
  = { x + tu | t ∈ ℝ }.
[Figure: L is the translate of L0 = span{ u } by x; a typical point is x + tu]
Planes in ℝ³ (Discussion 3.2.15.2)
Let P be a plane in ℝ³.
Pick a point x on P
(where x is regarded as a vector joining the origin to the point)
and two nonzero vectors u and v such that span{ u, v }
is a plane P0 containing the origin and parallel to P.
Explicitly,
P = { x + w | w ∈ P0 }
  = { x + w | w ∈ span{ u, v } }
  = { x + su + tv | s, t ∈ ℝ }.
[Figure: P is the translate of P0 = span{ u, v } by x; a typical point is x + su + tv]
Lines and planes in ℝⁿ (Discussion 3.2.15)
Section 3.3
Subspaces
Subspaces (Discussion 3.3.1)
Let V be a subset of ℝⁿ.
V is called a subspace of ℝⁿ
if V = span(S) where S = { u1, u2, ..., uk } for some
vectors u1, u2, ..., uk ∈ ℝⁿ.
More precisely, we say that
V is a subspace spanned by S;
or V is a subspace spanned by u1, u2, ..., uk.
We also say that
S spans V;
or u1, u2, ..., uk span V.
Trivial subspaces (Remark 3.3.3.1-2)
Let V1 = { (a + 4b, a) | a, b ∈ ℝ } ⊆ ℝ².
Let V2 = { (x, y, z) | x + y − z = 0 } ⊆ ℝ³.
Let V3 = { (1, a) | a ∈ ℝ } ⊆ ℝ².
Let V4 = { (x, y, z) | x² ≤ y² ≤ z² } ⊆ ℝ³.
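A single numerical counterexample already shows V3 fails closure under addition (the sample points below are ours):

```python
# V3 = { (1, a) } is not closed under addition, hence not a subspace of R^2.
in_V3 = lambda p: p[0] == 1
p, q = (1, 2), (1, 5)
s = (p[0] + q[0], p[1] + q[1])       # p + q = (2, 7)
print(in_V3(p), in_V3(q), in_V3(s))  # True True False
```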
Section 3.4
Linear Independence
Redundant vectors (Discussion 3.4.1)
u3 is a redundant
vector.
Let S = { u } ⊆ ℝⁿ.
Let S = { u, v } ⊆ ℝⁿ.
S1 is linearly dependent.
(The equation c1(1, 0) + c2(0, 4) + c3(2, 4) = (0, 0) has non-trivial solutions.)
S2 is linearly independent.
(The equation c1(−1, 0, 0) + c2(0, 3, 0) + c3(0, 0, 7) = (0, 0, 0) has only the trivial solution.)
In particular,
1. In ℝ², a set of three or more vectors must be linearly dependent;
2. In ℝ³, a set of four or more vectors must be linearly dependent.
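The two examples above reduce to a rank computation; a NumPy sketch:

```python
import numpy as np

# {u1, ..., uk} is linearly independent iff the matrix with the u_i as
# columns has rank k (the homogeneous system has only the trivial solution).
S1 = np.array([(1, 0), (0, 4), (2, 4)], dtype=float).T          # 3 vectors in R^2
S2 = np.array([(-1, 0, 0), (0, 3, 0), (0, 0, 7)], dtype=float).T
S1_independent = np.linalg.matrix_rank(S1) == S1.shape[1]
S2_independent = np.linalg.matrix_rank(S2) == S2.shape[1]
print(S1_independent, S2_independent)  # False True
```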
Proof of the theorem (Theorem 3.4.7)
[Figures: three xy-plane panels showing u and v collinear or not collinear, and panels showing u, v, w lying on a plane P or P′ through the origin]
Section 3.5
Bases
Vector spaces and subspaces (Discussion 3.5.1)
(b)

    [ 1 2 3 ]   Gaussian       [ 1 2   3  ]
    [ 2 9 3 ]   ----------->   [ 0 5  −3  ]
    [ 1 0 4 ]   Elimination    [ 0 0 −1/5 ]

There are no zero rows.
Thus (by Discussion 3.2.5) span(S) = ℝ³.
Solution: Solving
a(1, 2, 1) + b(2, 9, 0) + c(3, 3, 4) = (5, −1, 9),
we obtain exactly one solution: a = 1, b = −1, c = 2,
i.e. v = (1, 2, 1) − (2, 9, 0) + 2(3, 3, 4).
The coordinate vector of v relative to S is
(v)S = (1, −1, 2).
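The same coordinates can be found numerically by solving the linear system (a NumPy sketch):

```python
import numpy as np

# (v)_S solves A c = v, where the columns of A are the vectors of
# S = { (1, 2, 1), (2, 9, 0), (3, 3, 4) }.
A = np.array([(1, 2, 1), (2, 9, 0), (3, 3, 4)], dtype=float).T
v = np.array([5.0, -1.0, 9.0])
c = np.linalg.solve(A, v)
print(np.round(c))  # [ 1. -1.  2.]
```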
Examples (Example 3.5.9.1 (b))
Let v = (2, 3) ∈ ℝ² and S1 = { (1, 0), (0, 1) }.
Since (2, 3) = 2(1, 0) + 3(0, 1),
(v)S1 = (2, 3).
[Figure: v relative to the standard basis (1, 0), (0, 1)]
Examples (Example 3.5.9.2 (b))
Let v = (2, 3) ∈ ℝ² and S2 = { (1, −1), (1, 1) }.
Since (2, 3) = −(1/2)(1, −1) + (5/2)(1, 1),
(v)S2 = (−1/2, 5/2).
[Figure: v relative to the basis (1, −1), (1, 1)]
Examples (Example 3.5.9.2 (c))
Let v = (2, 3) ∈ ℝ² and S3 = { (1, 0), (1, 1) }.
Since (2, 3) = −(1, 0) + 3(1, 1),
(v)S3 = (−1, 3).
[Figure: v relative to the basis (1, 0), (1, 1)]
Standard basis for ℝⁿ (Example 3.5.9.3)
Section 3.6
Dimensions
Size of bases (Theorem 3.6.1 & Remark 3.6.2)
This means that every basis for V has the same size k.
Proof of the theorem (Theorem 3.6.1)
has n vectors.)
Except for { 0 } and ℝ² itself, subspaces of ℝ² are lines through the origin, which are of dimension 1.
Except for { 0 } and ℝ³ itself, subspaces of ℝ³ are either lines through the origin, which are of dimension 1, or planes containing the origin, which are of dimension 2.
An example (Example 3.6.4.4)
Solution spaces (Discussion 3.6.5)
    [  2  2 −1  0  1 | 0 ]   Gauss-Jordan    [ 1 1 0 0 1 | 0 ]
    [ −1 −1  2 −3  1 | 0 ]   ------------>   [ 0 0 1 0 1 | 0 ]
    [  0  0  1  1  1 | 0 ]   Elimination     [ 0 0 0 1 0 | 0 ]
    [  1  1 −2  0 −1 | 0 ]                   [ 0 0 0 0 0 | 0 ]
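The dimension of this solution space can be confirmed with the rank-nullity relation (a NumPy sketch of the coefficient matrix above):

```python
import numpy as np

# Nullity = number of unknowns - rank(A); this is the dimension of the
# solution space of the homogeneous system A x = 0.
A = np.array([( 2,  2, -1,  0,  1),
              (-1, -1,  2, -3,  1),
              ( 0,  0,  1,  1,  1),
              ( 1,  1, -2,  0, -1)], dtype=float)
nullity = A.shape[1] - np.linalg.matrix_rank(A)
print(nullity)  # 2
```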
An example (Example 3.6.6)
Solution:
c1u1 + c2u2 + c3u3 = 0
⇒ c1(2, 0, −1) + c2(4, 0, 7) + c3(−1, 1, 4) = (0, 0, 0)
⇒ c1 = 0, c2 = 0, c3 = 0.
So { u1, u2, u3 } is linearly independent.
Since dim(ℝ³) = 3, (by Theorem 3.6.7) { u1, u2, u3 } is a basis for ℝ³.
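The independence check amounts to a determinant computation; a NumPy sketch with the vectors u1 = (2, 0, −1), u2 = (4, 0, 7), u3 = (−1, 1, 4) from this example:

```python
import numpy as np

# {u1, u2, u3} is a basis for R^3 iff the matrix with these vectors as
# columns is invertible, i.e. has nonzero determinant.
A = np.array([(2, 0, -1), (4, 0, 7), (-1, 1, 4)], dtype=float).T
d = np.linalg.det(A)
print(np.isclose(d, -18.0))  # True: invertible, so the set is a basis
```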
Dimensions of subspaces (Theorem 3.6.9)
[Figure: a subspace U contained in a subspace V, both containing the origin]
Invertible matrices (Theorem 3.6.11)
To prove 7 ⇔ 1:
Let A = [ a1  a2  ···  an ], where ai is the i-th column of A.
Section 3.7
Transition Matrices
Coordinate vectors (Notation 3.7.1)
S = { u1, u2, u3 },
where u1 = (1, 0, −1), u2 = (0, −1, 0), u3 = (1, 0, 2),
T = { v1, v2, v3 },
where v1 = (1, 1, 1), v2 = (1, 1, 0), v3 = (−1, 0, 0).
Let w be such that (w)S = (2, −1, 2). Find (w)T.

                   [ −1  0  2 ] [  2 ]   [  2 ]
    [w]T = P[w]S = [  1 −1 −2 ] [ −1 ] = [ −1 ]
                   [ −1 −1 −1 ] [  2 ]   [ −3 ]

So (w)T = (2, −1, −3).
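The same computation can be done in NumPy by recovering w from (w)S and then solving for its coordinates relative to T:

```python
import numpy as np

# Columns of S and T are the basis vectors u1, u2, u3 and v1, v2, v3.
S = np.array([(1, 0, -1), (0, -1, 0), (1, 0, 2)], dtype=float).T
T = np.array([(1, 1, 1), (1, 1, 0), (-1, 0, 0)], dtype=float).T
w = S @ np.array([2.0, -1.0, 2.0])  # w = 2*u1 - u2 + 2*u3
w_T = np.linalg.solve(T, w)         # coordinates of w relative to T
print(np.round(w_T))  # [ 2. -1. -3.]
```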
Examples (Example 3.7.4.2)
An observation:
Given a matrix A = (aij)m×n, let ei be the column vector whose i-th entry is 1 and whose other entries are 0. Then

           [ a11 ··· a1,i−1  a1i  a1,i+1 ··· a1n ] [ 0 ]
           [ a21 ··· a2,i−1  a2i  a2,i+1 ··· a2n ] [ ⁞ ]
    Aei =  [  ⁞        ⁞      ⁞     ⁞         ⁞  ] [ 1 ]  ← the i-th entry
           [ am1 ··· am,i−1  ami  am,i+1 ··· amn ] [ ⁞ ]
                                                   [ 0 ]

           [ a11·0 + ··· + a1,i−1·0 + a1i·1 + a1,i+1·0 + ··· + a1n·0 ]
        =  [                         ⁞                               ]
           [ am1·0 + ··· + am,i−1·0 + ami·1 + am,i+1·0 + ··· + amn·0 ]

        =  the i-th column of A.
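A quick NumPy check of this observation (the 2×3 matrix A below is ours):

```python
import numpy as np

# A e_i extracts the i-th column of A.
A = np.array([[1, 2, 3],
              [4, 5, 6]], dtype=float)
e2 = np.array([0.0, 1.0, 0.0])  # the 2nd standard basis vector e_2
print(A @ e2)  # [2. 5.]
```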
Transition matrices (Theorem 3.7.5)