Lecture 13 Aurora
$$\begin{bmatrix} a_n \\ a_{n-1} \\ \vdots \\ a_1 \\ a_0 \end{bmatrix}$$
For polynomials with real coefficients and degree less than or equal to $n$, there is a natural equivalence to the space $\mathbb{R}^{n+1}$. The difference lies in the interpretation of the vectors in this space.
As an example, use this vector representation to represent the 3rd-order polynomial $2x^3 - x^2 + x$ as an element of the space of polynomials with degree at most 3.

Answer:
$$v = \begin{bmatrix} 2 \\ -1 \\ 1 \\ 0 \end{bmatrix}$$
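As a quick numerical illustration of this correspondence, here is a minimal NumPy sketch (the coefficient ordering, highest degree first, matches the vector above; `np.polyval` is a standard NumPy function):

```python
import numpy as np

# Coefficient vector for p(x) = 2x^3 - x^2 + x, ordered from
# the highest-degree term (x^3) down to the constant term.
v = np.array([2.0, -1.0, 1.0, 0.0])

# Evaluating the polynomial at x = 2: 2*8 - 4 + 2 = 14.
print(np.polyval(v, 2.0))  # 14.0

# Adding polynomials corresponds to adding their coefficient vectors.
w = np.array([0.0, 1.0, 0.0, 5.0])   # q(x) = x^2 + 5
print(np.polyval(v + w, 2.0))        # p(2) + q(2) = 14 + 9 = 23
```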
Example 6, An example that is not a vector space:
So far almost everything we proposed has been a vector space. For illustration, we give an example that is not a vector space. Consider the set of real 2D vectors, i.e., $\mathbb{R}^2$, but redefine the operation of vector addition, $+$.
$$+ : V \times V \to V, \qquad v + u = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} + \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} = \begin{bmatrix} \min(v_1, u_1) \\ \min(v_2, u_2) \end{bmatrix}$$
Now we check to see if the properties of vector spaces still hold.
The set is closed under the new addition,
$$\begin{bmatrix} v_1 \\ v_2 \end{bmatrix} + \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} = \begin{bmatrix} \min(v_1, u_1) \\ \min(v_2, u_2) \end{bmatrix} \in V.$$
Commutativity holds,
$$\begin{bmatrix} v_1 \\ v_2 \end{bmatrix} + \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} = \begin{bmatrix} \min(v_1, u_1) \\ \min(v_2, u_2) \end{bmatrix} = \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} + \begin{bmatrix} v_1 \\ v_2 \end{bmatrix}.$$
Associativity holds,
__
v
1
v
2
_
+
_
u
1
u
2
__
+
_
w
1
w
2
_
=
_
min(v
1
, u
1
)
min(v
2
, u
2
)
_
+
_
w
1
w
2
_
=
_
min(v
1
, u
1
, w
1
)
min(v
2
, u
2
, w
2
)
_
= v + (u +w).
Existence of a zero element fails! For any vector $v = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} \in \mathbb{R}^2$, there must be an element $z$, also in $\mathbb{R}^2$, such that $v + z = v$. The minimum of two numbers gives back the lower number, so the entries of $z$ would have to be greater than all possible entries of $v$. But in the set of reals there is no largest number. We could think of choosing $z = \begin{bmatrix} \infty \\ \infty \end{bmatrix}$. However, this $z$ is not in $\mathbb{R}^2$, by virtue of the fact that $\infty \notin \mathbb{R}$. Hence this property fails, as does the existence of additive inverses.
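As a quick numerical illustration, here is a minimal Python sketch of this modified addition (the helper name `min_add` is ours, purely for illustration); it shows that any candidate identity $z$ fails as soon as some entry of $v$ exceeds the corresponding entry of $z$:

```python
import numpy as np

def min_add(v, u):
    # The redefined "addition": entrywise minimum.
    return np.minimum(v, u)

v = np.array([3.0, -1.0])
u = np.array([2.0,  5.0])

# Commutativity does hold for the entrywise minimum.
print(min_add(v, u))                                 # [ 2. -1.]
print(np.array_equal(min_add(v, u), min_add(u, v)))  # True

# But no z can act as a zero element: whatever z we pick,
# a vector with larger entries is not returned unchanged.
z   = np.array([1e9, 1e9])
big = np.array([2e9, 2e9])
print(min_add(big, z))  # [1.e+09 1.e+09] != big, so z is not an identity
```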
3 Basis and Dimension
A basis for a vector space $V$ is a minimal set of vectors, $\{v_1, \ldots, v_r\}$, for which all other vectors in the space can be described as $\sum_{i=1}^{r} c_i v_i$, for some coefficients $c_i \in F$. The minimal number of vectors needed to form a basis is called the dimension of the space.
In other words, a vector space V may be described by all possible linear combinations (the span) of its basis
vectors.
Example, A Basis for $\mathbb{R}^3$:
$$e_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \quad e_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \quad e_3 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}.$$
Can we reach all points in $\mathbb{R}^3$ by linearly combining these vectors?
Take an arbitrary point, $\begin{bmatrix} a \\ b \\ c \end{bmatrix}$, where $a, b, c$ are any real values. We can express
$$\begin{bmatrix} a \\ b \\ c \end{bmatrix} = a \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} + b \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} + c \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} = a e_1 + b e_2 + c e_3.$$
Clearly, $\mathrm{span}\{e_1, e_2, e_3\} = \mathbb{R}^3$. In fact, these particular vectors are called the standard basis for $\mathbb{R}^3$, and they have the nice property that they are orthogonal to each other and each have unit norm, forming an orthonormal basis. We can also show this set is minimal; i.e., we cannot span $\mathbb{R}^3$ with a smaller set of vectors. To show why, we need to define the concept of linear independence (next section).
Question: Is the choice of basis unique?
Answer: No, there are infinitely many choices of basis for a vector space. For example, consider these vectors as an alternative basis for $\mathbb{R}^3$.
$$v_1 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \quad v_2 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \quad v_3 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}.$$
Again we can represent an arbitrary point, $\begin{bmatrix} a \\ b \\ c \end{bmatrix}$, as a linear combination of these vectors:
$$\begin{bmatrix} a \\ b \\ c \end{bmatrix} = c \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} + (b - c) \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} + (a - b) \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}.$$
Therefore this set of vectors spans all of $\mathbb{R}^3$. It is also a minimal set, because the vectors $\{v_i\}$ are linearly independent.
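A minimal NumPy sketch to double-check this decomposition numerically, solving for the coefficients rather than reading them off (variable names are ours):

```python
import numpy as np

# Columns are the alternative basis vectors v1, v2, v3.
B = np.array([[1.0, 1.0, 1.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])

a, b, c = 4.0, 7.0, -2.0
p = np.array([a, b, c])

# Solve B @ coeffs = p for the coordinates in this basis.
coeffs = np.linalg.solve(B, p)
print(coeffs)      # [c, b - c, a - b] = [-2.  9. -3.]
print(B @ coeffs)  # recovers [ 4.  7. -2.]
```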
4 Linear Independence of Vectors
A set of vectors $v_1, \ldots, v_m$ is linearly independent if none of the vectors can be written as a linear combination of the others. That is, for every $k = 1, \ldots, m$, there is no set of coefficients $\{c_i\} \subset F$ such that
$$v_k = \sum_{i \neq k} c_i v_i.$$
In other words, within the set $\{v_i\}$ no vector can be represented as a linear combination of the other vectors.
Now we can show why a basis must consist of linearly independent vectors.
Suppose we had a basis for $V$, a set of vectors $v_1, v_2, \ldots, v_m$, and suppose that some vector in the set is a linear combination of the others:
$$v_k = \sum_{i \neq k} c_i v_i, \quad \text{for some } \{c_i\} \subset F.$$
This means that $v_k$ is in the span of the other vectors, $v_k \in \mathrm{span}\{v_i : i \in [1, m],\ i \neq k\}$. Hence, any vector in $V$ that is a combination of $v_k$ and the other vectors could be written simply as a combination of the other vectors. This contradicts the requirement that a basis be a minimal set. Therefore, to be a basis, all of $v_1, v_2, \ldots, v_m$ must be linearly independent.
Example:
Check whether the vectors from the basis example are linearly independent.
$$v_1 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \quad v_2 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \quad v_3 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$$
Can we write $v_1 = \alpha v_2 + \beta v_3$, for some $\alpha, \beta \in \mathbb{R}$?
$$\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} = \alpha \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} + \beta \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$$
There is no way to get a 1 into the last coordinate, so $v_1$ is independent of $\{v_2, v_3\}$. We would also need to check that $v_2$ is independent of $\{v_1, v_3\}$, and that $v_3$ is independent of $\{v_1, v_2\}$. However, there is a way for us to check all of the combinations at once.
4.1 Checking if $n$ Vectors in $\mathbb{R}^m$ are Linearly Independent
We have a set of vectors, $\{v_i \in \mathbb{R}^m : i = 1, 2, \ldots, n\}$. The set is linearly dependent if there is some $k$ for which
$$v_k = \sum_{i \neq k} c_i v_i,$$
which is equivalent to
$$v_k - \left( \sum_{i \neq k} c_i v_i \right) = 0.$$
Multiplying this relation by a nonzero value $c_k$, we have
$$c_k v_k - \left( \sum_{i \neq k} c_k c_i v_i \right) = 0.$$
Define a new set of coefficients $x_i$ for $i = 1, 2, \ldots, n$, where
$$x_i = \begin{cases} c_k & \text{for } i = k \\ -c_k c_i & \text{for } i \neq k \end{cases}$$
so we have
$$x_k v_k + \sum_{i \neq k} x_i v_i = \sum_{i=1}^{n} x_i v_i = 0.$$
Write this in matrix form.
$$\begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = 0 \qquad (5)$$
$$A x = 0 \qquad (6)$$
where there is a requirement that at least one coefficient of $x$ be nonzero. The set $\{v_i\}$ being linearly independent means that there is no $x$ with at least one nonzero coordinate such that the matrix $A = \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix}$ times $x$ yields the zero vector. Since $A0 = 0$, there is always the solution $x = 0$, which is called the trivial solution. We conclude that the columns of $A$ are linearly independent when $x = 0$ is the only solution to $Ax = 0$, and conversely they are dependent if there are solutions other than $x = 0$.
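This criterion translates directly into a rank computation: $Ax = 0$ has only the trivial solution exactly when $A$ has full column rank. A minimal NumPy sketch (the helper name `are_independent` is ours):

```python
import numpy as np

def are_independent(vectors):
    # Stack the vectors as columns of A; they are linearly
    # independent iff Ax = 0 has only the trivial solution,
    # i.e., iff A has full column rank.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

v1 = np.array([1.0, 1.0, 1.0])
v2 = np.array([1.0, 1.0, 0.0])
v3 = np.array([1.0, 0.0, 0.0])

print(are_independent([v1, v2, v3]))       # True
print(are_independent([v1, v2, v1 + v2]))  # False: third is dependent
```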
We now use this method to check the example basis for linear independence.
$$v_1 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \quad v_2 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \quad v_3 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix},$$
so we look for $x$ solving
$$\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = 0.$$
The last row implies $x_1 = 0$. The second row implies $x_1 + x_2 = 0$, and since we know $x_1 = 0$, $x_2$ must also be 0. The first row says $x_1 + x_2 + x_3 = 0$, so we conclude $x_3$ must also be 0. The only solution to $Ax = 0$ is $x = 0$, the trivial solution, so these columns are linearly independent and the set is therefore minimal. Since the vectors are a minimal set spanning $\mathbb{R}^3$, they are indeed a basis for $\mathbb{R}^3$.
5 Finding a Basis and Properties of the Basis
Example: Let $V$ be the set of vectors
$$V = \left\{ v \in \mathbb{R}^3 : v = \alpha \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} + \beta \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \quad \alpha, \beta \in \mathbb{R} \right\} \qquad (7)$$
One can check that this is a vector space by going through the properties.
Let's give a basis for $V$. One possibility:
$$v_1 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \quad v_2 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \qquad (8)$$
We first check that they span $V$. They do, by the definition of $V$ as the set of all linear combinations of these vectors. Next we check whether these vectors are linearly independent. For just two vectors, this means there is no $c \neq 0$ for which $v_1 = c v_2$. The two vectors are linearly independent, so this is a basis for $V$. Since there are two vectors in the basis, the dimension of $V$ is 2.
Notice that although the vectors within the space are 3-dimensional, the vector space itself is said to have dimension 2, since that is the number of vectors needed to give a basis for the space. This $V$ is called a subspace of $\mathbb{R}^3$, since it has lower dimension than the space it is embedded in.
Can we find another basis for this $V$? We notice that the vectors in $V$ are exactly those whose 3rd coordinate is zero. Therefore another choice of basis is
$$v_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \quad v_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \qquad (9)$$
This basis is special. It is called orthonormal, since for all $i \neq j$, $v_i^T v_j = 0$, and for every $i$, $\|v_i\| = 1$. An orthonormal basis is one where all basis vectors have unit norm and are orthogonal to each other. We will see that these are very convenient bases.
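A quick NumPy check of both conditions at once, via the Gram matrix $B^T B$, which equals the identity exactly when the columns of $B$ are orthonormal:

```python
import numpy as np

# Columns are the candidate orthonormal basis vectors.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# Gram matrix: entry (i, j) is v_i^T v_j. Orthonormality means
# off-diagonal entries are 0 and diagonal entries are 1.
print(B.T @ B)                           # [[1. 0.] [0. 1.]]
print(np.allclose(B.T @ B, np.eye(2)))   # True
```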
Remark on dimension: Now let $V$ be given by
$$V = \left\{ v \in \mathbb{R}^3 : v = \alpha \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}, \quad \alpha \in \mathbb{R} \right\} \qquad (10)$$
The span of $0$ is a set with a single element, namely $\{0\}$. It would seem, by our definition of dimension, that this vector space $V$ has dimension 1, because it can be described by all linear combinations of the zero vector. However, the convention is that this space has dimension 0. This makes sense because we think of a plane as having 2 dimensions and a line as having 1 dimension, so a point should have dimension 0.
6 Inner Product
An inner product is a mapping of two vectors to a scalar, i.e., $\langle \cdot, \cdot \rangle : V \times V \to F$, which satisfies three properties.
i. Positivity:
$$\forall v \in V, \quad \langle v, v \rangle \geq 0, \qquad \text{and} \quad \langle v, v \rangle = 0 \iff v = 0$$
ii. Conjugate symmetry:
$$\forall x, y \in V, \quad \langle x, y \rangle = \overline{\langle y, x \rangle}$$
iii. Linearity:
$$\forall x, y, z \in V, \quad \langle x + y, z \rangle = \langle x, z \rangle + \langle y, z \rangle$$
$$\forall x, y \in V,\ \forall \alpha \in F, \quad \langle \alpha x, y \rangle = \alpha \langle x, y \rangle$$
Example:
We consider an inner product in $\mathbb{R}^2$:
$$\langle v, u \rangle = v^T u$$
First check that this is indeed a scalar:
$$v^T u = \begin{bmatrix} v_1 & v_2 \end{bmatrix} \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} = v_1 u_1 + v_2 u_2$$
We can additionally check positivity,
$$\langle v, v \rangle = v^T v = v_1^2 + v_2^2,$$
which is greater than or equal to 0 for all real vectors, and can only be 0 when $v = 0$.
Conjugate symmetry,
$$\langle v, u \rangle = v^T u = v_1 u_1 + v_2 u_2 = u_1 v_1 + u_2 v_2 = \langle u, v \rangle.$$
In this case, since the field is real, the inner product is simply symmetric.
Linearity,
$$\langle \alpha v, u \rangle = (\alpha v)^T u = \alpha \langle v, u \rangle$$
$$\langle v + y, u \rangle = (v + y)^T u = v^T u + y^T u = \langle v, u \rangle + \langle y, u \rangle$$
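A minimal NumPy sketch confirming these properties numerically for the dot product on $\mathbb{R}^2$ (variable names are ours):

```python
import numpy as np

def inner(v, u):
    # The inner product <v, u> = v^T u on R^2.
    return float(v @ u)

v = np.array([3.0, -1.0])
u = np.array([2.0,  5.0])
y = np.array([0.5,  4.0])
alpha = -2.0

print(inner(v, v) >= 0)                                        # positivity: True
print(np.isclose(inner(v, u), inner(u, v)))                    # symmetry: True
print(np.isclose(inner(alpha * v, u), alpha * inner(v, u)))    # scaling: True
print(np.isclose(inner(v + y, u), inner(v, u) + inner(y, u)))  # additivity: True
```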