
MATH 4A - Linear Algebra With Applications: Lecture 17: Subspaces From Linear Transformations/matrices, & Bases


Linear transformations Range/column space Kernels/null spaces Linear independence and bases

MATH 4A - Linear Algebra with Applications


Lecture 17: Subspaces from linear transformations/matrices, &
bases

8 May 2019

Reading: §4.2-4.4
Recommended problems from §4.2: 1, 5, 7, 9, 11, 13, 19, 23, 25,
26, 27, 35
Recommended problems from §4.3: 1, 3, 5, 7, 9, 11, 13, 15, 21,
22, 23, 33

Lecture plan

1 Linear transformations

2 Range/column space

3 Kernels/null spaces

4 Linear independence and bases



A not so new definition

We can generalize the definition of a linear transformation Rn → Rm


by replacing Rn and Rm with abstract vector spaces:

Let V and W be two vector spaces. A linear transformation


T : V → W is a function that assigns to each vector x in V a
unique vector T (x) in W such that
(i) T (u + v) = T (u) + T (v) for all u, v in V , and
(ii) T (cu) = cT (u) for all u in V and scalars c.

Just as before

Many of the manipulations we performed on linear transformations


Rn → Rm work for abstract linear transformations V → W .

For example, the same argument as before shows that for any linear
transformation T : V → W

T (0) = 0,

that is, T takes the 0 vector of V to the 0 vector of W .

We can also talk about onto (for every w in W there exists v in V


such that T (v) = w) and one-to-one transformations (if
T (v) = T (u) then v = u).

Example: derivatives

Recall (from calculus), that if f (t) and g (t) are two differentiable,
R-valued functions with domain R, then
d/dt (f + g )(t) = df/dt (t) + dg/dt (t)

and

d/dt (cf )(t) = c df/dt (t).
In other words, the derivative is linear!

To be more precise, we first need a vector space


There is a vector space called the space of smooth functions on R,
which we denote C ∞ (R):
1 the vectors in C ∞ (R) are R-valued functions f (t) that are
infinitely differentiable, meaning
d^n f/dt^n (t)
exists for all n = 0, 1, 2, 3, . . . .
2 The zero vector is the zero function 0(t) = 0.
3 Addition of vectors is usual addition of functions:

(f + g )(t) = f (t) + g (t)

and scalar multiplication is just

(cf )(t) = c · f (t).



Examples of vectors in C ∞ (R) are the functions:

f (t) = e^(2t) − sin t and g (t) = 3t^3 − 2t + 1, with

(f + g )(t) = e^(2t) − sin t + 3t^3 − 2t + 1.

In fact, Pn is a subspace of C ∞ (R) for all n = 1, 2, 3, . . . .



With these details, we can be more precise

The derivative is a linear transformation


D : C ∞ (R) → C ∞ (R)
f (t) ↦ df/dt (t).

(Pretentious jargon: when a linear transformation acts on a vector


space of functions, we usually call it a linear operator.)
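As an illustrative aside (not part of the slides; assumes the SymPy library), the two linearity properties of D can be spot-checked on specific smooth functions:

```python
import sympy as sp

# Check T(u + v) = T(u) + T(v) and T(cu) = cT(u) for the derivative
# operator D on two specific smooth functions. A spot check on
# examples, of course, not a proof of linearity.
t = sp.symbols('t')
f = sp.exp(2 * t) - sp.sin(t)
g = 3 * t**3 - 2 * t + 1
c = 5

# D(f + g) = D(f) + D(g)
assert sp.simplify(sp.diff(f + g, t) - (sp.diff(f, t) + sp.diff(g, t))) == 0
# D(c f) = c D(f)
assert sp.simplify(sp.diff(c * f, t) - c * sp.diff(f, t)) == 0
```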

iClicker 1

Is the derivative map D : C ∞ (R) → C ∞ (R) an onto map?


(a) Yup
(b) Nope

(Hint: is every smooth function the derivative of some other


function?)

iClicker 2

Is the derivative map D : P3 → P3 an onto map?


(a) Yup
(b) Nope

(Hint: is every degree 3 polynomial the derivative of some other


degree 3 polynomial?)

iClicker 3

Is the derivative map D : P3 → P2 an onto map?


(a) Yup
(b) Nope

(Hint: is every degree 2 polynomial the derivative of some degree 3


polynomial?)

iClicker 4

Is the derivative map D : P3 → P3 a one-to-one map?


(a) Yup
(b) Nope

(Hint: can two degree 3 polynomials have the same derivative?)



Definition

Let T : V → W be a linear transformation between two vector


spaces. The range of T is the set of all vectors in W of the form
T (v) for some v in V .

Theorem: the range of a linear transformation is a subspace of its codomain
By definition, the range of T : V → W is a subset of W . We
should show it satisfies the three conditions necessary to be a
subspace:
1 Every linear transformation takes 0 to 0, so 0 is in the range.

2 If u and v are in the range of T , then there exist x and y in
V such that
T (x) = u and T (y) = v.
Since T is a linear transformation,
T (x + y) = T (x) + T (y) = u + v,
hence u + v is also in the range of T .
3 If u is in the range of T and c is a scalar, then there exists a
vector x in V such that T (x) = u. Since T is a linear
transformation,
T (cx) = cT (x) = cu,
hence cu is also in the range of T .

Example

For all n = 1, 2, 3, . . . , the range of D : Pn → Pn is Pn−1 . Why?

By the power rule for derivatives, if p(t) has degree at most n,
then dp/dt (t) has degree at most n − 1. Thus, the range of D is a
subspace of Pn−1 .

On the other hand, any polynomial q(t) of degree at most n − 1


has an antiderivative of degree at most n, so Pn−1 is a subspace of
the range of D.
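Both halves of this argument can be spot-checked with SymPy (an illustrative sketch, not part of the slides), here for n = 3:

```python
import sympy as sp

# Differentiating a degree-3 polynomial gives degree 2...
t = sp.symbols('t')
p = 3 * t**3 - 2 * t + 1
assert sp.degree(sp.diff(p, t), t) == 2

# ...and a polynomial of degree <= 2 has a polynomial antiderivative
# of degree <= 3, so P2 is contained in the range of D on P3.
q = 7 * t**2 - t + 4
Q = sp.integrate(q, t)          # an antiderivative of q
assert sp.diff(Q, t) - q == 0   # D(Q) = q
assert sp.degree(Q, t) == 3
```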

How to understand the range in terms of matrices?

We know that every linear transformation T : Rn → Rm can be


represented by a matrix A. (In fact, we will generalize this in a
couple of days to linear transformations between abstract vector
spaces V → W .) So how should we understand the range of T in
terms of A?

Well, given any vector v in the domain Rn , we know T (v) = Av is


a linear combination of the columns of A. Conversely, any linear
combination of the columns of A can be expressed as Av for some
v. Thus, the range of T is the span of the columns of A.

It couldn’t get much more explicit than that!
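A small numerical illustration (assuming NumPy; the matrix and vector are arbitrary choices): Av is exactly the linear combination of the columns of A with weights given by the entries of v.

```python
import numpy as np

# A v = v_1 a_1 + ... + v_n a_n, where a_j are the columns of A,
# so the range of x -> A x is the span of the columns.
A = np.array([[1.0, 0.0, 1.0, 1.0],
              [0.0, 1.0, 0.0, 2.0],
              [-1.0, 0.0, 1.0, 4.0]])
v = np.array([2.0, -1.0, 3.0, 0.5])

combo = sum(v[j] * A[:, j] for j in range(A.shape[1]))
assert np.allclose(A @ v, combo)
```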



Definition

The column space of an m × n matrix A is the subspace of Rm


spanned by the columns of A. We denote it Col A.

We just showed that if A is the standard matrix for a linear


transformation T : Rn → Rm , then Col A is the same thing as the
range of T .

Definition

Let T : V → W be a linear transformation between two vector


spaces. The kernel (or null space) of T is the set of all vectors u in
V such that T (u) = 0.

Informally: the kernel of T is the set of vectors in V that get


“killed” by T .

Theorem: the kernel of a linear transformation is a subspace of its domain

By definition, the kernel of T : V → W is a subset of V . We


should show it satisfies the three conditions necessary to be a
subspace:
1 Every linear transformation takes 0 to 0, so 0 is in the kernel.
2 If u and v are in the kernel of T , then

T (u + v) = T (u) + T (v) = 0 + 0 = 0

hence u + v is also in the kernel of T .


3 If u is in the kernel of T and c is a scalar, then

T (cu) = cT (u) = c0 = 0

hence cu is also in the kernel of T .



iClicker 5

Does the derivative map D : P3 → P3 have any nonzero vectors in


its kernel?
(a) Mhm
(b) Naw

(Hint: can a nonzero polynomial have a derivative that is 0?)



How to interpret kernels in terms of matrices?

Well, we want the set of all vectors u in Rn such that

T (u) = Au = 0.

In other words, the kernel/null space of T is the set of solutions to


the homogeneous linear system Ax = 0.

So let’s extend our definition to matrices

If A is any m × n matrix, then the null space/kernel of A is the


subspace of Rn consisting of solutions to the homogeneous
equation Ax = 0. In set notation:

Nul A = Ker A = {x : x is in Rn and Ax = 0}.



Now that we have an abstract definition, how do we compute a
description of the null space of a matrix explicitly?

Row reduction!

Nul A is the set of solutions to the homogeneous system Ax = 0.


We already know how to solve this:
1 Form the augmented matrix (A 0)
2 Apply the algorithm to put it in reduced echelon form
3 Solve for a parametric vector form of the general solution.
This allows us to write Nul A as the span of a set of linearly
independent vectors (the number of vectors will be the
number of free variables).

Example

When we augment the matrix


 
     [ 1 0 1 1 ]
A =  [ 0 1 0 2 ]
     [−1 0 1 4 ]

with a 0 column, the result has reduced echelon form

     [ 1 0 0 −3/2 0 ]
E =  [ 0 1 0   2  0 ]
     [ 0 0 1  5/2 0 ]

...so a general solution looks like


   3   3 
x1 2 x4 2
x2   −2x4   −2 
x =   =  5  = x4  5 
     = x4 u
x3 − 2 x4 −2
x4 x4 1

where u = ( 23 , −2, − 52 , 1).

We conclude that
Nul A = Span{u}.
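The worked example can be double-checked with SymPy (illustrative; SymPy's `nullspace` performs the same kind of row reduction):

```python
import sympy as sp

A = sp.Matrix([[1, 0, 1, 1],
               [0, 1, 0, 2],
               [-1, 0, 1, 4]])

# One free variable in the reduced echelon form -> one basis vector.
basis = A.nullspace()
assert len(basis) == 1
assert A * basis[0] == sp.zeros(3, 1)

# The vector u = (3/2, -2, -5/2, 1) from the slide is indeed in Nul A.
u = sp.Matrix([sp.Rational(3, 2), -2, sp.Rational(-5, 2), 1])
assert A * u == sp.zeros(3, 1)
```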

Comparing null and column spaces

Let A be an m × n matrix. The null and column spaces of A are


very different! (They don’t even live in the same place!)
Nul A                                    Col A
subspace of the domain Rn                subspace of the codomain Rm
implicitly defined                       explicitly defined
(must row reduce (A 0)                   (just take the span of the columns
to find example vectors)                 to find example vectors)
easy to tell if u is in Nul A            requires work to tell if w is in Col A
(just compute Au)                        (must row reduce (A w))
Nul A = {0} if and only if               Col A = Rm if and only if
A is one-to-one                          A is onto

See table in book for more comparison.
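The two membership tests in the table can be sketched numerically (assuming NumPy; the rank comparison stands in for row reducing (A w)):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0, 1.0],
              [0.0, 1.0, 0.0, 2.0],
              [-1.0, 0.0, 1.0, 4.0]])

# Is u in Nul A? Easy: just compute A u and compare with 0.
u = np.array([1.5, -2.0, -2.5, 1.0])
assert np.allclose(A @ u, 0)

# Is w in Col A? More work: check that A x = w is consistent, here by
# comparing rank(A) with the rank of the augmented matrix (A | w).
w = A @ np.array([1.0, 2.0, 3.0, 4.0])   # in Col A by construction
aug = np.column_stack([A, w])
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(aug)
```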



Lingering issue

At the moment, our concrete understanding of images and kernels


(among other things!) in terms of matrices only works for linear
transformations Rn → Rm .

We would like to find a similarly concrete description for linear
transformations between abstract vector spaces V → W . To this
end, in the next few days we will figure out how to associate
matrices to abstract linear transformations. First, we need to
discuss bases of vector spaces, which generalize the standard basis
vectors of Rn .

Yet another not so new definition

A set of vectors {v1 , v2 , . . . , vp } in a vector space V is linearly


independent if the vector equation

c1 v1 + c2 v2 + · · · + cp vp = 0

has only the trivial solution.

In the case V = Rn , this is the exact same definition from before.


We can similarly generalize the definition of linearly dependent and
linear dependence relation from Rn to abstract vector spaces V .

Recall

The standard basis of Rn is the set of n vectors {e1 , e2 , . . . , en }


where

     [1]        [0]                 [0]
e1 = [0] , e2 = [1] , · · · , en =  [0] .
     [⋮]        [⋮]                 [⋮]
     [0]        [0]                 [1]
This set of vectors has two properties that work especially nicely
together:
1 The vectors span Rn .
2 The vectors are linearly independent.
The first condition says we can generate any vector v in Rn as a
linear combination of the ei , and the second condition implies that
there is exactly one way to write v as such a linear combination.

Key Idea: any set of vectors in a vector space V with these two
properties is “as good as” the standard basis of Rn

In the next few lectures, we will use such sets of vectors to put
“coordinates” on V , allowing us to identify V with Rn . Then we
can use all of the matrix algebra we know and love to solve
problems in V .

Definition

Let H be a subspace of a vector space V . A basis of H is a set of


vectors B = {b1 , b2 , . . . , bp } in V such that
(i) B is a linearly independent set, and
(ii) H = Span B = Span{b1 , b2 , . . . , bp }.
(Note: condition (ii) forces B to be a subset of H, not just a
subset of V .)

Examples

Note that if V is a vector space, H = V is always a subspace of V .


So it makes sense to talk about a basis of V .

The motivating example is the standard basis of Rn . It is clearly a


basis.

But Rn has lots of other bases. If we choose any n vectors
“randomly” (so that we avoid “probability 0 events”), then they
almost surely form a basis of Rn . E.g. consider the set of vectors
     
 8 1 −2 
B=  −1 , −4 , 1  .
   
−1 0 0
 

Examples

There are many ways to determine whether B is a basis.

Here’s one way: form the matrix whose columns are the vectors in B,

     [ 8 −1 −1 ]
A =  [ 1 −4  0 ]
     [−2  1  0 ]

and compute its determinant:

det A = 7.

Thus A is invertible, which tells us its column vectors are linearly


independent (because A is one-to-one) and span R3 (because A is
onto).

We conclude B is a basis for R3 .
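A quick numerical confirmation of the determinant computation (assuming NumPy):

```python
import numpy as np

# det A != 0 means A is invertible, so its columns are linearly
# independent and span R^3 -- exactly the two conditions for a basis.
A = np.array([[8.0, -1.0, -1.0],
              [1.0, -4.0, 0.0],
              [-2.0, 1.0, 0.0]])

d = np.linalg.det(A)
assert abs(d - 7.0) < 1e-6   # det A = 7, as computed on the slide
```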



Examples

Let S = {1, t, t^2 , . . . , t^n }. I claim S is a basis for Pn . Why?

If p(t) = c0 + c1 t + c2 t^2 + · · · + cn t^n , then clearly

p(t) = c0 · 1 + c1 · t + c2 · t^2 + · · · + cn · t^n ,

which shows S spans Pn .

To see S is linearly independent, suppose some linear combination
of the vectors in S satisfies

c0 · 1 + c1 · t + c2 · t^2 + · · · + cn · t^n = 0.

From algebra, we know that the only way this is possible is if


c0 = c1 = · · · = cn = 0, which means the only linear dependence
relation on S is the trivial one. Thus S is linearly independent.
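As an illustrative check with SymPy: the coordinates of a polynomial in the basis S are just its coefficients, and the zero polynomial has all coefficients zero (the trivial dependence relation).

```python
import sympy as sp

t = sp.symbols('t')

# Coordinates of p in the basis {1, t, t^2, t^3} are its coefficients.
p = sp.Poly(2 * t**3 - 5 * t**2 + 4, t)
assert p.all_coeffs() == [2, -5, 0, 4]   # listed highest degree first

# The only combination equal to the zero polynomial is the trivial one.
zero = sp.Poly(0, t)
assert zero.all_coeffs() == [0]
```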
