
MATH 2300 Sample Proofs
R. Borgersen

This document contains a number of theorems, the proofs of which are at a difficulty level
where they could be put on a test or exam. This should not be taken as an indication that
the only theorems on tests or exams will be taken from this document, nor that every (or
any) theorem in this document need be tested. This document is for information purposes
only. Questions marked with a * are a little harder than the others, and the more stars, the
harder the question, but all are testable. More solutions will be provided as time allows.

VECTOR SPACE PROOFS

1. Prove that for any set of vectors S = {v1 , . . . , vn } in a vector space V , span(S) is a
subspace of V .

Solution: Let u, w ∈ span(S), k ∈ R. Then there exist c1 , . . . , cn ∈ R and
k1 , . . . , kn ∈ R such that
u = c1 v1 + c2 v2 + . . . + cn vn
and
w = k1 v1 + k2 v2 + . . . + kn vn .

A1)

u + w = (c1 v1 + c2 v2 + . . . + cn vn ) + (k1 v1 + k2 v2 + . . . + kn vn )
= (c1 + k1 )v1 + (c2 + k2 )v2 + . . . + (cn + kn )vn ∈ span(S).

Thus span(S) is closed under addition.


M1)

ku = k(c1 v1 + c2 v2 + . . . + cn vn )
= k(c1 v1 ) + k(c2 v2 ) + . . . + k(cn vn )
= (kc1 )v1 + (kc2 )v2 + . . . + (kcn )vn ∈ span(S).

Thus span(S) is closed under scalar multiplication.


Thus by the subspace theorem, span(S) is a subspace of V .

2. Prove that if S is a linearly independent set of vectors, then S is a basis for span(S).

Solution: To be a basis for span(S), it must be linearly independent and span the
space. Certainly the span of S is equal to the entire space, span(S) (by definition),
and S is given to be linearly independent in the question. Thus S is a basis for
span(S).

3. Show that if A is an m × n matrix, then the solution set V to the equation Ax = 0 is a
subspace of Rn .

Solution: A1) Let x1 , x2 ∈ Rn be two solutions to the equation Ax = 0 (that is,
x1 , x2 ∈ V ). Then x1 + x2 ∈ Rn , and

A(x1 + x2 ) = Ax1 + Ax2
= 0 + 0
= 0.

Thus x1 + x2 ∈ V .
M1) Let x1 ∈ V , k ∈ R. Then kx1 ∈ Rn , and

A(kx1 ) = k(Ax1 )
= k(0) = 0.

Thus kx1 ∈ V as well.


Thus by the subspace theorem, V is a subspace of Rn .

4. Prove that any finite set of vectors containing the zero vector is linearly dependent.

Solution: Let S = {0, v1 , . . . , vn }. Then the equation

c1 0 + c2 v1 + . . . + cn+1 vn = 0

has the solution c1 = 1, c2 = c3 = · · · = cn+1 = 0, which is not all zeros, and thus
the set S is linearly dependent.

5. Prove that if S = {v1 , v2 , . . . , vn } is a basis for a vector space V , then every vector
v ∈ V can be expressed as a linear combination of the elements of S in exactly one way.

Solution: Let v ∈ V . Let c1 , . . . , cn , k1 , . . . , kn ∈ R be such that

v = c1 v1 + c2 v2 + . . . + cn vn

and
v = k1 v1 + k2 v2 + . . . + kn vn .
Subtracting the second expression from the first, we get

0 = v − v = (c1 − k1 )v1 + (c2 − k2 )v2 + . . . + (cn − kn )vn .

But since S is a basis, S is linearly independent, and thus the only way this could
happen is if c1 − k1 = 0, ..., cn − kn = 0. Thus c1 = k1 , . . . , cn = kn and so v can
only be written as a linear combination of the elements of S in one way.


6. Given that {u, v, w} is a linearly independent set of vectors in some vector space V ,
prove that:
(a) the set {u, v} is linearly independent.
(b) the set {u, u + v} is linearly independent.
(c) the set {u + v, v + w} is linearly independent.
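
Solution (sketch; one possible argument, not an official solution): For (b), suppose
c1 u + c2 (u + v) = 0. Then (c1 + c2 )u + c2 v + 0w = 0, and since {u, v, w} is linearly
independent, c1 + c2 = 0 and c2 = 0, so c1 = c2 = 0. Parts (a) and (c) are similar:
rewrite the dependence equation as a linear combination of u, v, and w, and use the
independence of {u, v, w} to force all coefficients to be zero.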

7. Let u, v ∈ R3 be nonzero vectors such that u • v = 0. Prove that {u, v} is a linearly
independent set.
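
Solution (sketch): Suppose c1 u + c2 v = 0. Dotting both sides with u gives
c1 (u • u) + c2 (v • u) = 0, and since v • u = u • v = 0 and u ̸= 0, this reduces to
c1 ||u||2 = 0, forcing c1 = 0. Dotting with v instead forces c2 = 0. Thus {u, v} is
linearly independent.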

8. Let V be a vector space. Prove that for every u ∈ V , 0 · u = 0.

Solution: Let u ∈ V . Then

0 · u = (0 + 0) · u (since 0 + 0 = 0)
= 0 · u ⊕ 0 · u. (axiom M 3)

By axiom A5, there exists −(0 · u). Then,

0 · u ⊕ −(0 · u) = 0 (by A5),

and

0 · u ⊕ −(0 · u) = (0 · u ⊕ 0 · u) ⊕ −(0 · u) (by above)
= 0 · u ⊕ (0 · u ⊕ −(0 · u)) (by A3)
= 0 · u ⊕ 0 (by A5)
= 0 · u. (by A4)

Therefore
0 · u = 0.


9. Let V be a vector space. Prove that for every k ∈ R, k · 0 = 0.

Solution: Let k ∈ R. Then

k · 0 = k · (0 ⊕ 0) (by A4)
= k · 0 ⊕ k · 0 (by M 2).

By A5, there exists −(k · 0). Then,

k · 0 ⊕ −(k · 0) = 0, (by A5)

but

k · 0 ⊕ −(k · 0) = (k · 0 ⊕ k · 0) ⊕ −(k · 0) (by above)
= k · 0 ⊕ (k · 0 ⊕ −(k · 0)) (by A3)
= k · 0 ⊕ 0 (by A5)
= k · 0. (by A4)

Therefore
k · 0 = 0.


10. Let V be a vector space. Prove that for every u ∈ V , (−1) · u = −u.

Solution: Let u ∈ V . Then,

(−1) · u ⊕ u = (−1) · u ⊕ 1 · u (by M 5)
= (−1 + 1) · u (by M 3)
= 0 · u
= 0 (by question 8).

Therefore (−1) · u is an additive inverse of u, and since additive inverses in a vector
space are unique, −u = (−1) · u.


11. Let V be a vector space. Prove that if for some k ∈ R and u ∈ V , k · u = 0, then
either k = 0, or u = 0.

Solution: Let k ∈ R, u ∈ V , and assume that k · u = 0. If k = 0, then the conclusion
holds immediately, so assume that k ̸= 0. Then

(1/k) · (k · u) = ((1/k)k) · u (by M 4)
= 1 · u
= u (by M 5),

but

(1/k) · (k · u) = (1/k) · 0 (by assumption)
= 0 (by question 9).

Therefore u = 0.


12. Prove that a set of vectors is linearly dependent if and only if at least one vector in
the set is a linear combination of the others.
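
Solution (sketch): (⇒) If S = {v1 , . . . , vn } is linearly dependent, then
c1 v1 + . . . + cn vn = 0 for some c1 , . . . , cn not all zero, say ci ̸= 0. Solving for vi
expresses it as a linear combination of the others: vi = −(c1 /ci )v1 − . . . − (cn /ci )vn
(with the ith term omitted). (⇐) If some vi is a linear combination of the others,
moving vi to the other side gives a dependence equation whose ith coefficient is
−1 ̸= 0, so S is linearly dependent.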

13. Let A be an m×n matrix. Prove that if both the set of rows of A and the set of columns
of A form linearly independent sets, then A must be square.

Solution: Let r1 , . . . , rm ∈ Rn be the rows of A and let c1 , . . . , cn ∈ Rm be the
columns of A. Since the set of rows is linearly independent, and the rows are elements
of Rn , it must be that m ≤ n (by question 18). Similarly, since the set of columns is
linearly independent, and the columns are elements of Rm , it must be that n ≤ m.
Thus m = n.

∗∗ 14. Let V be the set of 2 × 2 matrices, together with the operation ⊕ defined for any 2 × 2
matrices A and B as
A ⊕ B = AB (the usual matrix multiplication),
and with the standard scalar multiplication for matrices.
(a) Show that the vector space axiom A4 holds.
(b) Prove that V is not a vector space.
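
Solution (sketch; one possible argument): (a) The identity matrix I plays the role of
the zero vector: for every A ∈ V , A ⊕ I = AI = A and I ⊕ A = IA = A, so A4 holds.
(b) Axiom A5 fails: the zero matrix 0 ∈ V has no additive inverse under ⊕, since for
every B ∈ V , 0 ⊕ B = 0B = 0 ̸= I. Thus V is not a vector space.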
∗∗ 15. Let
V = {(a, b) ∈ R2 : a > 0, b > 0}
together with the operations defined as follows: for (a, b), (c, d) ∈ V , k ∈ R,
(a, b) ⊕ (c, d) = (ac, bd)
k · (a, b) = (a^k , b^k ).
(a) Show that the vector space axiom M 3 holds in this space.
(b) Does the axiom A4 hold in this space? If so, find the zero vector and prove it is the
zero vector. If not, show that there is no possible zero vector.
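
Solution (sketch, with M 3 the axiom (k + m) · u = k · u ⊕ m · u, as used in question
8): (a) For (a, b) ∈ V and k, m ∈ R,
(k + m) · (a, b) = (a^(k+m) , b^(k+m) ) = (a^k a^m , b^k b^m ) = (a^k , b^k ) ⊕ (a^m , b^m ) = k · (a, b) ⊕ m · (a, b),
so M 3 holds. (b) Yes: the zero vector is (1, 1) ∈ V , since for every (a, b) ∈ V ,
(a, b) ⊕ (1, 1) = (a · 1, b · 1) = (a, b).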
∗∗ 16. Let V be a vector space, and let W1 and W2 be subspaces of V . Prove that the set
U = {v : v ∈ W1 and v ∈ W2 }
(that is, the set of vectors in BOTH W1 and W2 ) is a subspace of V as well.

Solution: A1) Let u, v ∈ U . Then u, v ∈ W1 and u, v ∈ W2 . Then since A1 holds
in W1 , u + v ∈ W1 , and since A1 holds in W2 , u + v ∈ W2 as well. Thus u + v ∈ U
and so A1 holds.
M1) Let u ∈ U , and let k ∈ R. Then u ∈ W1 and u ∈ W2 , and so since M 1 holds
in W1 , ku ∈ W1 , and since M 1 holds in W2 , ku ∈ W2 as well. Thus ku ∈ U , and so
M 1 holds.
Thus, by the subspace theorem, U is a subspace of V .

∗∗ 17. Let W be a subspace of a vector space V , and let v1 , v2 , v3 ∈ W . Prove that
every linear combination of these vectors is also in W .

Solution: Let c1 v1 + c2 v2 + c3 v3 be a linear combination of v1 , v2 , v3 . Since W is
a subspace (and thus closed under scalar multiplication, M 1), we know that c1 v1 ,
c2 v2 , and c3 v3 are all in W as well. Then since W is closed under addition (A1),
we know that c1 v1 + c2 v2 is also in W . Then applying closure
under addition (A1) again, we get that

(c1 v1 + c2 v2 ) + c3 v3 = c1 v1 + c2 v2 + c3 v3 ∈ W.

∗∗ 18. Let S = {v1 , . . . , vr } be a set of vectors in Rn . Prove that if r > n, then S is linearly
dependent.

Solution: Assume r > n, and assume that for each i, 1 ≤ i ≤ r,

vi = (vi,1 , vi,2 , . . . , vi,n ).

Let c1 , c2 , . . . , cr ∈ R be such that

c1 v1 + . . . + cr vr = 0.

This produces the homogeneous system of equations:

[ v1,1 v2,1 · · · vr,1 ] [ c1 ]   [ 0 ]
[ v1,2 v2,2 · · · vr,2 ] [ c2 ]   [ 0 ]
[  ..   ..         ..  ] [ .. ] = [ .. ]
[ v1,n v2,n · · · vr,n ] [ cr ]   [ 0 ]

The coefficient matrix of this system has n rows and r columns. But r > n, so the
system is guaranteed to have at least one free variable (parameter). Since it is a
homogeneous system, it has at least one solution, and therefore infinitely many. Thus there
is at least one solution for the ci ’s that is not all zero, and so the set S is linearly
dependent.

LINEAR TRANSFORMATION PROOFS

19. Prove that the range of a linear transformation T : V → W is a subspace of W .
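
Solution (sketch): Let w1 , w2 ∈ T (V ) (the range), say w1 = T (v1 ) and w2 = T (v2 ),
and let k ∈ R. Then w1 + w2 = T (v1 ) + T (v2 ) = T (v1 + v2 ) ∈ T (V ), and
kw1 = kT (v1 ) = T (kv1 ) ∈ T (V ). Thus by the subspace theorem, T (V ) is a
subspace of W .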

20. Prove that given two linear transformations T1 : U → V and T2 : V → W , the composi-
tion T2 ◦ T1 : U → W is also a linear transformation.
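
Solution (sketch): Let u, v ∈ U and k ∈ R. Using the linearity of T1 and then of T2 ,
(T2 ◦ T1 )(u + v) = T2 (T1 (u) + T1 (v)) = T2 (T1 (u)) + T2 (T1 (v)) = (T2 ◦ T1 )(u) + (T2 ◦ T1 )(v),
and similarly (T2 ◦ T1 )(ku) = T2 (kT1 (u)) = kT2 (T1 (u)) = k(T2 ◦ T1 )(u). Thus
T2 ◦ T1 is linear.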

21. Prove that for any linear transformation T : V → W , ker(T ) is a subspace of V .
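
Solution (sketch): Let u, v ∈ ker(T ) and k ∈ R. Then T (u + v) = T (u) + T (v) =
0 + 0 = 0, so u + v ∈ ker(T ), and T (ku) = kT (u) = k0 = 0, so ku ∈ ker(T ). Thus
by the subspace theorem, ker(T ) is a subspace of V .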

22. Prove that if T1 : U → V is one-to-one, and T2 : V → W is one-to-one, then the
composition T2 ◦ T1 : U → W is also one-to-one.
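
Solution (sketch): Let u, v ∈ U and assume (T2 ◦ T1 )(u) = (T2 ◦ T1 )(v), that is,
T2 (T1 (u)) = T2 (T1 (v)). Since T2 is one-to-one, T1 (u) = T1 (v), and since T1 is
one-to-one, u = v. Thus T2 ◦ T1 is one-to-one.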

23. Prove that if T1 : U → V is onto, and T2 : V → W is onto, then the composition
T2 ◦ T1 : U → W is also onto.
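
Solution (sketch): Let w ∈ W . Since T2 is onto, there exists v ∈ V with T2 (v) = w,
and since T1 is onto, there exists u ∈ U with T1 (u) = v. Then (T2 ◦ T1 )(u) =
T2 (T1 (u)) = T2 (v) = w, so T2 ◦ T1 is onto.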

24. Prove that for any one-to-one linear transformation T : V → W , T −1 : T (V ) → V is
also a one-to-one linear transformation.
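
Solution (sketch): Since T is one-to-one, T −1 : T (V ) → V is a well-defined function.
If T −1 (w1 ) = T −1 (w2 ) = v, then w1 = T (v) = w2 , so T −1 is one-to-one. For
linearity, write w1 = T (v1 ) and w2 = T (v2 ); then w1 + w2 = T (v1 + v2 ), so
T −1 (w1 + w2 ) = v1 + v2 = T −1 (w1 ) + T −1 (w2 ), and kw1 = T (kv1 ) gives
T −1 (kw1 ) = kv1 = kT −1 (w1 ).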

25. Prove that for any m × n matrix A, TA : Rn → Rm defined by

TA (v) = Av

is a linear transformation.
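
Solution (sketch): Let u, v ∈ Rn and k ∈ R. By the properties of matrix arithmetic,
TA (u + v) = A(u + v) = Au + Av = TA (u) + TA (v), and TA (kv) = A(kv) =
k(Av) = kTA (v). Thus TA is a linear transformation.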

26. If T : V → W is a linear transformation, then prove each of the following:

• If T is one-to-one, then ker(T ) = {0}.
• If ker(T ) = {0}, then T is one-to-one.
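
Solution (sketch): For the first part, suppose T is one-to-one and let u ∈ ker(T ).
Then T (u) = 0 = T (0) (the latter being a property of linear transformations), and
since T is one-to-one, u = 0; thus ker(T ) = {0}. For the second, suppose ker(T ) =
{0} and T (u) = T (v). Then T (u − v) = T (u) − T (v) = 0, so u − v ∈ ker(T ) = {0},
and thus u = v.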

27. If V is a finite-dimensional vector space, and T : V → V is a linear operator, then
prove that if T is one-to-one, then the range of T is all of V .
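
Solution (sketch): If T is one-to-one, then ker(T ) = {0} (by question 26), so
nullity(T ) = 0. By the Dimension Theorem, rank(T ) = dim(V ) − nullity(T ) =
dim(V ). Thus the range T (V ) is a subspace of V whose dimension equals the (finite)
dimension of V , and so T (V ) = V .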

28. Prove that if T : V → W is an isomorphism between V and W (one-to-one and onto),
and B = {v1 , v2 , . . . , vn } is a basis for V , then T (B) = {T (v1 ), T (v2 ), . . . , T (vn )} is a
basis for W .
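
Solution (sketch): Spanning: let w ∈ W . Since T is onto, w = T (v) for some v ∈ V ,
and writing v = c1 v1 + . . . + cn vn gives w = c1 T (v1 ) + . . . + cn T (vn ), so T (B)
spans W . Independence: if c1 T (v1 ) + . . . + cn T (vn ) = 0, then T (c1 v1 + . . . + cn vn ) = 0,
and since T is one-to-one, ker(T ) = {0}, so c1 v1 + . . . + cn vn = 0, forcing
c1 = · · · = cn = 0 by the independence of B. Thus T (B) is a basis for W .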

29. Prove that every vector space of dimension n is isomorphic to Rn .
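
Solution (sketch): Let B = {v1 , . . . , vn } be a basis for V , and define T : V → Rn by
T (v) = (c1 , . . . , cn ), where v = c1 v1 + . . . + cn vn ; this is well-defined since, by
question 5, the coefficients are unique. One then checks that T is linear, that T is
one-to-one (only v = 0 has all coordinates zero), and that T is onto (each
(c1 , . . . , cn ) ∈ Rn is the image of c1 v1 + . . . + cn vn ). Thus T is an isomorphism
between V and Rn .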
∗∗ 30. Let T : V → W be a one-to-one linear transformation. Prove that if dim(V ) =
dim(W ) (and both V and W are finite-dimensional), then T is an isomorphism.

Solution: Assume dim(V ) = dim(W ) and that T : V → W is one-to-one. Assume
further (in hopes of a contradiction) that T is not onto. Then there exists w ∈ W
such that w ̸∈ T (V ). Since T (V ) is a subspace of W and T (V ) ̸= W , it follows
that rank(T ) = dim(T (V )) < dim(W ). But then, by the
Dimension Theorem,

nullity(T ) = dim(V ) − rank(T )
= dim(V ) − dim(T (V ))
> dim(V ) − dim(W )
= dim(V ) − dim(V )
= 0.

Then since nullity(T ) > 0, ker(T ) ̸= {0}, and thus some nonzero vector in V maps
to the zero vector in W , as does the zero vector itself. Thus T is not 1:1, a
contradiction. Therefore T is onto, and is thus an isomorphism.

∗∗ 31. Let T1 : U → V and T2 : V → W be two linear transformations. Prove that if T2 ◦ T1
is one-to-one, then T1 must be one-to-one.

Solution: Suppose that T2 ◦ T1 : U → W is one-to-one. That is, for all u, v ∈ U , if
(T2 ◦ T1 )(u) = (T2 ◦ T1 )(v), then u = v.
It remains to show that T1 : U → V is one-to-one. Let u, v ∈ U and assume that
T1 (u) = T1 (v). Then certainly

T2 (T1 (u)) = T2 (T1 (v)),

and therefore,
(T2 ◦ T1 )(u) = (T2 ◦ T1 )(v).
Then, because (T2 ◦ T1 ) is one-to-one, it follows that u = v. Therefore T1 is one-to-
one.

Solution: Alternate Proof. Suppose that T2 ◦ T1 : U → W is one-to-one. Then
ker(T2 ◦ T1 ) = {0}. Let u ∈ ker(T1 ). Then T1 (u) = 0, and so

(T2 ◦ T1 )(u) = T2 (T1 (u))
= T2 (0) (def’n of u)
= 0. (property of linear transformations)

But then u ∈ ker(T2 ◦ T1 ), and so u = 0. Since u was an arbitrary element of
ker(T1 ), we have that ker(T1 ) = {0}. Thus T1 is 1:1 as well.

∗∗ 32. Let T : V → W be an onto linear transformation. Prove that if dim(V ) = dim(W ),
then T is an isomorphism.

Solution: Assume that dim(V ) = dim(W ). Then since T is onto, T (V ) = W , and
so rank(T ) = dim(W ) = dim(V ). But then by the dimension theorem,

nullity(T ) = dim(V ) − rank(T )
= dim(V ) − dim(V )
= 0.

Therefore ker(T ) = {0}, and therefore by question 26, T is 1:1. Therefore T is both
1:1 and onto, and is thus an isomorphism.

∗∗ 33. Let T : V → W be a one-to-one linear transformation. Prove that T is an isomorphism
between V and T (V ).

Solution: T is given to be 1:1. Viewing it as a linear transformation between V
and T (V ), it is also certainly onto by the definition of T (V ). Therefore it is 1:1 and
onto, and is thus an isomorphism.

∗∗ 34. Let E be a fixed 2 × 2 elementary matrix.
(a) Does the formula T (A) = EA define a one-to-one linear operator on M2,2 ? Prove
or disprove.
(b) Does the formula T (A) = EA define an onto linear operator on M2,2 ? Prove or
disprove.

Solution: 1-to-1: Let A, B ∈ M2,2 be such that T (A) = T (B). Then EA = EB.
Since E is an elementary matrix, and all elementary matrices are invertible, E −1
exists. Multiplying both sides by E −1 we get E −1 EA = E −1 EB, and thus A = B.
Therefore T is 1:1.
Onto: Let A ∈ M2,2 . Then B = E −1 A is a matrix in M2,2 such that T (B) = EB = A.
Thus T is onto.
Thus T is 1:1 and onto (and is thus an isomorphism).

∗∗ 35. Let B = {v1 , v2 , . . . , vn } be a basis for a vector space V , and let T : V → W be a
linear transformation. Show that if T (v1 ) = T (v2 ) = · · · = T (vn ) = 0W , then T is the
zero transformation (that is, for every v ∈ V , T (v) = 0W ).

Solution: Let v ∈ V . Then since B is a basis for V , there exist c1 , c2 , . . . , cn ∈ R
such that v = c1 v1 + . . . + cn vn .

Then,

T (v) = T (c1 v1 + . . . + cn vn )
= T (c1 v1 ) + . . . + T (cn vn )
= c1 T (v1 ) + . . . + cn T (vn )
= c1 0W + . . . + cn 0W
= 0W .

Thus T is the zero transformation.

∗∗∗ 36. Let T1 : V → W and T2 : V → W be two linear transformations and let B =
{v1 , . . . , vn } be a basis for V . Prove that if for all i, 1 ≤ i ≤ n, T1 (vi ) = T2 (vi ), then
T1 = T2 (that is, for all v ∈ V , T1 (v) = T2 (v)).

Solution: Let v ∈ V . Then since B is a basis for V , there exist c1 , c2 , . . . , cn ∈ R
such that v = c1 v1 + . . . + cn vn .
Then,

T1 (v) = T1 (c1 v1 + . . . + cn vn )
= T1 (c1 v1 ) + . . . + T1 (cn vn )
= c1 T1 (v1 ) + . . . + cn T1 (vn )
= c1 T2 (v1 ) + . . . + cn T2 (vn )
= T2 (c1 v1 ) + . . . + T2 (cn vn )
= T2 (c1 v1 + . . . + cn vn )
= T2 (v).

Thus T1 = T2 .

EIGENVALUE/VECTOR AND INNER PRODUCT SPACE PROOFS

37. Let A be an n × n matrix and let λ be an eigenvalue of A. Let E be the set of
all eigenvectors corresponding to λ, together with the zero vector. Prove that E is
a subspace of Rn .

Solution: Let E be the set of all eigenvectors corresponding to λ, together with the
zero vector. Let u, v ∈ E, k ∈ R.

• Closure under addition:

A(u + v) = Au + Av
= λu + λv
= λ(u + v).

Therefore u + v is also in E.

• Closure under scalar mult:

A(ku) = kA(u)
= k(λu)
= λ(ku).

Thus ku ∈ E as well.

Therefore by the subspace theorem, E is a subspace of Rn .

38. Show that for all u, v, w in an inner product space V ,

⟨u, v + w⟩ = ⟨u, v⟩ + ⟨u, w⟩
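
Solution (sketch): By symmetry and then additivity in the first argument,
⟨u, v + w⟩ = ⟨v + w, u⟩ = ⟨v, u⟩ + ⟨w, u⟩ = ⟨u, v⟩ + ⟨u, w⟩.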

39. Show that for all u, v in an inner product space V , and k ∈ R,

⟨u, kv⟩ = k⟨u, v⟩
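
Solution (sketch): By symmetry and homogeneity in the first argument,
⟨u, kv⟩ = ⟨kv, u⟩ = k⟨v, u⟩ = k⟨u, v⟩. Questions 40 and 41 follow the same pattern,
writing u − v = u + (−1)v and applying additivity together with homogeneity.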

40. Show that for all u, v, w in an inner product space V ,

⟨u − v, w⟩ = ⟨u, w⟩ − ⟨v, w⟩

41. Show that for all u, v, w in an inner product space V ,

⟨u, v − w⟩ = ⟨u, v⟩ − ⟨u, w⟩

42. Show that in any inner product space V , for all v ∈ V , ⟨v, 0⟩ = 0.

Solution: Let v ∈ V . Then

⟨v, 0⟩ = ⟨v, 0 · 0⟩
= 0⟨v, 0⟩
= 0.

43. Prove each of the following properties about inner product spaces: for all u, v, w in an
inner product space V , and all k ∈ R,
• ||u|| ≥ 0
• ||u|| = 0 if and only if u = 0
• ||ku|| = |k|||u||
• ||u + v|| ≤ ||u|| + ||v|| (Triangle Inequality)
• d(u, v) ≥ 0
• d(u, v) = 0 if and only if u = v
• d(u, v) = d(v, u)
• d(u, v) ≤ d(u, w) + d(w, v). (Triangle Inequality)
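
Solution (sketch of the first Triangle Inequality; the first three properties are short
computations from the positivity and homogeneity axioms): Using the Cauchy–Schwarz
inequality, ⟨u, v⟩ ≤ |⟨u, v⟩| ≤ ||u|| ||v||, so
||u + v||2 = ⟨u + v, u + v⟩ = ||u||2 + 2⟨u, v⟩ + ||v||2 ≤ ||u||2 + 2||u|| ||v|| + ||v||2 = (||u|| + ||v||)2 ,
and taking square roots gives ||u + v|| ≤ ||u|| + ||v||. The distance properties then
follow from the norm properties, since d(u, v) = ||u − v||.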
44. Prove that if u and v are orthogonal, then so are (1/||u||) u and (1/||v||) v.
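
Solution (sketch, assuming u, v ̸= 0 so that the scalings are defined): By homogeneity
(and question 39),
⟨(1/||u||) u, (1/||v||) v⟩ = (1/(||u|| ||v||)) ⟨u, v⟩ = 0,
so the two (unit) vectors are orthogonal.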

45. Let A be an n × n matrix. Prove that A and AT have the same eigenvalues.

Solution:

|λI − A| = |(λI − A)T |
= |(λI)T − AT |
= |λI − AT |.

Thus A and AT have the same characteristic polynomials, and so must have the same
eigenvalues.

∗∗ 46. Let A be an n×n matrix. Prove that A is invertible if and only if 0 is not an eigenvalue
of A.

Solution: I will solve this problem by proving the contrapositive of each direction:
that A is not invertible if and only if 0 is an eigenvalue of A.
(⇒): Assume A is not invertible. Then Ax = 0 has infinitely many solutions, and
thus at least one non-zero solution, say x0 . Then

Ax0 = 0 = 0x0

and thus 0 is an eigenvalue of A with eigenvector x0 .
(⇐): Assume 0 is an eigenvalue of A. Then det(A − 0I) = det(A) = 0. Thus A is
not invertible.

∗∗ 47. Prove that if B = C −1 AC, then B and A have the same eigenvalues (HINT: Look at
the characteristic polynomials of B and A).

Solution:

|λI − B| = |λI − C −1 AC|
= |λC −1 C − C −1 AC|
= |C −1 (λC − AC)|
= |C −1 (λI − A)C|
= |C −1 | |λI − A| |C|
= |λI − A| |C −1 | |C|
= |λI − A| |C −1 C|
= |λI − A| |I|
= |λI − A|.

Thus A and B have the same characteristic polynomials, and so must have the same
eigenvalues.

∗∗ 48. Let v be a nonzero vector in an inner product space V . Let W be the set of all vectors
in V that are orthogonal to v. Prove that W is a subspace of V .

Solution: Let W = {w ∈ V : ⟨w, v⟩ = 0}. Certainly W is non-empty since the zero
vector is orthogonal to every vector in V .
Let a, b ∈ W , k ∈ R.
A1. We need to check if a + b is orthogonal to v.
⟨a + b, v⟩ = ⟨a, v⟩ + ⟨b, v⟩
= 0 + 0 = 0.
Thus a + b ∈ W .
M1. We need to check if ka ∈ W .
⟨ka, v⟩ = k⟨a, v⟩
= k(0) = 0.
Thus ka ∈ W .
Therefore by the subspace theorem, W is a subspace of V .

∗∗ 49. Prove that for any two vectors u and v in an inner product space, if
||u|| = ||v||,
then u + v is orthogonal to u − v.

Solution: Assume that ||u|| = ||v||. Then,

⟨u + v, u − v⟩ = ⟨u, u − v⟩ + ⟨v, u − v⟩
= ⟨u − v, u⟩ + ⟨u − v, v⟩
= ⟨u, u⟩ − ⟨v, u⟩ + ⟨u, v⟩ − ⟨v, v⟩
= ||u||2 − ⟨u, v⟩ + ⟨u, v⟩ − ||v||2
= ||u||2 − ||v||2
= ||u||2 − ||u||2
= 0.

Thus u + v is orthogonal to u − v.

∗∗ 50. Let B = {v1 , v2 , . . . , vr } be a basis for an inner product space V . Show that the zero
vector is the only vector in V that is orthogonal to all of the basis vectors.

Solution: We have a property (question 42, together with symmetry) that for all
v ∈ V (and thus for all v ∈ B), ⟨0, v⟩ = 0. This question is the converse. Let
w ∈ V be a vector such that for all v ∈ B, ⟨w, v⟩ = 0. We must prove that w
is the zero vector.
Since B is a basis for V , we know that there exist k1 , . . . , kr ∈ R such that w =
k1 v1 + k2 v2 + · · · + kr vr .
Then look at ⟨w, w⟩:

⟨w, w⟩ = ⟨w, k1 v1 + k2 v2 + · · · + kr vr ⟩
= ⟨w, k1 v1 ⟩ + ⟨w, k2 v2 ⟩ + · · · + ⟨w, kr vr ⟩
= k1 ⟨w, v1 ⟩ + k2 ⟨w, v2 ⟩ + · · · + kr ⟨w, vr ⟩
= k1 (0) + k2 (0) + · · · + kr (0)
= 0.

Thus since ⟨w, w⟩ = 0, by positivity, we know that w = 0.

∗∗∗ 51. Let S = {v1 , v2 , . . . , vn } be an orthonormal basis for an inner product space V ,
and let u be any vector in V . Prove that
u = ⟨u, v1 ⟩v1 + ⟨u, v2 ⟩v2 + · · · + ⟨u, vn ⟩vn .

Solution: Let u ∈ V . Since S is a basis, there exist k1 , . . . , kn such that u =
k1 v1 + · · · + kn vn . Then

⟨u, vi ⟩ = ⟨k1 v1 + · · · + kn vn , vi ⟩
= ⟨k1 v1 , vi ⟩ + · · · + ⟨kn vn , vi ⟩
= k1 ⟨v1 , vi ⟩ + · · · + kn ⟨vn , vi ⟩
= ki ⟨vi , vi ⟩ (since S is orthogonal)
= ki ||vi ||2
= ki (since S is orthonormal).

Thus u = ⟨u, v1 ⟩v1 + · · · + ⟨u, vn ⟩vn .

∗∗∗ 52. An n × n matrix A is said to be nilpotent if for some k ∈ Z+ , A^k is a zero matrix.
Prove that if A is nilpotent, then 0 is the only eigenvalue of A.

Solution: Let A be a nilpotent matrix and let k ∈ Z+ be such that A^k is a zero
matrix. Let λ be an eigenvalue of A with eigenvector x. Then

Ax = λx
A^2 x = A(λx) = λ(Ax) = λ^2 x
⋮
A^k x = λ^k x.

But A^k is a zero matrix, and so the left hand side is the zero vector. Thus λ^k x is
the zero vector. However, x being an eigenvector forces x ̸= 0, and thus λ^k = 0, and
so λ = 0. Thus 0 is the only eigenvalue of A.

∗∗∗ 53. Let W be any subspace of an inner product space V , and let B = {b1 , . . . , bn } be
an orthonormal basis for W . Let v ∈ V . Let the vector v0 be defined as

v0 = ⟨v, b1 ⟩b1 + · · · + ⟨v, bn ⟩bn = ∑_{i=1}^{n} ⟨v, bi ⟩bi .

Certainly v0 ∈ W . Prove that v − v0 is orthogonal to every vector in W .

Solution: What we need to check is the inner product of v − v0 with every vector
in W and verify it is 0.

Let w ∈ W . Then since B is an orthonormal basis for W , by question 51, w = ∑_{i=1}^{n} ⟨w, bi ⟩bi . Then

⟨v − v0 , w⟩ = ⟨v, w⟩ − ⟨v0 , w⟩,

and

⟨v0 , w⟩ = ⟨∑_{i=1}^{n} ⟨v, bi ⟩bi , w⟩ (by def’n of v0 )
= ∑_{i=1}^{n} ⟨⟨v, bi ⟩bi , w⟩ (by additivity)
= ∑_{i=1}^{n} ⟨v, bi ⟩⟨bi , w⟩ (by homogeneity)
= ∑_{i=1}^{n} ⟨v, bi ⟩⟨bi , ∑_{j=1}^{n} ⟨w, bj ⟩bj ⟩ (substituting for w)
= ∑_{i=1}^{n} ⟨v, bi ⟩ ∑_{j=1}^{n} ⟨bi , ⟨w, bj ⟩bj ⟩ (by additivity)
= ∑_{i=1}^{n} ⟨v, bi ⟩ ∑_{j=1}^{n} ⟨w, bj ⟩⟨bi , bj ⟩ (by homogeneity)
= ∑_{i=1}^{n} ⟨v, bi ⟩⟨w, bi ⟩⟨bi , bi ⟩ (since i ̸= j =⇒ ⟨bi , bj ⟩ = 0)
= ∑_{i=1}^{n} ⟨v, bi ⟩⟨w, bi ⟩||bi ||2
= ∑_{i=1}^{n} ⟨v, bi ⟩⟨w, bi ⟩ (since each bi is a unit vector)
= ∑_{i=1}^{n} ⟨v, ⟨w, bi ⟩bi ⟩ (by homogeneity)
= ⟨v, ∑_{i=1}^{n} ⟨w, bi ⟩bi ⟩ (by additivity)
= ⟨v, w⟩.

Therefore, for any w ∈ W ,

⟨v − v0 , w⟩ = ⟨v, w⟩ − ⟨v0 , w⟩
= ⟨v, w⟩ − ⟨v, w⟩
= 0.

Thus v − v0 is orthogonal to everything in W .
