
Linear Algebra (Solved Problems)


Linear Algebra. Degrees in Engineering.

Solved problems.

Chapters 1 & 2.
Questions

Q1.- Is the following statement true or false? Justify your answer by citing appropriate facts or
theorems if true or, if false, explain why or give a counterexample:

If a set is linearly dependent, then the set contains more vectors than there are entries
in the vectors.

Answer: It is false. The set
\[
S = \left\{ \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 2 \\ 0 \\ 0 \end{pmatrix} \right\}
\]
is dependent (the second vector is two times the first one) and has only two vectors of R³.

Q2.- If A, B and C are n × n invertible matrices, does the equation C⁻¹(A + X)B⁻¹ = Iₙ have
a solution? (Iₙ is the n × n identity matrix.) If so, find it.
Answer: Multiplying by C on the left and by B on the right we get A + X = CB, from where
we find X = CB − A.
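As a quick numerical sanity check of this identity (a minimal NumPy sketch; the random matrices below are hypothetical placeholders, assumed invertible):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + n * np.eye(n)   # diagonal shift: invertible with high probability
C = rng.standard_normal((n, n)) + n * np.eye(n)

X = C @ B - A                                      # candidate solution
lhs = np.linalg.inv(C) @ (A + X) @ np.linalg.inv(B)
print(np.allclose(lhs, np.eye(n)))                 # True
```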

Q3.- Is the following statement true or false? Justify your answer by citing appropriate facts or
theorems if true or, if false, explain why or give a counterexample:

If a set contains fewer vectors than there are entries in the vectors, then the set is
linearly independent.

Answer: It is false. The set
\[
S = \left\{ \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 2 \\ 0 \\ 0 \end{pmatrix} \right\}
\]
has two vectors of R³ and is dependent (the second vector is two times the first one).

Q4.- Suppose A and B are n × n matrices, B is invertible, and AB is invertible. Show that A
is invertible.
Answer: Let us call C = AB. By hypothesis, B is invertible, and we can multiply both sides
of the previous equation on the right by B⁻¹. We thus obtain CB⁻¹ = A. Therefore, A is the product of two
invertible matrices and hence it is itself invertible.

Q5.- Let B, C and D be matrices. If BC = BD, then C = D.

Answer: False. If B is not invertible, it cannot be cancelled from the equation. For instance, if B is the zero matrix, then BC = BD for any C and D.

Q6.- If A is a 3 × 3 matrix and the equation Ax = (1 0 0)T has a unique solution, then A is
invertible.

Answer: True. For the system to have a unique solution, A must have 3 pivots and, being 3 × 3,
it is therefore invertible.

Problems

P1.- Let T : R³ → R² be given by
\[
T\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} x_1 - 5x_2 + 4x_3 \\ x_2 - 6x_3 \end{pmatrix}.
\]
Show that T is a linear transformation by finding a matrix that implements it and discuss
whether T is onto, one-to-one, both or neither.
Answer: To find the matrix one has to compute T(e1), T(e2) and T(e3), i.e., the images
of the columns of the identity, which are straightforwardly obtained as
\[
T(e_1) = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad T(e_2) = \begin{pmatrix} -5 \\ 1 \end{pmatrix}, \quad T(e_3) = \begin{pmatrix} 4 \\ -6 \end{pmatrix},
\]
and hence the matrix is given by
\[
A = \begin{pmatrix} 1 & -5 & 4 \\ 0 & 1 & -6 \end{pmatrix}.
\]

The matrix obviously has two pivots, so the system Ax = b is always consistent and
the transformation is onto (its range is all of R²). It cannot be one-to-one because infinitely many
vectors are mapped to the same vector in R², as the system always has a free variable.
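The same conclusion can be read off the rank of A; a short NumPy sketch (the criteria in the comments are the pivot-count arguments used above):

```python
import numpy as np

# Standard matrix of T, with columns T(e1), T(e2), T(e3).
A = np.array([[1., -5., 4.],
              [0.,  1., -6.]])

rank = np.linalg.matrix_rank(A)
m, n = A.shape
print("onto:", rank == m)        # True: a pivot in every row, so Ax = b is always consistent
print("one-to-one:", rank == n)  # False: fewer pivots than columns, so free variables remain
```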

P2.- Let A be given below. Compute A⁻¹.
\[
A = \begin{pmatrix} 1 & 0 & -2 \\ -3 & 1 & 4 \\ 2 & -3 & 4 \end{pmatrix}
\]
By row operations one finds
\[
A^{-1} = \begin{pmatrix} 8 & 3 & 1 \\ 10 & 4 & 1 \\ 7/2 & 3/2 & 1/2 \end{pmatrix}.
\]
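The result can be verified numerically; a minimal NumPy sketch:

```python
import numpy as np

A = np.array([[ 1.,  0., -2.],
              [-3.,  1.,  4.],
              [ 2., -3.,  4.]])

A_inv = np.linalg.inv(A)
print(A_inv)                               # [[8, 3, 1], [10, 4, 1], [3.5, 1.5, 0.5]]
print(np.allclose(A @ A_inv, np.eye(3)))   # True
```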

P3.- Let T : R³ → R³, with T(x) = Ax, and
\[
A = \begin{pmatrix} 1 & -4 & 2 \\ 0 & 3 & 5 \\ -2 & 8 & -4 \end{pmatrix}.
\]
Find Range(T) and discuss whether T is onto, one-to-one, both or neither.
Answer: To find Range(T) is to find the set of vectors b ∈ R³ such that the equation Ax = b
has a solution. The augmented matrix of the system is
\[
\begin{pmatrix} 1 & -4 & 2 & b_1 \\ 0 & 3 & 5 & b_2 \\ -2 & 8 & -4 & b_3 \end{pmatrix} \sim \begin{pmatrix} 1 & -4 & 2 & b_1 \\ 0 & 3 & 5 & b_2 \\ 0 & 0 & 0 & b_3 + 2b_1 \end{pmatrix}.
\]
Therefore, consistency of the system requires b₃ + 2b₁ = 0 ⇒ b₁ = −b₃/2, with b₂ and b₃ free, and
\[
\mathrm{Range}(T) = \mathrm{Span}\left\{ \begin{pmatrix} -1 \\ 0 \\ 2 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} \right\}.
\]
T is neither onto (its range is not all of R³) nor one-to-one (the system has an infinite number
of solutions, so infinitely many vectors share each image in the range).
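If one only wants a basis of Range(T) = Col(A) (not necessarily the same two vectors written above), SymPy can produce one directly; a sketch:

```python
import sympy as sp

A = sp.Matrix([[ 1, -4,  2],
               [ 0,  3,  5],
               [-2,  8, -4]])

print(A.columnspace())  # two pivot columns of A: another basis of the same plane Range(T)
print(A.nullspace())    # one vector: Ax = 0 has nontrivial solutions, so T is not one-to-one
```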

P4.- Let A be given below. Discuss without any calculation why it should be invertible. Compute A⁻¹.
\[
A = \begin{pmatrix} 5 & 0 & 0 \\ -3 & 1 & 0 \\ 8 & 5 & -1 \end{pmatrix}
\]
The matrix is lower triangular with nonzero diagonal entries, so its columns clearly form an independent set and the matrix is invertible. By row operations
one finds
\[
A^{-1} = \begin{pmatrix} 1/5 & 0 & 0 \\ 3/5 & 1 & 0 \\ 23/5 & 5 & -1 \end{pmatrix}.
\]
P5.- Let A be given below. Compute A⁻¹.
\[
A = \begin{pmatrix} 1 & 2 & 3 \\ 1 & 1 & 2 \\ 0 & 1 & 2 \end{pmatrix}.
\]
Answer: The matrix is invertible because its columns form an independent set. By row
operations one finds
\[
A^{-1} = \begin{pmatrix} 0 & 1 & -1 \\ 2 & -2 & -1 \\ -1 & 1 & 1 \end{pmatrix}.
\]

P6.- Let A be given below. Compute A⁻¹.
\[
A = \begin{pmatrix} 0 & 1 & 2 \\ 1 & 0 & 3 \\ 4 & -3 & 8 \end{pmatrix}.
\]
Answer: The matrix is invertible because its columns form an independent set. By row
operations one finds
\[
A^{-1} = \begin{pmatrix} -9/2 & 7 & -3/2 \\ -2 & 4 & -1 \\ 3/2 & -2 & 1/2 \end{pmatrix}.
\]

Linear Algebra. Degrees in Engineering.
Solved problems.

Chapters 3 & 4.
Questions

Q1.- Let V and W be vector spaces, and let T : V → W be a linear transformation. Given a
subspace Z ⊂ W , let U be the set of all x ∈ V such that T (x) ∈ Z. Show that U is a subspace
of V .
Answer:
We need to show that ∀α, β ∈ R and ∀x, y ∈ U, αx + βy ∈ U. If x, y ∈ U, the definition
of U implies that T(x), T(y) ∈ Z. Therefore, using linearity of T and the fact that Z is a
subspace, we know that
αT (x) + βT (y) = T (αx + βy) ∈ Z
and hence αx + βy ∈ U as required.

Q2.- Let A be an invertible n × n matrix. Show that det(A−1 )=1/det(A).


Answer:
We know that A · A⁻¹ = Iₙ, Iₙ being the n × n identity matrix. We can then apply the
property of the determinant of a product of matrices to find
\[
\det(A \cdot A^{-1}) = \det(I_n) = 1 \;\Rightarrow\; \det(A)\det(A^{-1}) = 1 \;\Rightarrow\; \det(A^{-1}) = \frac{1}{\det(A)}.
\]

Q3.- Let V and W be vector spaces, and let T : V → W be a linear transformation. Given a
subspace U ⊂ V , let T (U ) denote the set of all images of the form T (x), where x ∈ U . Show
that T (U ) is a subspace of W .
Answer:
From the definition we see that vectors of T (U ) are of the form T (x). Therefore, for T (U )
to be a subspace, it is needed that ∀α, β ∈ R and ∀x, y ∈ U , αT (x) + βT (y) ∈ T (U ). Using
linearity of T :
αT (x) + βT (y) = T (αx + βy)
and as U is a subspace, αx + βy ∈ U , which implies that T (αx + βy) ∈ T (U ) as required.

Q4.- Is the set of points inside and on the unit circle in the xy-plane,
\[
H = \left\{ \begin{pmatrix} x \\ y \end{pmatrix} : x^2 + y^2 \le 1 \right\},
\]
a subspace of R²? Justify your answer.


Answer:
No, because neither the addition nor the product by scalars is closed in H. For instance,
\[
\begin{pmatrix} 1 \\ 0 \end{pmatrix} + \begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \end{pmatrix} \notin H,
\]
whereas the two vectors on the left-hand side do belong to H.

Q5.- If {u, v, w} are linearly independent vectors, then u, v, w ∉ R².

Answer: True. If the vectors belonged to R², they would have two entries each, and a set with
more vectors than entries per vector is necessarily linearly dependent. Therefore, if
they are independent, they must have at least three entries.

Q6.- Let A be a m × n matrix, row equivalent to an echelon form with k nonzero rows. Then
the dimension of the space of solutions of A is m − k.

Answer: False. m is the number of equations and n the number of variables. If there are k nonzero
rows, there are k pivots and hence n − k free variables, so the dimension of the space of
solutions of Ax = 0 is n − k.

Q7.- If A is an m × n matrix with m pivot columns, then the linear transformation x 7→ Ax is


a one-to-one mapping.
Answer: False. A corresponds to a system of m equations and n unknowns, with n ≥ m (otherwise
A could not have m pivots). Therefore, whenever n > m, for any right-hand side b (i.e., any image vector) there are
infinitely many solutions, with n − m free parameters. Only in the special case m = n
would the mapping be one-to-one.

Q8.- If B = {b1 , . . . , bn } and C = {c1 , . . . , cn } are bases for a vector space V , then the jth
column of the change of coordinates matrix PC←B is the coordinate vector of cj in the basis B.

Answer: False. The jth column of the change of coordinates matrix PC←B is the coordinate
vector of bj in the basis C.

Problems

P1.- Let S = {v1, v2, v3} and T = {w1, w2, w3} be bases for R³, where
\[
v_1 = \begin{pmatrix} 2 \\ 0 \\ 1 \end{pmatrix}, \;
v_2 = \begin{pmatrix} 1 \\ 2 \\ 0 \end{pmatrix}, \;
v_3 = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}, \;
w_1 = \begin{pmatrix} 6 \\ 3 \\ 3 \end{pmatrix}, \;
w_2 = \begin{pmatrix} 4 \\ -1 \\ 3 \end{pmatrix}, \;
w_3 = \begin{pmatrix} 5 \\ 5 \\ 2 \end{pmatrix}.
\]

Compute the change of basis matrix from T to S.


Answer: To compute the matrix we need the coordinates of the vectors of basis T in basis S.
This means we need to solve three linear systems:
\[
\alpha_{1i} v_1 + \alpha_{2i} v_2 + \alpha_{3i} v_3 = w_i, \qquad i = 1, 2, 3.
\]
We can solve the three systems at the same time by writing an augmented matrix with the three
wi vectors as right-hand sides, as follows:
\[
\begin{pmatrix} 2 & 1 & 1 & 6 & 4 & 5 \\ 0 & 2 & 1 & 3 & -1 & 5 \\ 1 & 0 & 1 & 3 & 3 & 2 \end{pmatrix}.
\]
This matrix is reduced by row operations to its reduced echelon form, which turns out to be
\[
\begin{pmatrix} 1 & 0 & 0 & 2 & 2 & 1 \\ 0 & 1 & 0 & 1 & -1 & 2 \\ 0 & 0 & 1 & 1 & 1 & 1 \end{pmatrix},
\]
and therefore the change of basis matrix is
\[
P_{S\leftarrow T} = \begin{pmatrix} 2 & 2 & 1 \\ 1 & -1 & 2 \\ 1 & 1 & 1 \end{pmatrix}.
\]
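The three systems can also be solved in one call by treating the wi as the columns of a right-hand-side matrix; a NumPy sketch:

```python
import numpy as np

S = np.array([[2., 1., 1.],
              [0., 2., 1.],
              [1., 0., 1.]])   # columns v1, v2, v3
W = np.array([[6., 4., 5.],
              [3., -1., 5.],
              [3., 3., 2.]])   # columns w1, w2, w3

# Column j of the result solves S p = w_j, i.e. it is [w_j]_S.
P = np.linalg.solve(S, W)
print(np.round(P, 10))          # [[2, 2, 1], [1, -1, 2], [1, 1, 1]]
```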

P2.- Let
\[
H = \left\{ \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} : x_1 - x_2 + x_3 = 0 \right\}
\]
be a subspace of R³. Find a basis of H and subsequently write the vector
\[
v = \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}
\]
in coordinates in that basis.


Answer: H is the null space of the matrix A = (1 −1 1), which is already in reduced echelon form,
and therefore has one pivot and two free variables. The solution of the homogeneous equation
is x₁ = x₂ − x₃, which leads to the following basis for Nul(A), {v₁, v₂}, with
\[
v_1 = \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix}.
\]
To find the coordinates of v we have to solve v = α₁v₁ + α₂v₂, or equivalently we row reduce the
augmented matrix
\[
\begin{pmatrix} 1 & -1 & 1 \\ 1 & 0 & 2 \\ 0 & 1 & 1 \end{pmatrix} \sim \begin{pmatrix} 1 & -1 & 1 \\ 0 & 1 & 1 \\ 0 & 1 & 1 \end{pmatrix} \sim \begin{pmatrix} 1 & -1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{pmatrix} \sim \begin{pmatrix} 1 & 0 & 2 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{pmatrix} \;\Rightarrow\; \alpha_1 = 2,\ \alpha_2 = 1.
\]
The coordinates of v in this basis are then
\[
[v]_B = \begin{pmatrix} 2 \\ 1 \end{pmatrix}.
\]
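A numerical cross-check of the basis and the coordinates (a sketch; least squares returns the exact coordinates here because v lies in H = Col(B)):

```python
import numpy as np

B = np.array([[1., -1.],
              [1.,  0.],
              [0.,  1.]])         # basis of H found above, as columns
v = np.array([1., 2., 1.])

coords = np.linalg.lstsq(B, v, rcond=None)[0]
print(coords)                      # [2., 1.]
print(np.allclose(B @ coords, v))  # True: v really belongs to H
```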

P3.- Compute the determinant of the matrix
\[
A = \begin{pmatrix} 2 & 5 & 4 & 1 \\ 4 & 7 & 6 & 2 \\ 6 & -2 & -4 & 0 \\ -6 & 7 & 7 & 0 \end{pmatrix}.
\]
With that result and without any further computations, discuss how many solutions the
system Ax = b has and when it is consistent.
Answer:
\[
\det(A) = \begin{vmatrix} 2 & 5 & 4 & 1 \\ 4 & 7 & 6 & 2 \\ 6 & -2 & -4 & 0 \\ -6 & 7 & 7 & 0 \end{vmatrix}
= \begin{vmatrix} 2 & 5 & 4 & 1 \\ 0 & -3 & -2 & 0 \\ 6 & -2 & -4 & 0 \\ -6 & 7 & 7 & 0 \end{vmatrix}
= -\begin{vmatrix} 0 & -3 & -2 \\ 6 & -2 & -4 \\ -6 & 7 & 7 \end{vmatrix}
= -\begin{vmatrix} 0 & -3 & -2 \\ 6 & -2 & -4 \\ 0 & 5 & 3 \end{vmatrix}
= 6\begin{vmatrix} -3 & -2 \\ 5 & 3 \end{vmatrix} = 6(-9+10) = 6,
\]
where we first did R₂ → R₂ − 2R₁ and expanded along the fourth column, then R₃ → R₃ + R₂ and expanded along the first column.
As det(A) is nonzero, A is invertible, and therefore the system Ax = b has a unique solution
for all b.
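A quick numerical check of the determinant (NumPy sketch):

```python
import numpy as np

A = np.array([[ 2.,  5.,  4., 1.],
              [ 4.,  7.,  6., 2.],
              [ 6., -2., -4., 0.],
              [-6.,  7.,  7., 0.]])

print(np.linalg.det(A))   # 6.0 up to rounding, so A is invertible
```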

P4.- Assume that the matrix A is row equivalent to B, where A and B are
\[
A = \begin{pmatrix} 1 & -4 & 9 & -7 \\ -1 & 2 & -4 & 1 \\ 5 & -6 & 10 & 7 \end{pmatrix}, \qquad B = \begin{pmatrix} 1 & 0 & -1 & 5 \\ 0 & -2 & 5 & -6 \\ 0 & 0 & 0 & 0 \end{pmatrix}.
\]
Without calculations, list rank(A) and dim(Nul(A)). Then find bases for Col(A), Row(A) and
Nul(A).
Answer:
Inspection of B allows to conclude that A has two pivot columns and Ax = 0 has two free
variables, and hence rank(A)=2 and dim(Nul(A))=2.
The basis for Col(A) is given by the first two columns of A, the pivot columns, so
\[
\left\{ \begin{pmatrix} 1 \\ -1 \\ 5 \end{pmatrix}, \begin{pmatrix} -4 \\ 2 \\ -6 \end{pmatrix} \right\}
\]

is the requested basis. As for the row space, we can directly use the nonzero rows in the echelon
form B, finding the basis
\[
\left\{ \begin{pmatrix} 1 \\ 0 \\ -1 \\ 5 \end{pmatrix}, \begin{pmatrix} 0 \\ -2 \\ 5 \\ -6 \end{pmatrix} \right\}.
\]

Finally, for Nul(A), we need the reduced echelon form, which can be obtained from B:
\[
A \sim B \sim \begin{pmatrix} 1 & 0 & -1 & 5 \\ 0 & 1 & -5/2 & 3 \\ 0 & 0 & 0 & 0 \end{pmatrix};
\]

x1 − x3 + 5x4 = 0
x2 − (5/2)x3 + 3x4 = 0

and from here we find the solution
\[
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix} = \begin{pmatrix} x_3 - 5x_4 \\ (5/2)x_3 - 3x_4 \\ x_3 \\ x_4 \end{pmatrix} = x_3 \begin{pmatrix} 1 \\ 5/2 \\ 1 \\ 0 \end{pmatrix} + x_4 \begin{pmatrix} -5 \\ -3 \\ 0 \\ 1 \end{pmatrix},
\]

the last two vectors being the basis for Nul(A).
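The rank, the reduced echelon form and bases for Col(A) and Nul(A) can be cross-checked symbolically; a SymPy sketch (the bases it returns may differ from the hand computation by scaling):

```python
import sympy as sp

A = sp.Matrix([[ 1, -4,  9, -7],
               [-1,  2, -4,  1],
               [ 5, -6, 10,  7]])

print(A.rank())         # 2
print(A.rref()[0])      # the reduced echelon form used above
print(A.columnspace())  # the two pivot columns of A, a basis of Col(A)
print(A.nullspace())    # two vectors spanning Nul(A)
```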

P5.- Let T : R³ → R³, with T(x) = Ax, and
\[
A = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 2 & 5 \\ 2 & 2 & 7 \end{pmatrix}.
\]
Find Range(T) and discuss whether T is onto, one-to-one, both or neither.
Answer: To find Range(T ) is to find the set of vectors b ∈ R3 such that the equation Ax = b
has a solution. The augmented matrix of the system is
\[
\begin{pmatrix} 1 & 0 & 1 & b_1 \\ 0 & 2 & 5 & b_2 \\ 2 & 2 & 7 & b_3 \end{pmatrix} \sim \begin{pmatrix} 1 & 0 & 1 & b_1 \\ 0 & 2 & 5 & b_2 \\ 0 & 0 & 0 & b_3 - 2b_1 - b_2 \end{pmatrix}.
\]
Therefore, consistency of the system requires b₃ − 2b₁ − b₂ = 0 ⇒ b₁ = (b₃ − b₂)/2, with b₂ and b₃
free, and
\[
\mathrm{Range}(T) = \mathrm{Span}\left\{ \begin{pmatrix} 1 \\ 0 \\ 2 \end{pmatrix}, \begin{pmatrix} -1 \\ 2 \\ 0 \end{pmatrix} \right\}.
\]

T is neither onto (its range is not all of R³) nor one-to-one (the system has an infinite number
of solutions, so infinitely many vectors share each image b).

P6.- Assume that the matrix A is row equivalent to B, where A and B are
\[
A = \begin{pmatrix} 1 & 2 & 3 & 1 \\ 1 & 3 & 2 & 1 \\ 3 & 8 & 7 & 3 \end{pmatrix}, \qquad B = \begin{pmatrix} 1 & 0 & 5 & 1 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}.
\]

Without calculations, list rank(A) and dim(Nul(A)). Then find bases for Col(A), Row(A) and
Nul(A).
Answer:
Inspection of B allows to conclude that A has two pivot columns and Ax = 0 has two free
variables, and hence rank(A)=2 and dim(Nul(A))=2.
The basis for Col(A) is given by the first two columns of A, the pivot columns, so
\[
\left\{ \begin{pmatrix} 1 \\ 1 \\ 3 \end{pmatrix}, \begin{pmatrix} 2 \\ 3 \\ 8 \end{pmatrix} \right\}
\]

is the requested basis. As for the row space, we can directly use the nonzero rows in the echelon
form B (using the pivot rows of A would be equally correct), finding the basis
\[
\left\{ \begin{pmatrix} 1 \\ 0 \\ 5 \\ 1 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ -1 \\ 0 \end{pmatrix} \right\}.
\]

Finally, for Nul(A), we need the reduced echelon form, but B is already in reduced echelon form
and therefore we can read the solution directly:
\[
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix} = \begin{pmatrix} -5x_3 - x_4 \\ x_3 \\ x_3 \\ x_4 \end{pmatrix} = x_3 \begin{pmatrix} -5 \\ 1 \\ 1 \\ 0 \end{pmatrix} + x_4 \begin{pmatrix} -1 \\ 0 \\ 0 \\ 1 \end{pmatrix},
\]

the last two vectors being the basis for Nul(A).

P7.- Let S = {v1, v2} and T = {w1, w2} be bases for R², where
\[
v_1 = \begin{pmatrix} 1 \\ 2 \end{pmatrix}, \quad
v_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \quad
w_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \quad
w_2 = \begin{pmatrix} 2 \\ 3 \end{pmatrix}.
\]

Compute the change of basis matrix from T to S.


Answer: To compute the matrix we need the coordinates of the vectors of basis T in basis S.
This means we need to solve two linear systems:
\[
\alpha_{1i} v_1 + \alpha_{2i} v_2 = w_i, \qquad i = 1, 2.
\]
We can solve the two systems at the same time by writing an augmented matrix with the two
wi vectors as right-hand sides, as follows:
\[
\begin{pmatrix} 1 & 0 & 1 & 2 \\ 2 & 1 & 1 & 3 \end{pmatrix}.
\]
This matrix is reduced by row operations to its reduced echelon form, which turns out to be
\[
\begin{pmatrix} 1 & 0 & 1 & 2 \\ 0 & 1 & -1 & -1 \end{pmatrix},
\]
and therefore the change of basis matrix is
\[
P_{S\leftarrow T} = \begin{pmatrix} 1 & 2 \\ -1 & -1 \end{pmatrix}.
\]

P8.- Let T : R³ → R³, with T(x) = Ax, and
\[
A = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 2 & 5 \\ 0 & 0 & 3 \end{pmatrix}.
\]
Find Range(T) and discuss whether T is onto, one-to-one, both or neither.
Answer: To find Range(T) is to find the set of vectors b ∈ R³ such that the equation Ax = b
has a solution. The matrix is already in echelon form and obviously has three pivots, hence the
system has a unique solution for every b ∈ R³. Hence Range(T) = R³.
T is both onto (its range is all of R³) and one-to-one (the system has a unique solution and
therefore a unique vector for every image b).

P9.- Assume that the matrix A is row equivalent to B, where A and B are
\[
A = \begin{pmatrix} -3 & 9 & -2 & -7 \\ 2 & -6 & 4 & 8 \\ 3 & -9 & -2 & 2 \end{pmatrix}, \qquad B = \begin{pmatrix} 1 & -3 & 6 & 9 \\ 0 & 0 & 4 & 5 \\ 0 & 0 & 0 & 0 \end{pmatrix}.
\]

Without calculations, list rank(A) and dim(Nul(A)). Then find bases for Col(A), Row(A) and
Nul(A).
Answer:
Inspection of B allows to conclude that A has two pivot columns and Ax = 0 has two free
variables, and hence rank(A)=2 and dim(Nul(A))=2.
The basis for Col(A) is given by the first and third columns of A, the pivot columns, so
\[
\left\{ \begin{pmatrix} -3 \\ 2 \\ 3 \end{pmatrix}, \begin{pmatrix} -2 \\ 4 \\ -2 \end{pmatrix} \right\}
\]

is the requested basis. As for the row space, we can directly use the nonzero rows in the echelon
form B (using the pivot rows of A would be equally correct), finding the basis
\[
\left\{ \begin{pmatrix} 1 \\ -3 \\ 6 \\ 9 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 4 \\ 5 \end{pmatrix} \right\}.
\]

Finally, for Nul(A), we need the reduced echelon form, which is
\[
\begin{pmatrix} 1 & -3 & 0 & 3/2 \\ 0 & 0 & 1 & 5/4 \\ 0 & 0 & 0 & 0 \end{pmatrix},
\]

and therefore the solution is
\[
\begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix} = \begin{pmatrix} 3x_2 - (3/2)x_4 \\ x_2 \\ -(5/4)x_4 \\ x_4 \end{pmatrix} = x_2 \begin{pmatrix} 3 \\ 1 \\ 0 \\ 0 \end{pmatrix} + x_4 \begin{pmatrix} -3/2 \\ 0 \\ -5/4 \\ 1 \end{pmatrix},
\]

the last two vectors being the basis for Nul(A).

P10.- Let S = {v1, v2} and T = {w1, w2} be bases for R², where
\[
v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad
v_2 = \begin{pmatrix} 2 \\ 1 \end{pmatrix}, \quad
w_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \quad
w_2 = \begin{pmatrix} 1 \\ 2 \end{pmatrix}.
\]

Compute the change of basis matrix from T to S.


Answer: To compute the matrix we need the coordinates of the vectors of basis T in basis S.
This means we need to solve two linear systems:
\[
\alpha_{1i} v_1 + \alpha_{2i} v_2 = w_i, \qquad i = 1, 2.
\]
We can solve the two systems at the same time by writing an augmented matrix with the two
wi vectors as right-hand sides, as follows:
\[
\begin{pmatrix} 1 & 2 & 1 & 1 \\ 0 & 1 & 1 & 2 \end{pmatrix}.
\]
This matrix is reduced by one row operation to its reduced echelon form, which turns out to be
\[
\begin{pmatrix} 1 & 0 & -1 & -3 \\ 0 & 1 & 1 & 2 \end{pmatrix},
\]
and therefore the change of basis matrix is
\[
P_{S\leftarrow T} = \begin{pmatrix} -1 & -3 \\ 1 & 2 \end{pmatrix},
\]
whose columns are the coordinate vectors [w₁]_S and [w₂]_S just computed.

Linear Algebra. Degrees in Engineering.
Solved problems.

Chapter 5.
Questions

Q1.- A is a 7 × 7 matrix with three distinct eigenvalues. One eigenspace is two-dimensional, and
one of the other eigenspaces is three-dimensional. Is it possible that A is not diagonalizable?
Justify your answer.
Answer:
The remaining eigenspace is at least one-dimensional, but it can also be two-dimensional,
because the matrix, being 7×7 can have up to 7 independent eigenvectors. In case the remaining
eigenspace is one-dimensional it does not complete a basis of R4 formed by eigenvectors, and
hence A may not be diagonalizable.

Q2.- Show that if x is an eigenvector of the matrix product AB and Bx 6= 0, then Bx is an


eigenvector of BA.
Answer: As x is an eigenvector of the matrix product AB, we have ABx = λx. Multiplying by
B on the left, we have BABx = Bλx = λBx, i.e., BA(Bx) = λ(Bx), as requested.
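A numerical illustration of this fact (a sketch with arbitrary random matrices, assuming Bx ≠ 0 as in the statement):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

lam, V = np.linalg.eig(A @ B)
x = V[:, 0]                    # an eigenvector of AB for eigenvalue lam[0]
Bx = B @ x
print(np.allclose(B @ A @ Bx, lam[0] * Bx))   # True: Bx is an eigenvector of BA
```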

Q3.- A is a 4 × 4 matrix with three distinct eigenvalues. One eigenspace is one-dimensional,


and one of the other eigenspaces is two-dimensional. Is it possible that A is not diagonalizable?
Justify your answer.
Answer:
The remaining eigenspace is at least one-dimensional, and in fact it cannot have a larger
dimension because as the matrix is 4 × 4 it can have at most four independent eigenvectors.
Therefore the remaining eigenspace is one-dimensional and it completes a basis of R4 formed by
eigenvectors, and hence A is necessarily diagonalizable.

Q4.- Let A be a n × n matrix and v1 , v2 two of its eigenvectors. Is v1 + v2 an eigenvector of


A? Justify your answer.
Answer: No, if the corresponding eigenvalues are different:

A(v1 + v2) = Av1 + Av2 = λ1 v1 + λ2 v2 ≠ λ(v1 + v2).

For the special case they correspond to the same eigenvalue, the sum is an eigenvector with the
same eigenvalue (apply the calculation above or recall that they belong to the same eigenspace).

Q5.- Two eigenvectors with the same eigenvalue are always linearly dependent.

Answer: False. For instance, any vector is an eigenvector of the identity matrix with eigenvalue
1, because Ix = x, and they can be chosen independent.

Q6.- If A is an n × n diagonalizable matrix, then each vector in Rⁿ can be written as a linear
combination of eigenvectors of A.

True. For A to be diagonalizable it has to have n independent eigenvectors, which in turn form
a basis for Rⁿ, hence the result.

Problems

P1.- Diagonalize the matrix
\[
A = \begin{pmatrix} 1 & 2 & -1 \\ 1 & 0 & 1 \\ 4 & -4 & 5 \end{pmatrix}.
\]
Answer:
The characteristic equation is
\[
\det(A - \lambda I) = \begin{vmatrix} 1-\lambda & 2 & -1 \\ 1 & -\lambda & 1 \\ 4 & -4 & 5-\lambda \end{vmatrix} = -(1-\lambda)\lambda(5-\lambda) + 12 - 4\lambda + 4(1-\lambda) - 2(5-\lambda) =
\]
\[
= -(1-\lambda)\lambda(5-\lambda) + 6(1-\lambda) = (1-\lambda)\left[\,6 - \lambda(5-\lambda)\,\right] = (1-\lambda)(\lambda^2 - 5\lambda + 6) = (1-\lambda)(2-\lambda)(3-\lambda) = 0.
\]
Therefore the eigenvalues are 1, 2 and 3 and the matrix is diagonalizable. The corresponding
eigenvectors are obtained by solving for each eigenvalue the homogeneous system (A − λI)x = 0,
finding
\[
v_1 = \begin{pmatrix} -1 \\ 1 \\ 2 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} -2 \\ 1 \\ 4 \end{pmatrix}, \qquad v_3 = \begin{pmatrix} -1 \\ 1 \\ 4 \end{pmatrix}
\]
as representative eigenvectors for 1, 2 and 3, respectively. We can now form the matrix P =
(v1 v2 v3) and then
\[
A = P D P^{-1} = P \begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \end{pmatrix} P^{-1}.
\]
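The diagonalization can be checked numerically; a NumPy sketch (eig returns unit-norm eigenvectors, i.e., scalar multiples of the vi above):

```python
import numpy as np

A = np.array([[1., 2., -1.],
              [1., 0.,  1.],
              [4., -4., 5.]])

lam, P = np.linalg.eig(A)
D = np.diag(lam)
print(np.round(lam, 6))                          # 1, 2, 3 in some order
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True: A = P D P^{-1}
```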

P2.- Let A and x0 be
\[
A = \begin{pmatrix} 3 & 5 \\ 3 & 1 \end{pmatrix}, \qquad x_0 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}.
\]
Find the general solution of xk+1 = Axk for x0 given above. Is the origin an attractor, a repellor
or a saddle point? What happens to xk as k → ∞?
Answer: The characteristic equation for A is (3 − λ)(1 − λ) − 15 = λ² − 4λ − 12 = 0, and therefore
the eigenvalues are −2 and 6. Solving the corresponding homogeneous systems (A − λI)x = 0
leads to the eigenvectors
\[
v_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} 5 \\ 3 \end{pmatrix},
\]
respectively. The general solution is xk = c1(−2)^k v1 + c2 6^k v2. We see that x0 = v1, so c1 = 1 and
c2 = 0; hence x1 = Ax0 = −2x0 and, in general,
\[
x_k = (-2)^k x_0 = \begin{pmatrix} (-2)^k \\ -(-2)^k \end{pmatrix}.
\]
As both eigenvalues are larger than 1 in absolute value, the origin is a repellor. Indeed,
‖xk‖ = 2^k ‖x0‖ grows without bound as k → ∞.
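A short iteration makes the repelling behaviour visible; a NumPy sketch:

```python
import numpy as np

A = np.array([[3., 5.],
              [3., 1.]])
x = np.array([1., -1.])    # x0 is the eigenvector for lambda = -2

for k in range(1, 6):
    x = A @ x
    print(k, x)            # (-2)^k * (1, -1): the norm doubles at every step
```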

P3.- Diagonalize the matrix
\[
A = \begin{pmatrix} -2 & 2 & 2 \\ 2 & -2 & 2 \\ 2 & 2 & -2 \end{pmatrix}.
\]

Answer:
The characteristic equation is
\[
\det(A - \lambda I) = \begin{vmatrix} -2-\lambda & 2 & 2 \\ 2 & -2-\lambda & 2 \\ 2 & 2 & -2-\lambda \end{vmatrix} = (-2-\lambda)^3 + 16 - 12(-2-\lambda) =
\]
\[
= -\lambda^3 - 6\lambda^2 - 12\lambda - 8 + 40 + 12\lambda = -\lambda^3 - 6\lambda^2 + 32 = -(\lambda - 2)(\lambda + 4)^2 = 0.
\]
Therefore the eigenvalues are 2 and −4 (twice) and the matrix is diagonalizable. The corresponding
eigenvectors are obtained by solving for each eigenvalue the homogeneous system
(A − λI)x = 0, finding
\[
v_1 = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} 1 \\ -1 \\ 0 \end{pmatrix}, \qquad v_3 = \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}
\]
as representative eigenvectors for 2 and −4, respectively. We can now form the matrix P = (v1 v2 v3)
and then
\[
A = P D P^{-1} = P \begin{pmatrix} 2 & 0 & 0 \\ 0 & -4 & 0 \\ 0 & 0 & -4 \end{pmatrix} P^{-1}.
\]

P4.- Let A and x0 be
\[
A = \begin{pmatrix} 3 & 5 \\ 3 & 1 \end{pmatrix}, \qquad x_0 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}.
\]
Find the general solution of x′ = Ax for x0 given above. Is the origin an attractor, a repellor or
a saddle point? What happens to x(t) as t → ∞?
Answer: The characteristic equation for A is (3 − λ)(1 − λ) − 15 = λ² − 4λ − 12 = 0, and therefore
the eigenvalues are −2 and 6. Solving the corresponding homogeneous systems (A − λI)x = 0
leads to the eigenvectors
\[
v_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} 5 \\ 3 \end{pmatrix},
\]
respectively. The general solution is then
\[
x(t) = c_1 e^{-2t} v_1 + c_2 e^{6t} v_2.
\]
We now impose the initial condition x(0) = x0:
\[
x(0) = c_1 \begin{pmatrix} 1 \\ -1 \end{pmatrix} + c_2 \begin{pmatrix} 5 \\ 3 \end{pmatrix} = x_0 = \begin{pmatrix} 1 \\ -1 \end{pmatrix},
\]
from which we trivially see that c1 = 1 and c2 = 0. The solution is therefore
\[
x(t) = e^{-2t} \begin{pmatrix} 1 \\ -1 \end{pmatrix}.
\]
As one eigenvalue is positive and the other is negative, the origin is a saddle point. Indeed,
this particular solution x(t) goes to the origin as t → ∞ along the direction of v1.

P5.- Compute the eigenvalues of the matrix
\[
A = \begin{pmatrix} 0 & 2 & 2 \\ 2 & 0 & 2 \\ 2 & 2 & 0 \end{pmatrix}.
\]

Discuss whether the matrix is diagonalizable or not without computing its eigenvectors.
Answer:
The characteristic equation is
\[
\det(A - \lambda I) = \begin{vmatrix} -\lambda & 2 & 2 \\ 2 & -\lambda & 2 \\ 2 & 2 & -\lambda \end{vmatrix} = -(\lambda + 2)^2(\lambda - 4) = 0.
\]

Therefore the eigenvalues are −2 (twice) and 4. In spite of having a double eigenvalue, we can
be sure that the matrix is diagonalizable because it is symmetric.

P6.- The eigenvalues of the matrix
\[
A = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 2 \\ 0 & 0 & 1 \end{pmatrix}
\]

are 0 and 1 (twice). Compute its eigenvectors and discuss whether the matrix is diagonalizable
or not.
Answer:
The eigenvectors are obtained by solving for each eigenvalue the homogeneous system
(A − λI)x = 0, finding
\[
v_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}
\]
as representative eigenvectors for 0 and 1, respectively. As we have only two independent
eigenvectors, the matrix is not diagonalizable.

P7.- Compute the eigenvalues of the matrix


 
\[
A = \begin{pmatrix} 1 & 3 & 3 \\ -3 & -5 & -3 \\ 3 & 3 & 1 \end{pmatrix}.
\]

Discuss whether the matrix is diagonalizable or not without computing its eigenvectors.
Answer:
The characteristic equation is
\[
\det(A - \lambda I) = \begin{vmatrix} 1-\lambda & 3 & 3 \\ -3 & -5-\lambda & -3 \\ 3 & 3 & 1-\lambda \end{vmatrix} = -(\lambda - 1)(\lambda + 2)^2 = 0.
\]

Therefore the eigenvalues are −2 (twice) and 1. We cannot say whether the matrix is diagonalizable
or not without computing its eigenvectors, because we do not know whether −2 has two
independent eigenvectors, and the matrix is not symmetric.

P8.- The eigenvalues of the matrix
\[
A = \begin{pmatrix} 2 & 4 & 3 \\ -4 & -6 & -3 \\ 3 & 3 & 1 \end{pmatrix}
\]

are 1 and -2 (twice). Compute its eigenvectors and discuss whether the matrix is diagonalizable
or not.
Answer:
The eigenvectors are obtained by solving for each eigenvalue the homogeneous system
(A − λI)x = 0, finding
\[
v_1 = \begin{pmatrix} 1 \\ -1 \\ 1 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} -1 \\ 1 \\ 0 \end{pmatrix}
\]
as representative eigenvectors for 1 and −2, respectively. As we have only two independent
eigenvectors, the matrix is not diagonalizable.

Linear Algebra. Degrees in Engineering.
Solved problems.

Chapter 6.
Questions

Q1.- Given u1 , u2 and u3 ∈ Rn , with u3 ∈ Span({u1 , u2 }), what is the projection of u3 onto
Span({u1 , u2 })? Justify your answer.
Answer: The projection is u3 itself. By the theorem of the best approximation, the projection
is the best approximation to a given vector within a given subspace, and as in this case vector
u3 is in the subspace, it is the best approximation to itself, and hence its own projection.

Q2.- Let U and V be n × n orthogonal matrices. Show that U V is also orthogonal.


Answer: An orthogonal square matrix is one whose transpose is its inverse. U V is also n × n.
Then,
(U V)ᵀ(U V) = (Vᵀ Uᵀ)(U V) = Vᵀ(Uᵀ U)V = Vᵀ V = I
as required.
Q3.- Given u1 , u2 and u3 ∈ Rn , with u3 6∈ Span({u1 , u2 }), how could you find a vector that is
orthogonal to Span({u1 , u2 })? Justify your answer.
Answer: By subtracting from u3 its projection onto Span({u1, u2}). By definition of projection,
the difference between a vector and its projection must be orthogonal to the subspace onto which
it is projected.

Q4.- Find a formula that yields explicitly the least-squares solution of Ax = b when the columns
of A are orthonormal.
Answer: As the columns of A are orthonormal, we have AT A = I. Therefore the normal equations
AT Ax̂ = AT b become simply x̂ = AT b.
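A small numerical illustration (a sketch; the matrix A below is a hypothetical example with orthonormal columns, not taken from the problems):

```python
import numpy as np

# Hypothetical A with orthonormal columns, so A^T A = I.
A = np.array([[1., 0.],
              [0., 1/np.sqrt(2)],
              [0., 1/np.sqrt(2)]])
b = np.array([2., 3., 5.])

x_hat = A.T @ b                                # least-squares solution: no system to solve
print(x_hat)
print(np.linalg.lstsq(A, b, rcond=None)[0])    # same result from the generic solver
```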

Q5.- If ||u − v||2 = ||u||2 + ||v||2 , then u and v are orthogonal.

Answer: True. The Pythagorean theorem states ||u + v||2 = ||u||2 + ||v||2 ⇔ u and v are
orthogonal. Applying it to u and −v and using ||v|| = || − v|| we have the result above.

Q6.- If W is a subspace, then ||projW v||2 + ||v − projW v||2 = ||v||2


Answer: True. The Pythagorean theorem states ||u + v||2 = ||u||2 + ||v||2 ⇔ u and v are
orthogonal. Applying it to projW v and v − projW v, which are orthogonal by construction, we have the
result above.

Problems

P1.- Find a QR factorization of
\[
A = \begin{pmatrix} 1 & 1 & 0 \\ 1 & -1 & 0 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}.
\]

Answer:

We apply the Gram-Schmidt orthogonalization procedure (we refer to the columns of A as
x1, x2, x3):
\[
v_1 = x_1 = \begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \end{pmatrix}, \qquad
v_2 = x_2 - \frac{x_2\cdot v_1}{v_1\cdot v_1}\,v_1 = \frac{1}{2}\begin{pmatrix} 1 \\ -3 \\ 1 \\ 1 \end{pmatrix}, \qquad
v_3 = x_3 - \frac{x_3\cdot v_1}{v_1\cdot v_1}\,v_1 - \frac{x_3\cdot v_2}{v_2\cdot v_2}\,v_2 = \frac{1}{3}\begin{pmatrix} -2 \\ 0 \\ 1 \\ 1 \end{pmatrix}.
\]
Normalizing the vectors so obtained we can form Q,
\[
Q = \begin{pmatrix} 1/2 & \sqrt{3}/6 & -\sqrt{6}/3 \\ 1/2 & -\sqrt{3}/2 & 0 \\ 1/2 & \sqrt{3}/6 & \sqrt{6}/6 \\ 1/2 & \sqrt{3}/6 & \sqrt{6}/6 \end{pmatrix},
\]
and finally
\[
R = Q^T A = \begin{pmatrix} 2 & 1 & 1 \\ 0 & \sqrt{3} & \sqrt{3}/3 \\ 0 & 0 & \sqrt{6}/3 \end{pmatrix}.
\]
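The factorization can be cross-checked with NumPy; a sketch (numpy.linalg.qr may flip the signs of some columns of Q and rows of R, so it is the product, not the individual factors, that must match):

```python
import numpy as np

A = np.array([[1.,  1., 0.],
              [1., -1., 0.],
              [1.,  1., 1.],
              [1.,  1., 1.]])

Q, R = np.linalg.qr(A)        # reduced QR: Q is 4x3 with orthonormal columns, R is 3x3 upper triangular
print(np.allclose(Q @ R, A))  # True
print(np.round(R, 4))         # equal to the R above up to possible sign flips of each row
```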

P2.- Given
\[
A = \begin{pmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \end{pmatrix}, \qquad b = \begin{pmatrix} 2 \\ 2 \\ 4 \end{pmatrix},
\]
find a least-squares solution of Ax = b and the orthogonal projection of b onto Col(A).
Answer:
We compute
\[
A^T A = \begin{pmatrix} 3 & 6 \\ 6 & 14 \end{pmatrix}, \qquad A^T b = \begin{pmatrix} 8 \\ 18 \end{pmatrix},
\]
and with this we solve the normal equations A^T A x̂ = A^T b to obtain
\[
\hat{x} = \begin{pmatrix} 2/3 \\ 1 \end{pmatrix}.
\]
The orthogonal projection of b onto Col(A) is then simply A x̂, yielding
\[
\mathrm{proj}_{\mathrm{Col}(A)}\, b = \begin{pmatrix} 5/3 \\ 8/3 \\ 11/3 \end{pmatrix}.
\]
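The least-squares solution and the projection can be verified numerically; a NumPy sketch:

```python
import numpy as np

A = np.array([[1., 1.],
              [1., 2.],
              [1., 3.]])
b = np.array([2., 2., 4.])

x_hat = np.linalg.lstsq(A, b, rcond=None)[0]
print(x_hat)        # [2/3, 1]
print(A @ x_hat)    # projection of b onto Col(A): [5/3, 8/3, 11/3]
```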

P3.- Find a QR factorization of
\[
A = \begin{pmatrix} 1 & 2 & -1 \\ 1 & -1 & 2 \\ 1 & -1 & 2 \\ -1 & 1 & 1 \end{pmatrix}.
\]

Answer:

We apply the Gram-Schmidt orthogonalization procedure (we refer to the columns of A as
x1, x2, x3):
\[
v_1 = x_1 = \begin{pmatrix} 1 \\ 1 \\ 1 \\ -1 \end{pmatrix}, \qquad
v_2 = x_2 - \frac{x_2\cdot v_1}{v_1\cdot v_1}\,v_1 = \frac{1}{4}\begin{pmatrix} 9 \\ -3 \\ -3 \\ 3 \end{pmatrix}, \qquad
v_3 = x_3 - \frac{x_3\cdot v_1}{v_1\cdot v_1}\,v_1 - \frac{x_3\cdot v_2}{v_2\cdot v_2}\,v_2 = \begin{pmatrix} 0 \\ 1 \\ 1 \\ 2 \end{pmatrix}.
\]
Normalizing the vectors so obtained we can form Q,
\[
Q = \begin{pmatrix} 1/2 & \sqrt{3}/2 & 0 \\ 1/2 & -\sqrt{3}/6 & \sqrt{6}/6 \\ 1/2 & -\sqrt{3}/6 & \sqrt{6}/6 \\ -1/2 & \sqrt{3}/6 & \sqrt{6}/3 \end{pmatrix},
\]
and finally
\[
R = Q^T A = \begin{pmatrix} 2 & -1/2 & 1 \\ 0 & 3\sqrt{3}/2 & -\sqrt{3} \\ 0 & 0 & \sqrt{6} \end{pmatrix}.
\]

P4.- Given
\[
A = \begin{pmatrix} 1 & 5 \\ 2 & -2 \\ -1 & 1 \end{pmatrix}, \qquad b = \begin{pmatrix} 3 \\ 2 \\ 5 \end{pmatrix},
\]
find a least-squares solution of Ax = b and the orthogonal projection of b onto Col(A).
Answer:
We compute
\[
A^T A = \begin{pmatrix} 6 & 0 \\ 0 & 30 \end{pmatrix}, \qquad A^T b = \begin{pmatrix} 2 \\ 16 \end{pmatrix},
\]
and with this we solve the normal equations A^T A x̂ = A^T b to obtain
\[
\hat{x} = \begin{pmatrix} 1/3 \\ 8/15 \end{pmatrix}.
\]
The orthogonal projection of b onto Col(A) is then simply A x̂, yielding
\[
\mathrm{proj}_{\mathrm{Col}(A)}\, b = \begin{pmatrix} 3 \\ -2/5 \\ 1/5 \end{pmatrix}.
\]

P5.- Find the matrix Q corresponding to the QR factorization of
\[
A = \begin{pmatrix} 1 & 0 & 2 \\ -1 & 2 & 0 \\ -1 & -2 & 2 \end{pmatrix}.
\]

Answer:

We apply the Gram-Schmidt orthogonalization procedure (we refer to the columns of A as
x1, x2, x3):
\[
v_1 = x_1 = \begin{pmatrix} 1 \\ -1 \\ -1 \end{pmatrix}, \qquad
v_2 = x_2 - \frac{x_2\cdot v_1}{v_1\cdot v_1}\,v_1 = x_2 = \begin{pmatrix} 0 \\ 2 \\ -2 \end{pmatrix}, \qquad
v_3 = x_3 - \frac{x_3\cdot v_1}{v_1\cdot v_1}\,v_1 - \frac{x_3\cdot v_2}{v_2\cdot v_2}\,v_2 = \begin{pmatrix} 2 \\ 1 \\ 1 \end{pmatrix}.
\]
Normalizing the vectors so obtained we can form Q,
\[
Q = \begin{pmatrix} 1/\sqrt{3} & 0 & 2/\sqrt{6} \\ -1/\sqrt{3} & 1/\sqrt{2} & 1/\sqrt{6} \\ -1/\sqrt{3} & -1/\sqrt{2} & 1/\sqrt{6} \end{pmatrix}.
\]

P6.- Given
\[
A = \begin{pmatrix} 1 & 2 \\ 1 & 2 \\ 1 & 1 \end{pmatrix}, \qquad b = \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix},
\]
find a least-squares solution of Ax = b.
Answer:
We compute
\[
A^T A = \begin{pmatrix} 3 & 5 \\ 5 & 9 \end{pmatrix}, \qquad A^T b = \begin{pmatrix} 2 \\ 3 \end{pmatrix},
\]
and with this we solve the normal equations A^T A x̂ = A^T b to obtain
\[
\hat{x} = \begin{pmatrix} 3/2 \\ -1/2 \end{pmatrix}.
\]

P7.- Find the matrix Q corresponding to the QR factorization of
\[
A = \begin{pmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{pmatrix}.
\]

Answer:

We apply the Gram-Schmidt orthogonalization procedure (we refer to the columns of A as
x1, x2, x3):
\[
v_1 = x_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \qquad
v_2 = x_2 - \frac{x_2\cdot v_1}{v_1\cdot v_1}\,v_1 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}, \qquad
v_3 = x_3 - \frac{x_3\cdot v_1}{v_1\cdot v_1}\,v_1 - \frac{x_3\cdot v_2}{v_2\cdot v_2}\,v_2 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}.
\]
There is no need to normalize, as the vectors obtained already have unit norm, so we can form
\[
Q = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]

P8.- Given
\[
A = \begin{pmatrix} 2 & 3 \\ 2 & 4 \\ 1 & 1 \end{pmatrix}, \qquad b = \begin{pmatrix} 7 \\ 3 \\ 1 \end{pmatrix},
\]
find a least-squares solution of Ax = b.
Answer:
We compute
\[
A^T A = \begin{pmatrix} 9 & 15 \\ 15 & 26 \end{pmatrix}, \qquad A^T b = \begin{pmatrix} 21 \\ 34 \end{pmatrix},
\]
and with this we solve the normal equations A^T A x̂ = A^T b to obtain
\[
\hat{x} = \begin{pmatrix} 4 \\ -1 \end{pmatrix}.
\]
