
E1 241: Dynamics of Linear Systems (2017a)

HW #2 Solutions

1. For an n × n real square matrix A


(i) show that the Jordan form of A and AT are the same, up to reordering of Jordan blocks.
(Hint: recall similarity transformations and permutation matrices).
(ii) if A2 = A then show that its Jordan form J is such that J 2 = J and that in fact J is a
diagonal matrix with only 1’s and 0’s on the diagonal.
(iii) Find the Jordan forms of

            [1 2 3]          [ 1  1]          [2 0 0]
       A1 = [0 4 5] ,   A2 = [-1 -1] ,   A3 = [1 2 0] .
            [0 0 6]                           [0 1 2]

Please provide justifications.


Solution: (i) Let J := M −1 AM (with M an invertible matrix) be the Jordan form of A.
Then,
J T = M T AT (M −1 )T .

Claim (a): ∃ an invertible matrix P s.t. P −1 JP = J T .


Assuming claim (a) is true, we then have QAT Q−1 = J with Q = (P M T ) and Q invertible.
Since J is in the Jordan form and the Jordan form of a matrix is unique up to reordering of
the Jordan blocks, we conclude that the Jordan form of AT is the same as that of A (up to
reordering of the Jordan blocks).
Thus, it remains to prove that claim (a) is true.
Proof of Claim (a): Consider the ith block of J, Ji := λi I + S, where λi ∈ C and S is the nilpotent matrix with entries

  S_{kj} = 1 if j = k + 1, and S_{kj} = 0 otherwise.
Also consider the following permutation matrix of size ni × ni (the same size as Ji), with 1's on the anti-diagonal:

         [0 0 ... 1]
  Pi :=  [0 ... 1 0]
         [   ...   ] .
         [1 0 ... 0]

For a matrix W , with compatible dimensions, the matrix Pi has the following properties
(which can be directly verified):

  (Pi W)_{kj} = W_{(ni − k)j},
  (W Pi)_{kj} = W_{k(ni − j)},
  Pi^{−1} = Pi.

Thus,

  (Pi Ji Pi^{−1})_{kj} = (Pi (λi I + S) Pi)_{kj} = (Pi (λi I + S))_{k(ni − j)} = (λi I + S)_{(ni − k)(ni − j)} .

Now, notice that

  S_{(ni − k)(ni − j)} = 1 if ni − j = (ni − k) + 1 (i.e., k = j + 1), and 0 otherwise,

which is exactly S_{jk} = (S^T)_{kj}.

And since λi I is a diagonal matrix with the same entries on the diagonal, we have

Pi Ji Pi−1 = JiT .

Thus, we can transpose each Jordan block separately (the blocks do not couple with each other), and hence there does exist a permutation matrix P (block diagonal, with blocks Pi) s.t. Claim (a) is true.
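As a sanity check, Claim (a) can be verified numerically. The sketch below (plain Python; the 3 × 3 block size and λ = 5 are illustrative values assumed for the example, not from the problem) builds a Jordan block Ji and the exchange matrix Pi, then confirms Pi Ji Pi^{−1} = Ji^T.

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    n = len(A)
    return [[A[j][i] for j in range(n)] for i in range(n)]

n, lam = 3, 5.0  # illustrative block size and eigenvalue
# Jordan block J_i = lam*I + S, with 1's on the superdiagonal
J = [[lam if i == j else (1.0 if j == i + 1 else 0.0) for j in range(n)]
     for i in range(n)]
# Exchange (anti-diagonal) permutation matrix P_i; note P_i^{-1} = P_i
P = [[1.0 if j == n - 1 - i else 0.0 for j in range(n)] for i in range(n)]

# P_i J_i P_i^{-1} = P_i J_i P_i reverses rows and columns, giving J_i^T
assert matmul(matmul(P, J), P) == transpose(J)
```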

(ii) Let P −1 AP = J. Then,

J 2 = P −1 AP P −1 AP = P −1 A2 P = P −1 AP = J.

Note that J² = J is block diagonal, so the individual Jordan blocks do not interact when computing J². Without loss of generality, we can therefore consider a single Jordan block Ji and show that it is 1 × 1 with Ji = 1 or Ji = 0.
We have

Ji2 = (λi I + S)2 = λ2i I + 2λi S + S 2 .

The first term is a diagonal matrix, the second term has 2λi on locations (i, i + 1), and the third term S² has 1's on locations (i, i + 2). But we know that Ji² = Ji, which has λi on the diagonal, 1's on locations (i, i + 1) (if the block is larger than 1 × 1), and zeros everywhere else. Comparing the (i, i + 2) entries we see that S² = 0, and comparing the diagonals,

λi² = λi,

whose only solutions are 0 and 1. Since for either of these solutions 2λi ≠ 1, the (i, i + 1) entries can match only if S = 0, and hence Ji is in fact 1 × 1, with Ji = 1 or Ji = 0.
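Part (ii) can be spot-checked numerically. The sketch below uses a hypothetical 2 × 2 idempotent (oblique projection) matrix, chosen only for illustration, and recovers its eigenvalues from the trace and determinant; as the argument above predicts, they are exactly 0 and 1.

```python
import math

A = [[1.0, 0.0], [1.0, 0.0]]  # hypothetical example with A^2 = A

# Verify idempotence directly
A2 = [[sum(A[i][k] * A[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
assert A2 == A

# Eigenvalues of a 2x2 matrix from its trace and determinant
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = tr * tr - 4 * det
eig1 = (tr + math.sqrt(disc)) / 2
eig2 = (tr - math.sqrt(disc)) / 2

# As shown in the solution, the only possible eigenvalues are 0 and 1
assert sorted([eig1, eig2]) == [0.0, 1.0]
```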
(iii) A1 is upper-triangular, so its eigenvalues are its diagonal elements, which are all distinct. Hence A1 is diagonalizable and its Jordan form is

       [1 0 0]
  J1 = [0 4 0] .
       [0 0 6]

Eigenvalues of A2 are {0, 0} (the trace and determinant, which give the sum and product of the eigenvalues, are both zero). So the dimension of the nullspace of A2 is 1 or 2. But if the dimension were 2, then the whole of R² would be the nullspace, which is clearly not the case since

  A2 [1 0]^T = [1 −1]^T ≠ 0.

So, algebraic multiplicity of eigenvalue 0 is 2 while the geometric multiplicity is 1. So, the
Jordan form of A2 is
       [0 1]
  J2 = [0 0] .

By part (i), A3 and A3^T have the same Jordan form up to reordering of the blocks. But A3^T is already in Jordan form, and the Jordan form of a matrix is unique up to reordering of the blocks. So the Jordan form of A3 is J3 = A3^T.
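A quick consistency check on the answer for A2: since J2² = 0, A2 must be nilpotent of index 2 (nonzero, but squaring to zero, matching a single 2 × 2 Jordan block at eigenvalue 0). The sketch below verifies this in plain Python.

```python
A2 = [[1, 1], [-1, -1]]

# A2 squared, computed entrywise
sq = [[sum(A2[i][k] * A2[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]

assert sq == [[0, 0], [0, 0]]   # A2^2 = 0, consistent with J2^2 = 0
assert A2 != [[0, 0], [0, 0]]   # but A2 itself is nonzero: one 2x2 block
```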

2. For the system


ẋ = A(t)x + B(t)u, x ∈ Rn , u ∈ Rm
derive the variation of constants formula, using the change of variables z(t) = Φ(t0 , t)x(t).
(Caution: in the change of variables, it is not Φ(t, t0 )).
Solution: We will assume we do not know the variation of constants formula.
Recall that in continuous time, the state transition matrix is invertible. Hence, we can rewrite
the change of coordinates as

x(t) = Φ(t0 , t)−1 z(t) = Φ(t, t0 )z(t).

From the above equation and the dynamics, we then have

A(t)Φ(t, t0 )z + Φ(t, t0 )ż = ẋ = A(t)x + B(t)u = A(t)Φ(t, t0 )z + B(t)u

Thus,

Φ(t, t0 )ż = B(t)u,

which implies

ż = Φ(t0 , t)B(t)u.

Its solution is given by


  z(t) = z(t0) + ∫_{t0}^{t} Φ(t0, τ) B(τ) u(τ) dτ.

Changing the coordinates back, we have


  x(t) = Φ(t, t0) z(t0) + ∫_{t0}^{t} Φ(t, t0) Φ(t0, τ) B(τ) u(τ) dτ

       = Φ(t, t0) x(t0) + ∫_{t0}^{t} Φ(t, τ) B(τ) u(τ) dτ,

where we have used the fact that z(t0 ) = Φ(t0 , t0 )x(t0 ) = x(t0 ). This is the variation of
constants formula.
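The formula can be sanity-checked numerically. The sketch below uses a hypothetical scalar LTI example ẋ = ax + bu with constant input u (so Φ(t, t0) = e^{a(t−t0)} and the integral has a closed form), and compares the variation-of-constants value against a forward-Euler integration of the ODE; all numeric values are assumptions for illustration.

```python
import math

# Hypothetical scalar example: x' = a x + b u, u constant
a, b, u, x0, t0, t = -1.0, 2.0, 0.5, 3.0, 0.0, 1.5

# Variation of constants: x(t) = e^{a(t-t0)} x0 + b u * (e^{a(t-t0)} - 1) / a
exact = math.exp(a * (t - t0)) * x0 + b * u * (math.exp(a * (t - t0)) - 1) / a

# Forward-Euler integration of the same ODE
N = 200000
dt = (t - t0) / N
x = x0
for _ in range(N):
    x += dt * (a * x + b * u)

assert abs(x - exact) < 1e-4
```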

3. Consider
ẋ = A(t)x, x ∈ Rn

(i) Express Φ(t, t0 ) in terms of Φ(t, 0) := g(t), a differentiable function of time t.
(ii) Write A(t) in terms of the function g.
(iii) Repeat (i) and (ii) in the specific case of

             [ e^t cos 2t     e^{−2t} sin 2t ]
  Φ(t, 0) =  [ −e^t sin 2t    e^{−2t} cos 2t ] .

(iv) For the A(t) matrix from (iii), compute the eigenvalues of A(t) for a fixed t.
(v) Classify this system in terms of Lyapunov stability with justification.

For parts (iii) and (iv), you may use any symbolic math tools such as MATLAB, MATHE-
MATICA, MAPLE, or anything equivalent.
Solution: (i) First, notice that for each t, the matrix g(t) is invertible. Then,

Φ(t, t0 ) = Φ(t, 0)Φ(0, t0 ) = Φ(t, 0)(Φ(t0 , 0))−1 = g(t)(g(t0 ))−1 .

(ii) We know that Φ̇(t, 0) = A(t)Φ(t, 0). So, A(t) = ġ(t)(g(t))−1 .


(iii)

         [ e^t cos 2t − 2e^t sin 2t       −2e^{−2t} sin 2t + 2e^{−2t} cos 2t ]
  ġ(t) = [ −e^t sin 2t − 2e^t cos 2t      −2e^{−2t} cos 2t − 2e^{−2t} sin 2t ]

and, since det g(t) = e^{−t},

  (g(t))^{−1} = (1/e^{−t}) [ e^{−2t} cos 2t   −e^{−2t} sin 2t ]   =   [ e^{−t} cos 2t   −e^{−t} sin 2t ]
                           [ e^{t} sin 2t      e^{t} cos 2t   ]       [ e^{2t} sin 2t    e^{2t} cos 2t ] .

So (from MATLAB),

         [ 3 cos(4t)/2 − 1/2       2 − 3 sin(4t)/2   ]
  A(t) = [ −3 sin(4t)/2 − 2       −3 cos(4t)/2 − 1/2 ] .

(iv) Again from MATLAB, the eigenvalues of A(t) for any given t are:

  −1/2 ± (√7/2) i.
So, for any fixed t, the matrix A(t) is a “stable” matrix.
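A quick check that the frozen-time characteristic polynomial does not actually depend on t: the trace and determinant of A(t) evaluate to −1 and 2 at several sample times, so for every fixed t the eigenvalues solve s² + s + 2 = 0 and have real part −1/2.

```python
import math

def A(t):
    c, s = math.cos(4*t), math.sin(4*t)
    return [[1.5*c - 0.5, 2 - 1.5*s],
            [-1.5*s - 2, -1.5*c - 0.5]]

for t in [0.0, 0.3, 1.0, 2.5]:  # arbitrary sample times
    M = A(t)
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    assert abs(tr - (-1)) < 1e-12   # trace = -1 for all t
    assert abs(det - 2) < 1e-12    # det = 2 for all t
```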
(v) Consider the initial condition x(0) = [1 0]^T. Then

  x(t) = Φ(t, 0) x(0) = [ e^t cos 2t    −e^t sin 2t ]^T,

whose norm ‖x(t)‖ = e^t grows without bound as t → ∞.

So, the homogeneous LTV system is unstable.
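The instability can also be confirmed numerically: despite the frozen-time eigenvalues having negative real parts, the norm of the solution from x(0) = [1 0]^T is exactly e^t.

```python
import math

def norm_x(t):
    # First column of Phi(t, 0), i.e., the solution from x(0) = [1, 0]^T
    x1 = math.exp(t) * math.cos(2*t)
    x2 = -math.exp(t) * math.sin(2*t)
    return math.hypot(x1, x2)

# ||x(t)|| = e^t, which diverges as t grows
for t in [0.0, 1.0, 5.0, 10.0]:
    assert abs(norm_x(t) - math.exp(t)) < 1e-9 * math.exp(t)
```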

4. (Stability margin). Consider the continuous-time LTI system

ẋ = Ax, x ∈ Rn

and suppose that there exists a positive constant µ and positive-definite matrices P and
Q ∈ Rn×n such that
AT P + P A + 2µP = −Q.

Show that all eigenvalues of A have real parts less than −µ. A matrix A with this property
is said to be asymptotically stable with stability margin µ.
Solution: We can rewrite the equation in the question as

(A + µI)T P + P (A + µI) = −Q < 0,

where the notation < 0 means a negative-definite matrix. Then, by the Lyapunov stability
theorem we know that the matrix (A + µI) is Hurwitz (meaning all its eigenvalues have
negative real parts).
But notice that if λ is an eigenvalue of the matrix (A + µI) with eigenvector x, that is,

(A + µI)x = λx,

then

Ax = (λ − µ)x,

that is, λ − µ is an eigenvalue of the matrix A with eigenvector x. Since all eigenvalues λ of (A + µI) are such that Re{λ} < 0, all eigenvalues of A have real parts strictly smaller than −µ.
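A tiny numerical illustration with hypothetical values (all chosen for the example): for A = diag(−2, −3), µ = 1, and P = I, the resulting Q is positive definite, and every eigenvalue of A indeed has real part less than −µ.

```python
mu = 1.0
A = [[-2.0, 0.0], [0.0, -3.0]]  # hypothetical stable matrix

# With P = I, the relation in the problem gives Q = -((A + mu I)^T + (A + mu I))
Am = [[A[i][j] + (mu if i == j else 0.0) for j in range(2)] for i in range(2)]
Q = [[-(Am[j][i] + Am[i][j]) for j in range(2)] for i in range(2)]

assert Q == [[2.0, 0.0], [0.0, 4.0]]           # diagonal with positive entries: Q > 0
assert all(A[i][i] < -mu for i in range(2))    # eigenvalues -2, -3 lie left of -mu
```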
