AM32LC4 Stability PDF
Stability
Topics :
1. Basic Concepts
114 AM3.2 - Linear Control
Stability was probably the first question in classical dynamical systems which was
dealt with in a satisfactory way. Stability questions motivated the introduction of
new mathematical concepts (tools) in engineering, particularly in control engineering.
Stability theory has been of interest to mathematicians and astronomers for a long
time and has had a stimulating impact on these fields. The specific problem of
attempting to prove that the solar system is stable accounted for the introduction of
many new methods.
Our treatment of stability will apply to (control) systems described by sets of
linear or nonlinear equations. As is to be expected, however, our most explicit results
will be obtained for linear systems.
C.C. Remsing 115
Note : We shall assume that the components Fi are continuous and satisfy standard
conditions, such as having continuous first order partial derivatives, so that the
solution curve of (4.1) exists and is unique for any given initial condition (state).
From a geometric point of view, the right-hand side (rhs) F can be interpreted as
a time-dependent vector field on Rm . So a (nonlinear) dynamical system is essentially
the same as (and thus can be identified with) a vector field on the state space. This
point of view is very fruitful and extremely useful in investigating the properties of
the dynamical system (especially when the state space is a manifold).
Equilibrium states
A state x_e is an equilibrium state if, whenever x(t0) = x_e, the solution satisfies
x(t) = x_e for all t ≥ t0.
We shall also assume that there is no other constant solution curve in the
neighborhood of the origin, so this is an isolated equilibrium state.
If friction is taken into account, then the oscillatory motions steadily decrease
until the system returns to the equilibrium state.
Stability
(a) stable if for any positive scalar ε there exists a positive scalar δ
such that ||x(t0)|| < δ implies ||x(t)|| < ε for all t ≥ t0.
(b) asymptotically stable if it is stable and, in addition, x(t) → 0 as
t → ∞.
(c) unstable if it is not stable; that is, there exists an ε > 0 such that
for every δ > 0 there exists an x(t0) with ||x(t0)|| < δ, ||x(t1)|| ≥ ε
for some t1 > t0.
(d) completely unstable if there exists an ε > 0 such that for every
δ > 0 and for every x(t0) with ||x(t0)|| < δ, ||x(t1)|| ≥ ε for some
t1 > t0.
Note : The definition (a) is often called stability in the sense of Lyapunov
(stability i.s.L.) after the Russian mathematician Aleksandr M. Lyapunov (1857-
1918), whose important work features prominently in current control theory.
Examples
Some further aspects of stability are now illustrated through some examples.
ẋ1 = 2x1 − 3x2
ẋ2 = 2x1 − x2.
These equations have a single equilibrium point (state) at the origin. With
arbitrary initial numbers x1 (0) and x2 (0) of rabbits and foxes, respectively,
the solution is
x1(t) = x1(0) e^(t/2) ( cos (√15/2)t + (3/√15) sin (√15/2)t ) − (6x2(0)/√15) e^(t/2) sin (√15/2)t.
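The closed-form solution can be checked against a numerical integration. A minimal sketch, assuming the state equations ẋ1 = 2x1 − 3x2, ẋ2 = 2x1 − x2 (the pair consistent with the solution above) and hypothetical initial populations:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Rabbit-fox model assumed from the closed-form solution above:
#   x1' = 2*x1 - 3*x2,  x2' = 2*x1 - x2   (eigenvalues (1 +/- i*sqrt(15))/2)
A = np.array([[2.0, -3.0], [2.0, -1.0]])

def x1_closed_form(t, x10, x20):
    w = np.sqrt(15.0) / 2.0  # oscillation frequency sqrt(15)/2
    return (x10 * np.exp(t / 2) * (np.cos(w * t) + 3 / np.sqrt(15.0) * np.sin(w * t))
            - 6 * x20 / np.sqrt(15.0) * np.exp(t / 2) * np.sin(w * t))

x0 = [100.0, 20.0]  # hypothetical initial numbers of rabbits and foxes
sol = solve_ivp(lambda t, x: A @ x, (0.0, 3.0), x0, rtol=1e-10, atol=1e-10)
num = sol.y[0, -1]                      # numerically integrated x1(3)
exact = x1_closed_form(sol.t[-1], *x0)  # closed-form x1(3)
```

The two values agree to the integrator's tolerance, which is a quick consistency check on the reconstructed formula.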
ẋ = (1 − 2t)x,  x(t0) = x0 ∈ ℝ

has the solution x(t) = x0 e^((t − t^2) − (t0 − t0^2)), so |x(t)| < ε precisely when

|x0| < ε e^(t^2 − t) e^(t0 − t0^2).

Since t ↦ e^(t^2 − t) has a minimum value of e^(−1/4) when t = 1/2, it follows that we
can take

δ = ε e^(−(t0 − 1/2)^2).
The condition |x(t)| < ε is implied by |x0| < 2 when ε ≥ 2, for then |x(t)| ≤
|x0| < 2 ≤ ε. When ε < 2, |x(t)| < ε is implied by |x0| < δ = ε, for then
again |x(t)| ≤ |x0| < ε. Thus, according to the definition, the origin is a stable
equilibrium point (state). However, if x0 > 2 then x(t) → ∞, so for initial
perturbations x0 > 2 from equilibrium, motions are certainly unstable in a
practical sense.
ẋ = f(t)x,  x(0) = x0 ∈ ℝ
where
f(t) = ln 10  if 0 ≤ t ≤ 10
f(t) = −1     if t > 10.

The solution is

x(t) = 10^t x0              if 0 ≤ t ≤ 10
x(t) = 10^10 x0 e^(10 − t)  if t > 10.
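The practical trouble with this example is easy to exhibit numerically: a tiny perturbation is amplified by a factor of 10^10 before it decays. A sketch of the piecewise solution above:

```python
import math

# Piecewise solution derived above: x' = f(t) x with f(t) = ln 10 on [0, 10]
# and f(t) = -1 afterwards.
def x(t, x0):
    if t <= 10:
        return 10.0 ** t * x0
    return 10.0 ** 10 * x0 * math.exp(10.0 - t)

x0 = 1e-6            # a tiny initial perturbation ...
peak = x(10.0, x0)   # ... is amplified by a factor of 10**10 ...
late = x(60.0, x0)   # ... yet x(t) -> 0 as t -> infinity
```

So the origin is asymptotically stable in Lyapunov's sense while the transient is enormous, which is exactly the point of the note below.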
Note : Examples 4.1.8 and 4.1.9 show that an equilibrium state may be stable
according to Lyapunov's definitions and yet the system's behaviour may be
unsatisfactory from a practical point of view. The converse situation is also possible,
and this has led to the coining of a definition of practical stability for systems which
are unstable in Lyapunov's sense but have an acceptable performance in practice,
namely that for pre-specified deviations from equilibrium the subsequent motions also
lie within specified limits.
ẋ = Ax,  x ∈ ℝ^m    (4.2)

where A ∈ ℝ^(m×m) and (4.2) may represent the closed or open loop system.
Provided the matrix A is nonsingular, the only equilibrium state of (4.2) is
the origin, so it is meaningful to refer to the stability of the system (4.2). If
the system is stable (at the origin) but not asymptotically stable we shall call
it neutrally stable.
One of the basic results on which the development of linear system stability
theory relies is now given. The proof will be omitted.
Note : (1) Suppose all the eigenvalues of A have nonpositive real parts. One
can prove that if all eigenvalues having zero real parts are distinct, then the origin is
neutrally stable.
(2) Also, if every eigenvalue of A has a positive real part, then the system is
completely unstable.
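The eigenvalue criteria of this section can be applied numerically. A sketch with illustrative matrices (these particular matrices are assumptions, not from the text):

```python
import numpy as np

# Stability via the real parts of the eigenvalues of A:
#   all real parts negative -> asymptotically stable;
#   any real part positive  -> unstable.
def classify(A):
    re = np.linalg.eigvals(A).real
    if np.all(re < 0):
        return "asymptotically stable"
    if np.any(re > 0):
        return "unstable"
    return "not asymptotically stable (marginal case)"

stable = classify(np.array([[0.0, 1.0], [-2.0, -1.0]]))   # damped oscillator
marginal = classify(np.array([[0.0, 1.0], [-2.0, 0.0]]))  # undamped spring
```

The marginal case is exactly where the note above applies: zero-real-part eigenvalues must be examined for multiplicity before concluding neutral stability.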
det(λI2 − A) = (λ − a)(λ − d)
Because of its practical importance the linear system stability problem has
attracted attention for a considerable time, an early study being by James C.
Maxwell (1831-1879) in connection with the governing of steam engines.
The original formulation of the problem was not of course in matrix terms,
the system model being
The first solutions giving necessary and sufficient conditions for all the roots of
a(λ) in (4.4) to have negative real parts were given by Augustin L. Cauchy
(1789-1857), Jacques C.F. Sturm (1803-1855), and Charles Hermite
(1822-1901).
4.2.4 Proposition. (The Routh Test) All the roots of the polynomial
a(λ) (with real coefficients) have negative real parts precisely when all the
entries in the first column of the Routh array are positive.
Note : The Hurwitz and Routh tests can be useful for determining stability
of (4.3) and (4.2) in certain cases. However, it should be noted that a practical
disadvantage of application to (4.2) is that it is very difficult to calculate accurately
the ai in (4.4). This is important because small errors in the ai can lead to large
errors in the roots of a(λ).
4.2.5 Example. Investigate the stability of the linear system whose
characteristic equation is

λ^4 + 2λ^3 + 9λ^2 + 4λ + 1 = 0.
So all the roots have negative real parts, and hence the linear system is
asymptotically stable.
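The Routh array for this example can be built mechanically. A minimal sketch that omits the zero-pivot special cases of the full test:

```python
def routh_first_column(coeffs):
    """First column of the Routh array for a polynomial given by its
    coefficients, highest degree first (no zero-pivot handling)."""
    row0 = list(coeffs[0::2])          # even-indexed coefficients
    row1 = list(coeffs[1::2])          # odd-indexed coefficients
    n = max(len(row0), len(row1))
    row0 += [0.0] * (n - len(row0))
    row1 += [0.0] * (n - len(row1))
    col = [row0[0], row1[0]]
    for _ in range(len(coeffs) - 2):
        new = [(row1[0] * row0[j + 1] - row0[0] * row1[j + 1]) / row1[0]
               for j in range(n - 1)] + [0.0]
        row0, row1 = row1, new
        col.append(row1[0])
    return col[:len(coeffs)]

# a(lambda) = lambda^4 + 2 lambda^3 + 9 lambda^2 + 4 lambda + 1
col = routh_first_column([1.0, 2.0, 9.0, 4.0, 1.0])
# all first-column entries positive -> all roots in the open left half-plane
```

The computed column is 1, 2, 7, 26/7, 1, all positive, agreeing with the conclusion of the example.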
Since we have assumed that the coefficients ai are real it is easy to derive
a simple necessary condition for asymptotic stability :
4.2.6 Proposition. If the coefficients ai in (4.4) are real and a(λ)
corresponds to an asymptotically stable system, then

ai > 0,  i = 1, 2, . . . , m.
Proof : Any complex roots of a(λ) will occur in conjugate pairs α ± iβ, the
corresponding factor of a(λ) being

(λ − α − iβ)(λ − α + iβ) = λ^2 − 2αλ + α^2 + β^2.

By Theorem 4.2.1, α < 0, and similarly any real factor of a(λ) can be
written (λ + γ) with γ > 0. Thus

a(λ) = ∏ (λ + γ) ∏ (λ^2 − 2αλ + α^2 + β^2)

and since all the coefficients above are positive, the ai must also all be positive. □
Note : Of course the condition above is not a sufficient condition, but it provides a
useful initial check : if any ai are negative or zero, then a(λ) cannot correspond to an
asymptotically stable system.
are straightforward.
The aim is to determine the stability nature of the equilibrium state (at
the origin) of the system (4.7) without obtaining the solution x(·). This of course
has been done algebraically for linear time-invariant systems in section 4.2.
The essential idea is to generalize the concept of energy V for a conservative
system in mechanics, where a well-known result states that an equilibrium
point is stable if the energy is a minimum. Thus V is a positive function whose
derivative V̇ is negative in the neighborhood of a stable equilibrium point. More
generally,
V̇ = (∂V/∂x1) F1 + (∂V/∂x2) F2 + · · · + (∂V/∂xm) Fm.
A Lyapunov function V for the system (4.7) is said to be weak if V̇ ≤ 0, and
strong if V̇ is negative definite.
A quadratic form Q is positive (negative) definite if Q(x) > 0 (Q(x) < 0) for all
x ≠ 0, and positive (negative) semi-definite if Q(x) ≥ 0 (Q(x) ≤ 0) for all x.
Otherwise, Q is indefinite.
The conclusion of this theorem is plausible, since the values of the strong
Lyapunov function V(x(t)) must continually diminish along each orbit x =
x(t) as t increases (since V̇ is negative definite). This means that the orbit
x = x(t) must cut across level sets V(x) = C with ever smaller values of C.
In fact, lim_(t→∞) V(x(t)) = 0, which implies that x(t) → 0 as t → ∞, since
x(t) and V(x) are continuous and V has value zero only at the origin (that's
where the positive definiteness of V comes into play).
For instance,

V = x1^2 + x2^2

is radially unbounded, but

V = x1^2/(1 + x1^2) + x2^2

is not since, for example,

V → 1 as x1 → ∞, x2 → 0.
A similar line of reasoning shows that if Ω is the set of points outside a bounded
region containing the origin, and if throughout Ω we have V > 0, V̇ ≤ 0, and V is
radially unbounded, then the origin is Lagrange stable.
z̈ + kz = 0    (4.8)

ẋ1 = x2
ẋ2 = −kx1.
Ė = kx1x2 − kx2x1 = 0

and, for the nonlinear spring,

Ė = −k(x1)x2 + k(x1)ẋ1 = 0.
So again by Lyapunov's Second Theorem the origin is stable for any
nonlinear spring satisfying the above conditions.
4.3.5 Example. Consider now the system of the previous example but
with a damping force dz added, so that the equation of motion is
z + dz + kz = 0. (4.9)
(Equation (4.9) can also be used to describe an LCR series circuit, motion of
a gyroscope, and many other problems.)
Assume first that both d and k are constant, and for simplicity let d =
1, k = 2. The system equations in state space form are

ẋ1 = x2
ẋ2 = −2x1 − x2.

Then

V̇ = −4x1^2 − 4x2^2.
ẋ1 = x2
ẋ2 = −x1 k(x1) − x2 d(x2).

So, if E is

E = (1/2) x2^2 + ∫0^(x1) σ k(σ) dσ

then

Ė = x2 (−x1 k(x1) − x2 d(x2)) + x1 k(x1) ẋ1 = −x2^2 d(x2) ≤ 0.
z̈ + ε(z^2 − 1)ż + z = 0    (4.10)

(Equation (4.10) is obtained by applying the feedback control

u = −z + ε(1 − z^2)ż

to the system z̈ = u.)
ẋ1 = x2
ẋ2 = −x1 − ε(x1^2 − 1)x2.

(The only equilibrium state of this system is the origin.) Try as a potential
Lyapunov function V = x1^2 + x2^2, which is obviously positive definite. Then

V̇ = 2x1ẋ1 + 2x2ẋ2
   = 2εx2^2(1 − x1^2).
Note : You may be tempted to think that the infinite strip S : x1^2 < 1 is a region
of asymptotic stability. This is not in fact true: the level sets V = C are circles, and
a trajectory can cut across them in the direction of decreasing V and still leave the
strip through the lines x1 = ±1, and hence lead to divergence.
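The computation of V̇ can be verified symbolically. A sketch, assuming the state-space form ẋ1 = x2, ẋ2 = −x1 − ε(x1^2 − 1)x2:

```python
import sympy as sp

# Symbolic check of the V-dot computation for V = x1**2 + x2**2, with the
# assumed state equations x1' = x2, x2' = -x1 - eps*(x1**2 - 1)*x2.
x1, x2, eps = sp.symbols('x1 x2 eps')
f1 = x2
f2 = -x1 - eps * (x1**2 - 1) * x2
Vdot = sp.expand(2 * x1 * f1 + 2 * x2 * f2)
```

The cross terms 2x1x2 cancel, leaving V̇ = 2εx2^2(1 − x1^2) as claimed.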
R : V(x) ≤ constant
ẋ3 = −x1.
Indeed,

ẋ1 = ż = −∫0^t z dτ − ∫0^t ε(z^2 − 1)ż dτ
       = x3 − ε((1/3)z^3 − z)
       = x3 − ε((1/3)x1^3 − x1).
Note : In all three of Lyapunov's theorems the terms positive and negative can
be interchanged simply by using −V instead of V. It is only the relative signs of the
Lyapunov function and its derivative which matter.
ẋ = Ax,  x ∈ ℝ^m.    (4.11)
V̇ = ẋ^T P x + x^T P ẋ
  = x^T A^T P x + x^T P A x
  = −x^T Q x
where
A^T P + P A = −Q    (4.13)
and it is easy to see that Q is also symmetric. If P and Q are both positive
definite, then by Lyapunovs First Theorem the (origin of) system (4.11)
is asymptotically stable. If Q is positive definite and P is negative definite or
indefinite, then in both cases V can take negative values in the neighborhood
of the origin so by Lyapunovs Third Theorem, (4.11) is unstable. We
have therefore proved :
Note : Equations similar in form to (4.13) also arise in other areas of control
theory. However, it must be admitted that, since a digital computer will be required
to solve (4.13) except for small values of m, so far as stability determination of (4.11)
is concerned it will be preferable instead to find the eigenvalues of A. The true value
and importance of Proposition 4.3.10 lies in its use as a theoretical tool.
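Equation (4.13) can nevertheless be solved numerically for P. A sketch using SciPy with an illustrative stability matrix; note that SciPy's solver uses the convention a X + X aᴴ = q, so Aᵀ and −Q are passed:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Solve A^T P + P A = -Q with Q = I for an illustrative stability matrix A.
A = np.array([[0.0, 1.0], [-2.0, -1.0]])
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)   # SciPy solves a X + X a^H = q
```

Since A is a stability matrix and Q is positive definite, the resulting P is symmetric positive definite, as the theory predicts.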
Linearization
Here A = DF(0) := (∂F/∂x)|_(x=0) ∈ ℝ^(m×m), g(0) = 0 ∈ ℝ^m, and the components
of g have power series expansions in x1 , x2 , . . . , xm beginning with terms of
at least second degree. The linear system
ẋ = Ax    (4.15)
V = x^T P x

where P satisfies

A^T P + P A = −Q

so that

V̇ = −x^T Q x + 2g^T P x.
Because of the nature of g, the term 2g^T P x has degree three at least, and so
for x sufficiently close to the origin, V̇ < 0.
f : ℝ → ℝ,  t ↦ −at^2 + bt^3  (with a > 0).

Show that, for t sufficiently close to the origin (i.e. for 0 < |t| < δ), we have f(t) < 0.
ẋ = F(x),  F(0) = 0
z̈ + aż + bz + g(z, ż) = 0

or

ẋ1 = x2
ẋ2 = −bx1 − ax2 − g(x1, x2).
The linear part of this system is asymptotically stable if and only if a > 0
and b > 0, so if g is any function of x1 and x2 satisfying the conditions of
Theorem 4.3.11, the origin of the system is also asymptotically stable.
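The condition on the linear part can be spot-checked numerically (an illustration, not a proof):

```python
import numpy as np

# Linear part of the system above: A = [[0, 1], [-b, -a]].
# Its eigenvalues have negative real parts precisely when a > 0 and b > 0.
def linear_part_stable(a, b):
    A = np.array([[0.0, 1.0], [-b, -a]])
    return bool(np.all(np.linalg.eigvals(A).real < 0))
```

For example, (a, b) = (1, 2) gives a stable linear part, while flipping the sign of either parameter destroys stability.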
Input-output stability
y = h(t, x, u)
||u(t)|| < L1,  t ≥ t0
where L1 is any positive constant, then there exists a number L2 > 0 such
that
||y(t)|| < L2,  t ≥ t0
regardless of initial state x(t0 ). The problem of studying b.i.b.o. stability for
nonlinear systems is a difficult one, but we can give some results for the usual
linear time-invariant system
ẋ = Ax + Bu(t)
(4.16)
y = Cx.
||exp(tA)|| ≤ K e^(−at),  t ≥ 0

for some constants K, a > 0. (The result remains valid for any stability matrix, but
the proof is more difficult.)
ẋ = Ax
Proof : Using

x(t) = exp(tA) [ x0 + ∫0^t exp(−τA) B u(τ) dτ ]
showing that the output is bounded, since ||C|| and ||B|| are finite. □
ẋ = Ax
is asymptotically stable.
Note : For linear time-varying systems Proposition 4.4.2 is not true, unless for
all t the norms of B(t) and C(t) are bounded and the norm of the state transition
matrix Φ(t, t0) is bounded and tends to zero as t → ∞ independently of t0.
ẋ = Ax + Bu(t)

V̇ = ẋ^T (∇V) = (Ax + Bu)^T (∇V)
  = x^T A^T (∇V) + u^T B^T (∇V)
  = −x^T Q x + u^T B^T (∇V)

where

A^T P + P A = −Q

and

∇V := ( ∂V/∂x1  ∂V/∂x2  · · ·  ∂V/∂xm )^T.
Linear feedback
Consider again the linear system (4.16). If the open loop system is unstable
(for instance, by Theorem 4.2.1, if one or more of the eigenvalues of A has
a positive real part), then an essential practical objective would be to apply
control so as to stabilize the system; that is, to make the closed loop system
asymptotically stable.
If (4.16) is c.c., then we saw (Theorem 3.1.3) that stabilization can always
be achieved by linear feedback u = Kx, since there are infinitely many matrices
K which will make A + BK a stability matrix.
If the pair (A, B) is not c.c., then we can define the weaker property that
(A, B) is stabilizable if (and only if) there exists a constant matrix K such
that A + BK is asymptotically stable.
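Stabilization by u = Kx can be illustrated on a small hypothetical pair (A, B); the gain below is found by matching a chosen characteristic polynomial by hand:

```python
import numpy as np

# Hypothetical unstable open loop (not from the text).
A = np.array([[0.0, 1.0], [2.0, 1.0]])
B = np.array([[0.0], [1.0]])
# Closed loop A + B K with K = [k1, k2] has characteristic polynomial
#   s^2 - (1 + k2) s - (2 + k1).
# Choosing poles -1 and -2, i.e. s^2 + 3 s + 2, gives k1 = -4, k2 = -4.
K = np.array([[-4.0, -4.0]])
eigs = np.linalg.eigvals(A + B @ K)
```

The closed-loop eigenvalues are −1 and −2, so A + BK is a stability matrix.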
in Example 3.4.4. The eigenvalues of the uncontrollable part are the roots
of the polynomial

p(λ) = λ^2 − 7λ − 23
y = cx
Note : It can be proved that the pair (A, B) is c.c. if and only if

rank [ λI_m − A   B ] = m

for all eigenvalues λ of A. (The eigenvalues λ at which the rank drops below m
are the so-called uncontrollable modes.) Clearly, if the pair (A, B) is c.c., then it is
stabilizable.
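The rank test can be implemented directly. A sketch, with an illustrative pair (A, B) having one uncontrollable mode:

```python
import numpy as np

# Rank test: (A, B) is c.c. iff rank [lambda*I - A  B] = m at every eigenvalue.
def uncontrollable_modes(A, B, tol=1e-9):
    m = A.shape[0]
    bad = []
    for lam in np.linalg.eigvals(A):
        M = np.hstack([lam * np.eye(m) - A, B])
        if np.linalg.matrix_rank(M, tol) < m:
            bad.append(lam)
    return bad

A = np.array([[0.0, 0.0], [0.0, -1.0]])   # illustrative pair:
B = np.array([[0.0], [1.0]])              # the mode at 0 is not influenced by u
modes = uncontrollable_modes(A, B)
```

Here the eigenvalue 0 is an uncontrollable mode (and it is not in the open left half-plane, so this pair is not stabilizable either).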
Similarly, the pair (A, C) is c.o. if and only if

rank [ λI_m − A^T   C^T ] = m

for all eigenvalues λ of A. (The eigenvalues λ at which the rank drops below m
are the so-called unobservable modes.) Clearly, if the pair (A, C) is c.o., then it is
detectable.
ẋ = Ax + Bu
y = Cx
Application
ẋ = Ax

A^T P + P A = −Q
then

V̇/V = −(x^T Q x)/(x^T P x) ≤ −μ

where μ is the minimum value of the ratio (x^T Q x)/(x^T P x) (in fact, this is equal
to the smallest eigenvalue of QP^(−1)). Integrating with respect to t gives

V(x(t)) ≤ e^(−μt) V(x(0)).
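The decay-rate estimate can be checked numerically. A sketch with an illustrative pair (A, Q):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Decay-rate sketch: with A^T P + P A = -Q, the quotient x^T Q x / x^T P x
# is bounded below by mu, the smallest eigenvalue of Q P^{-1}, so that
# V(x(t)) <= exp(-mu t) V(x(0)).  (Matrices are illustrative.)
A = np.array([[0.0, 1.0], [-2.0, -1.0]])
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)   # SciPy solves a X + X a^H = q
mu = np.linalg.eigvals(Q @ np.linalg.inv(P)).real.min()

# Compare mu with the Rayleigh quotient at random points.
rng = np.random.default_rng(0)
xs = rng.standard_normal((100, 2))
ratios = (np.einsum('ij,jk,ik->i', xs, Q, xs)
          / np.einsum('ij,jk,ik->i', xs, P, xs))
```

Every sampled quotient stays above μ, consistent with the bound used in the integration step.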
u = (S − Q1) B^T P x    (4.17)

A^T P + P A = −Q

ẋ = [ A + B(S − Q1) B^T P ] x    (4.18)
in the sense that trajectories will approach the origin more quickly.
4.5 Exercises
Exercise 68 Determine the equilibrium point (other than the origin) of the system
described by

ẋ1 = x1 − 2x1x2
ẋ2 = −2x2 + x1x2.

Apply a transformation of coordinates which moves this point to the origin, and
find the new system equations. (The equations are an example of a predator-prey
population model due to Vito Volterra (1860-1940) and used in biology, and are
more general than the simple linear rabbit-fox model.)
is asymptotically stable.
ẋ = A(t)x

diverges as t → ∞.
z̈ + ż + z^3 = 0

using

V = x1^4 + 2x2^2.

Hence investigate the stability nature of the equilibrium point at the origin.
ẋ2 = −x1 − x2 + (x1 + 2x2)(x2^2 − 1)

is asymptotically stable by considering the region |x2| < 1. State the region of
asymptotic stability thus determined.
ẋ2 = −2x1 − x2

can be written as

P = ∫0^∞ exp(tA^T) Q exp(tA) dt.
z̈ + a1ż + a2z = 0

into state space form. Using V = x^T P x with V̇ = −x2^2, obtain the necessary
and sufficient conditions a1 > 0, a2 > 0 for asymptotic stability.
ẋ2 = −kx1 − 2x2

when k = 1. Using the same function V, obtain sufficient conditions on k for the
system to be asymptotically stable.
Exercise 80 Investigate the stability nature of the equilibrium state at the origin
for the system

ẋ1 = −7x1 + 2 sin x2 − x2^4
ẋ2 = x2 e^(x1) − 3x1 + 5x2.
(a)

ẋ1 = −x1 + x2
ẋ2 = (x1 + x2) sin x1 − 3x2.

(b)

ẋ1 = −x1 + x2^3
ẋ2 = −ax1 − bx2;  a, b > 0.
is stable i.s.L. and b.i.b.o. stable, but not asymptotically stable. [It is easy to verify
that this system is not completely controllable.]
ẋ2 = x1 − x2 − x1^2.
Determine the equilibrium state which is not at the origin. Transform the system
equations so that this point is transferred to the origin, and hence verify that this
equilibrium state is unstable.
Exercise 85 Determine for what range of values of the real parameter k the linear
system

      [  0    1      0   ]
ẋ =   [  0    0      1   ] x
      [ −5   −k   k − 6  ]

is asymptotically stable.
Exercise 86 A particle moves in the xy-plane so that its position at any time t is
given by

ẍ + y + 3x = 0,  ÿ + αx + 3y = 0.

Determine the stability nature of the system if (a) α = 4 and (b) α = 16.
V = 2x1^2 + x2^2

ẋ2 = −2x1.
ẋ2 = −x1 − 2x2^3 + μx2

is asymptotically stable provided the real parameter μ is negative. Use the Lyapunov
function

V = (1/2)(x1^2 + 3x2^2)

to determine a region of asymptotic stability about the origin.
(a) If β = 0, show that the system is b.i.b.o. stable for any output which is
linear in the state variables, provided α > 1.
(b) If α = β = 1 and u(·) = 0, investigate the stability nature of the
equilibrium state at the origin by using the Lyapunov function
(b)

      [ 0    0 ]       [ 0 ]
ẋ =   [ 0   −1 ] x  +  [ 1 ] u(t)

y = [ 1  1 ] x.