
16-299 Spring 2021:

Nonlinear Stability Tests

George Kantor

3 May 2021

1 Equilibrium Points/Stability

Recall that a point x_e is an equilibrium point if ẋ is zero at x_e, in other words

f(x_e, 0) = 0

For the pendulum example, with state x = [θ, θ̇]^T, there is an equilibrium point at

x = [0, 0]^T

which corresponds to the pendulum hanging straight down. There is an equilibrium point at

x = [π, 0]^T

which corresponds to the pendulum standing straight up. There are actually an infinite number of equilibrium points, at

x = [nπ, 0]^T

for any integer n.
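To make this concrete, here is a minimal numerical sketch. The dynamics below (with a damping term −γx2) match the linearizations used later in these notes; the specific values of g, ℓ, and γ are arbitrary illustrations, not from the notes.

```python
# Numerically check that x = [n*pi, 0]^T are equilibrium points of the pendulum.
# Assumed dynamics: x1 = theta, x2 = theta_dot,
#   x1_dot = x2,   x2_dot = -(g/l)*sin(x1) - gamma*x2
# Parameter values are illustrative only.
import numpy as np

g, l, gamma = 9.81, 1.0, 0.1

def f(x):
    return np.array([x[1], -(g / l) * np.sin(x[0]) - gamma * x[1]])

for n in range(-2, 3):
    xe = np.array([n * np.pi, 0.0])
    print(f"n = {n:2d}:  f(xe) ~ 0?  {np.allclose(f(xe), 0.0, atol=1e-12)}")
```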
Stability: From experience, we know that the equilibrium point at the top is unstable and the equilibrium point at the bottom is stable. If I add a little friction, the equilibrium point at the bottom becomes asymptotically stable.
Let's quickly revisit our stability definitions. The equilibrium point x_e is said to be
stable: if for every ε > 0 there exists δ(ε) > 0 so that if ‖x(0) − x_e‖ < δ then ‖x(t) − x_e‖ < ε for all t > 0. In words, "if you start close, you stay close."
asymptotically stable: if there exists an open D ⊂ ℝⁿ with x_e ∈ D so that x(t) → x_e as t → ∞ for all initial conditions x(0) in D. (Alternatively, we could say that there exists a δ > 0 so that if ‖x(0) − x_e‖ < δ then x(t) → x_e as t → ∞.) In words, "if you start close, you converge."
unstable: if it is not stable. In words: "There are places arbitrarily close to the equilibrium point such that, if you start there, you are driven away from it."
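These properties can of course be checked by brute force, by simulating the system. Here is a rough sketch for the damped pendulum near the bottom equilibrium; the dynamics and all numerical values are illustrative assumptions, not from the notes.

```python
# Simulate the damped pendulum near the bottom equilibrium and watch the
# distance from the equilibrium: it stays small ("stable") and shrinks
# toward zero ("asymptotically stable"). Parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

g, l, gamma = 9.81, 1.0, 0.5
xe = np.array([0.0, 0.0])                      # bottom equilibrium

def f(t, x):
    return [x[1], -(g / l) * np.sin(x[0]) - gamma * x[1]]

x0 = xe + np.array([0.1, 0.0])                 # start close to xe
sol = solve_ivp(f, (0.0, 30.0), x0, max_step=0.01)
dist = np.linalg.norm(sol.y - xe[:, None], axis=0)
print("max   ||x(t) - xe|| =", dist.max())     # stays close
print("final ||x(t) - xe|| =", dist[-1])       # converges toward 0
```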
Like before, we'd like a test to determine whether the system is stable without having to integrate the nonlinear ODE. We'll learn two tests in this class; here is the first one:
2 Stability Test #1

Theorem: nonlinear stability test #1


(also known as “Lyapunov’s First Method” or “Lyapunov’s Indirect Method”)
Let ẋ = f(x, u) be a nonlinear system with an equilibrium point at x_e and let
ż = Az + Bu
be the linearization about xe . Let Λ = {λ1 , λ2 , . . . , λn } be the eigenvalues of A. Then

1. the system is asymptotically stable if Re(λi ) < 0 for all i = 1, 2, . . . , n.


2. the system is unstable if Re(λi ) > 0 for some i.

Note that if Re(λi) ≤ 0 for all i, with Re(λi) = 0 for at least one i, then the test is inconclusive.
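As a quick sanity check of the bookkeeping, here is a small sketch of the decision logic. The matrix A is assumed to be the linearization (obtained by hand or otherwise), and the tolerance is an arbitrary choice of mine.

```python
# Stability test #1: classify an equilibrium from the eigenvalues of the
# linearization A. A small tolerance guards against round-off when an
# eigenvalue sits on the imaginary axis.
import numpy as np

def stability_test_1(A, tol=1e-9):
    re = np.linalg.eigvals(A).real
    if np.all(re < -tol):
        return "asymptotically stable"
    if np.any(re > tol):
        return "unstable"
    return "inconclusive"    # some eigenvalue has (numerically) zero real part
```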
Pendulum example, bottom equilibrium point:

A = [ 0, 1; −g/ℓ, −γ ]

and the eigenvalues of A are

Λ = roots( det [ λ, −1; g/ℓ, λ + γ ] ) = roots( λ² + γλ + g/ℓ )

which gives

λ = ( −γ ± √(γ² − 4g/ℓ) ) / 2

Note that for positive g and ℓ, either γ² − 4g/ℓ < 0 (so the square root is imaginary and Re(λ) = −γ/2), or 0 ≤ √(γ² − 4g/ℓ) < |γ|; in either case the real part of λ has the opposite sign of γ. If γ > 0, then the system is asymptotically stable. If γ < 0 (i.e., negative damping), then the system is unstable.
Note that in the case where γ = 0, we get

λ = ±i √(g/ℓ).

Both eigenvalues have zero real part, so the test is inconclusive.
Now look at the top equilibrium point:

A = [ 0, 1; g/ℓ, −γ ]

and let's consider the case when γ = 0 to keep the math simple. Then

Λ = roots( det [ λ, −1; −g/ℓ, λ ] ) = roots( λ² − g/ℓ )

so

λ = ±√(g/ℓ)

Assuming g > 0 and ℓ > 0, one of these eigenvalues must be positive, so the system is unstable.
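A quick numerical check of both linearizations, with illustrative values for g, ℓ, and γ (these are not specified in the notes):

```python
# Eigenvalues of the pendulum linearizations at the bottom and top equilibria.
import numpy as np

g, l = 9.81, 1.0

A_bottom_damped   = np.array([[0.0, 1.0], [-g / l, -0.5]])   # gamma = 0.5
A_bottom_undamped = np.array([[0.0, 1.0], [-g / l,  0.0]])   # gamma = 0
A_top_undamped    = np.array([[0.0, 1.0], [ g / l,  0.0]])   # gamma = 0

print(np.linalg.eigvals(A_bottom_damped))    # Re < 0 for both -> asymptotically stable
print(np.linalg.eigvals(A_bottom_undamped))  # +-i*sqrt(g/l)   -> test inconclusive
print(np.linalg.eigvals(A_top_undamped))     # +sqrt(g/l) > 0  -> unstable
```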

3 Stability Test #2

Recall that stability test #1 (linearize and check eigenvalues) does not always work, so we introduce a second way
to test nonlinear systems for stability. This method is called “Lyapunov’s Second Method” or “Lyapunov’s Direct
Method" or just "Lyapunov's Method". First, we assume we have an unforced nonlinear state equation with an equilibrium point at zero:

ẋ = f (x), f (0) = 0.

Lyapunov Function Candidate: A Lyapunov function candidate is any differentiable function V : ℝⁿ → ℝ that satisfies

1. V(0) = 0.
2. V(x) > 0 for all x ∈ D, x ≠ 0

where D is an open subset of ℝⁿ that contains the point x = 0. We often call such a function a "Lyapunov candidate on D". Note that D can be all of ℝⁿ. In plain English, near x = 0, a Lyapunov function candidate is a "bowl-shaped" function with a unique minimum at x = 0.
Time Derivative of V(x): We can use the chain rule to calculate how V changes with time:

V̇(x) = dV(x)/dt = (∂V/∂x) ẋ = (∂V/∂x) f(x),

which can be written more explicitly as

V̇ = [ ∂V/∂x1, ∂V/∂x2, …, ∂V/∂xn ] [ f1(x), f2(x), …, fn(x) ]^T

which can also be written as

V̇ = Σ_{i=1}^{n} (∂V/∂xi) fi(x)
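Here is a small symbolic sketch of this computation using sympy; the particular V and f below are placeholders chosen only to illustrate the mechanics, not examples from the notes.

```python
# Compute V_dot = (dV/dx) f(x) = sum_i (dV/dx_i) f_i(x) symbolically.
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
x = sp.Matrix([x1, x2])

V = x1**2 + x2**2                 # a Lyapunov candidate: V(0) = 0, V > 0 otherwise
f = sp.Matrix([x2, -x1 - x2])     # example dynamics x_dot = f(x)

V_dot = (sp.Matrix([V]).jacobian(x) * f)[0]
print(sp.simplify(V_dot))         # -> -2*x2**2, which is <= 0, so x = 0 is stable
```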

Theorem: Stability Test #2 (aka Lyapunov’s Direct or Lyapunov’s Second Method)


Let V : ℝⁿ → ℝ be a Lyapunov function candidate on D ⊂ ℝⁿ, where D is an open set that contains the point x = 0.
The following are true:

• if V̇ (x) ≤ 0 for all x ∈ D then x = 0 is a stable equilibrium point.


• if V̇(x) < 0 for all x ∈ D, x ≠ 0, then x = 0 is an asymptotically stable equilibrium point.

Proof (stability):
(I usually spare you the proofs, but this one is particularly intuitive, so I think it is worth going over. The math may look scary, but keep a picture of the nested regions B_r ⊃ Ω_β ⊃ B_δ constructed below in mind as you follow along.)

Start by recalling the definition of stability: for every ε > 0 there exists a δ > 0 so that if ‖x(0)‖ < δ then ‖x(t)‖ < ε for all t > 0. In order to prove stability, we need to find a δ and show that if x starts within δ of zero, then it stays within ε of zero (for any ε). Here we go:
Assume the conditions for stability from the theorem are met (i.e., V is a Lyapunov function candidate on D and V̇(x) ≤ 0 on D).
Let ε > 0.
Choose r > 0 with r ≤ ε so that B_r ⊂ D, where

B_r = {x ∈ ℝⁿ | ‖x‖ ≤ r}

In English, choose r so that a (closed) ball of radius r is completely contained in D.


Let

α = min_{‖x‖ = r} V(x)

We know that α > 0 since V(x) > 0 everywhere in D except x = 0, and the sphere ‖x‖ = r does not contain 0.


Choose 0 < β < α and define

Ω_β = {x ∈ B_r | V(x) < β}

Note that if x starts in Ω_β, then it stays within Ω_β for all time. (Here's why this is true: if x starts in Ω_β then V(x(0)) < β, and since V̇(x) ≤ 0, V can never grow, so it never exceeds β. In particular, x can never reach the sphere ‖x‖ = r, where V ≥ α > β, so x stays inside B_r with V(x) < β, i.e., in Ω_β.)
Now let

δ = min { ‖x‖ : x ∈ B_r, V(x) = β }

With this choice of δ we see that

‖x(0)‖ < δ  ⇒  V(x(0)) < β
            ⇒  x(t) never leaves Ω_β
            ⇒  x(t) never leaves B_r
            ⇒  ‖x(t)‖ < r ≤ ε for all t > 0
so the equilibrium point x = 0 is stable (and we’re done with the proof).
Asymptotic Stability: The asymptotic stability part of the proof is slightly more complicated and we will not cover it in this class. The basic idea, though, is that when V̇ is strictly less than zero, V is always decreasing, which means that x has to stay within a set that looks like the Ω_β above but shrinks with time. The crux of the proof is to show that V(x(t)) → 0 rather than converging to some nonzero value.
Example: Pendulum Revisited
Recall from last time that the first stability test failed to produce a result for the case of the "straight down" equilibrium point because the eigenvalues of the linearization were both on the imaginary axis. We know intuitively that this equilibrium point should be stable; now we will try to prove it using stability test #2.
Recall that the unforced equations of motion are

ẋ = [ x2, −(g/ℓ) sin x1 ]^T

where x = [ θ, θ̇ ]^T.

Energy is a natural choice for a Lyapunov candidate function: we know that it is non-negative, and we can define the potential energy so that the energy at the bottom equilibrium point is zero. The kinetic energy of the pendulum is

K = (1/2) m (ℓθ̇)²

and the potential energy is

P = mg(ℓ − ℓ cos θ)

So we can define a Lyapunov function candidate:

V(x) = K + P = (1/2) m ℓ² x2² + mgℓ(1 − cos x1)
Computing V̇:

V̇ = [ ∂V/∂x1, ∂V/∂x2 ] [ f1(x), f2(x) ]^T
   = [ mgℓ sin x1, mℓ² x2 ] [ x2, −(g/ℓ) sin x1 ]^T
   = mgℓ x2 sin x1 − mgℓ x2 sin x1
   = 0

So V̇ ≤ 0, which means we can conclude that the bottom equilibrium point is stable!
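The same computation can be checked symbolically. This sketch uses sympy, treating m, g, and ℓ as positive symbols:

```python
# Verify that V_dot = 0 for the pendulum energy along the unforced dynamics.
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
m, g, l = sp.symbols("m g l", positive=True)
x = sp.Matrix([x1, x2])

V = sp.Rational(1, 2) * m * l**2 * x2**2 + m * g * l * (1 - sp.cos(x1))
f = sp.Matrix([x2, -(g / l) * sp.sin(x1)])     # unforced pendulum dynamics

V_dot = (sp.Matrix([V]).jacobian(x) * f)[0]
print(sp.simplify(V_dot))                      # -> 0, matching the hand computation
```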

4 Nonlinear Stability Summary

We have two tests for stability:

test #1: linearize and look at eigenvalues of resulting A matrix:


• Re(λi ) < 0 for all i ⇒ asymptotically stable
• Re(λi ) > 0 for some i ⇒ unstable
Note that you can never conclude (marginal, non-asymptotic) stability from test #1; if some eigenvalue has zero real part and none has positive real part, the test is inconclusive.

test #2: find Lyapunov function V (x):


• V̇(x) < 0 for all x ≠ 0 in a neighborhood of x = 0 ⇒ asymptotically stable
• V̇(x) ≤ 0 for all x in a neighborhood of x = 0 ⇒ stable

Note that you can never conclude instability from test #2.
