
Least-Squares Methods For System Identification


Soft Computing : Least-Squares Estimators
9/18/2001

Least-Squares Methods for System Identification
(Chapter 5)

Bill Cheetham, Kai Goebel
GE Corporate Research & Development
cheetham@cs.rpi.edu
goebel@cs.rpi.edu

System Identification

The problem of determining a mathematical model for an unknown system by observing its input-output data pairs is generally referred to as system identification.

The purposes of system identification are
- to predict a system's behavior,
- to explain the interactions and relationships between inputs and outputs, and
- to design a controller or simulation of the system.


Why cover System Identification

- It is a well-established and easy-to-use technique for modeling a real-life system.
- It will be needed for the section on fuzzy-neural networks.
- There will be a homework assignment that covers it.

Linear Regression

A statistical method of fitting data to a linear model (also called a Least-Squares Estimator).

[Figure: data points scattered around the fitted line y = x - 3; y axis from -3 to +3]
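As a rough illustration of the idea (our sketch, not from the slides; the data is made up), MATLAB's polyfit recovers a line like y = x - 3 from noisy samples:

% Sketch: least-squares line fit to noisy samples of y = x - 3
x = (0:0.5:6)';                    % hypothetical input values
y = x - 3 + 0.2*randn(size(x));    % noisy observations of the true line
p = polyfit(x, y, 1);              % degree-1 least-squares fit: p = [slope intercept]
% p(1) comes out near 1 and p(2) near -3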


Spring Data

Experiment   Force (newtons)   Length (inches)
    1              1.1               1.5
    2              1.9               2.1
    3              3.2               2.5
    4              4.4               3.3
    5              5.9               4.1
    6              7.4               4.6
    7              9.2               5.0

What will the length be when the force is 5.0 newtons?

Components of System Identification

Step 1: Structure Identification
- Determine the class of models within which the search for the most suitable model is conducted: y = f(u; θ), where u is the input vector and θ is the parameter vector.
- Example (the two candidates are compared in the sketch below):
  Linear: y = θ0 + θ1·u1
  Second-order polynomial: y = θ0 + θ1·u1 + θ2·u1²

Step 2: Parameter Identification
- Apply an optimization technique to determine the parameters.
- Example: a linear model with θ0 = -3 and θ1 = 1, i.e. y = -3 + u1.
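The structure choice can be tested mechanically. A minimal sketch using MATLAB's polyfit on the spring data above (the linear/quadratic comparison is our illustration, not on the slide):

% Structure identification sketch: try two candidate model classes
u = [1.1 1.9 3.2 4.4 5.9 7.4 9.2]';    % force (newtons)
y = [1.5 2.1 2.5 3.3 4.1 4.6 5.0]';    % length (inches)
p1 = polyfit(u, y, 1);                  % candidate 1: linear
p2 = polyfit(u, y, 2);                  % candidate 2: second-order polynomial
sse1 = sum((y - polyval(p1, u)).^2);    % sum of squared errors for each class
sse2 = sum((y - polyval(p2, u)).^2);
% the class with acceptable error and the fewest parameters is preferred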


Parameter Identification

[Figure: block diagram. Training data drives both the Target System to be Identified, with output yi, and the Mathematical Model f(u; θ), with output ŷi; the difference yi - ŷi feeds the Identification Techniques block, whose feedback updates the model.]

- Training data is used for both system and model.
- The difference between the Target System's output, yi, and the Mathematical Model's output, ŷi, is used to update the parameter vector, θ.

System Identification Process

1. Specify the class of models representing the system.
2. Optimize the parameters.
3. Conduct validation tests. Are they satisfactory?
   - no: select another model and repeat.
   - yes: done.

Structure and parameter identification may need to be done repeatedly.


Least-squares parameter optimization

We will restrict the least-squares discussion to:
- linear models, i.e. models whose parameters enter linearly, e.g. y = θ1·u1 + θ2·u1²
- static (memory-less) systems:
  output depends on current inputs only;
  output does not depend on history.

Least-squares: Error

Least squares heavily weights the error for data points that are far from the expected value.

[Figure: seven data points around an 'Expected' line; y axis from -3 to +3]

Error:    0    0   +1    0   -2   +3    0
Error²:   0    0    1    0    4    9    0

Sum of Squared Error = (0 + 0 + 1 + 0 + 4 + 9 + 0) = 14
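The sum checks out directly (a one-liner sketch):

e = [0 0 1 0 -2 3 0];    % point-wise errors read off the plot
sse = sum(e.^2);         % squared errors 0,0,1,0,4,9,0 add up to 14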


Least-squares: Matrix method

- To uniquely identify the unknown vector θ, it is necessary that m (# of training items) >= n (# of parameters).
- The training data is {(ui; yi), i = 1, ..., m}.
- If m = n, then we can solve for θ using θ = A⁻¹·y.
- If m = n = 2, with f1(u) = u⁰ = 1 and f2(u) = u¹ = u, then

  A = [f1(u1) f2(u1); f1(u2) f2(u2)] = [1 u1; 1 u2],   y = [y1; y2]

Least-squares: Matrix method (2)

Example: Find a line through the two points (1.1, 1.5) and (1.9, 2.1).

  A = [1 1.1; 1 1.9],   y = [1.5; 2.1]

  A⁻¹ = [2.375 -1.375; -1.25 1.25]

  θ = A⁻¹·y = [2.375 -1.375; -1.25 1.25]·[1.5; 2.1] = [0.675; 0.75]

  y = 0.675 + 0.75·x


Least-squares: Matlab

>> A = [1 1.1; 1 1.9]
A =
    1.0000    1.1000
    1.0000    1.9000

>> y = [1.5; 2.1]
y =
    1.5000
    2.1000

>> inv(A) * y
ans =
    0.6750
    0.7500

Least-squares: m > n

- When m > n there are more data pairs than fitting parameters.
- An exact solution satisfying all m equations is not always possible.
- To handle this we incorporate an error vector e:

  A·θ + e = y

  [f1(u1) ... fn(u1); ...; f1(um) ... fn(um)]·[θ1; ...; θn] + [e1; ...; em] = [y1; ...; ym]


Least-squares: m > n (2)

- The best set of parameters, θ̂, is the one that minimizes the sum of the squared values of e.
- Theorem 5.1 states that the error is minimized when

  θ̂ = (Aᵀ·A)⁻¹·Aᵀ·y

- Proof on page 106 of Jang.
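For reference, this is the standard normal-equations result; a condensed derivation (ours, not from the slide) in the same symbols:

$$
E(\theta) = (y - A\theta)^T (y - A\theta), \qquad
\frac{\partial E}{\partial \theta} = -2\,A^T (y - A\theta) = 0
\;\Rightarrow\; A^T A\,\hat{\theta} = A^T y
\;\Rightarrow\; \hat{\theta} = (A^T A)^{-1} A^T y,
$$

assuming AᵀA is invertible, which is why m >= n was required on the earlier slide.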

Spring Example

- Structure identification can be done using domain knowledge.
- The change in length of a spring is proportional to the force applied (Hooke's law):

  length = k0 + k1·force


Spring Example

A·θ + e = y:

  [1 1.1; 1 1.9; 1 3.2; 1 4.4; 1 5.9; 1 7.4; 1 9.2]·[k0; k1] + [e1; e2; e3; e4; e5; e6; e7] = [1.5; 2.1; 2.5; 3.3; 4.1; 4.6; 5.0]

  θ̂ = (Aᵀ·A)⁻¹·Aᵀ·y = [1.20; 0.44]

When force = 5, length = 1.2 + 5 · 0.44 = 3.4.

Example: Spring data plot

[Figure: the seven spring data points with the fitted line length = 1.2 + 0.44·force]
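The numbers are easy to verify in MATLAB (a sketch in the style of the earlier Matlab slide; the backslash operator A\y computes the same least-squares solution):

% Verify the spring fit
A = [ones(7,1) [1.1 1.9 3.2 4.4 5.9 7.4 9.2]'];   % columns: constant, force
y = [1.5 2.1 2.5 3.3 4.1 4.6 5.0]';               % measured lengths
theta = inv(A'*A) * A' * y;     % Theorem 5.1 formula; approx [1.20; 0.44]
len5  = [1 5] * theta;          % predicted length at force = 5, approx 3.4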


Regression is easy...

- Exhaust gas temperature sampled every flight (a typical day has about 5 flights).
- Regression over 10 points:

[Figure: exhaust gas temperature vs. flight number (0 to 1800) with a fitted trend line]

or maybe it isn't?

- Same regression performed for the last 10 points:

[Figure: the same data with a line fitted to the last 10 points only, showing a different trend]
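The sensitivity to the fitting window can be reproduced with made-up numbers (the real EGT data is not in the slides; this is purely a sketch):

% Sketch: trend over all points vs. over the last 10 points
t = (1:100)';                                       % hypothetical flight index
egt = 400 + 0.05*t + 5*randn(100,1);                % hypothetical EGT: slow drift plus noise
pAll  = polyfit(t, egt, 1);                         % line fitted to all points
pLast = polyfit(t(end-9:end), egt(end-9:end), 1);   % line fitted to the last 10
% with only 10 noisy points, the local slope can differ wildly from the
% long-run trend, and can even change sign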


Memory models

- Output depends on history.
- N-day running average (GE stock):

[Figure: schematic data points around an 'Expected' line; y axis from -3 to +3]

Memory models

- f(t) = D·f(t - 1) + (1 - D)·x(t)
- Yellow line is smoothed with D = .5.
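The recursion transcribes directly; a sketch with the slide's D = .5 on a made-up series:

% Sketch: first-order exponential smoothing, f(t) = D*f(t-1) + (1-D)*x(t)
D = 0.5;                            % smoothing factor from the slide
x = cumsum(randn(100,1));           % hypothetical raw series (e.g. a stock price)
f = zeros(size(x));
f(1) = x(1);                        % seed the recursion with the first sample
for t = 2:length(x)
    f(t) = D*f(t-1) + (1-D)*x(t);   % the output depends on history
end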
