Given data points $(x_j, y_j)$, $j = 1, \dots, n$, and a functional form $f(x; \vec{\beta})$, find parameters $\vec{\beta}$ such that $f$ "models" the data.
Interpolation:
$$f(x_j; \vec{\beta}) = y_j$$
Approximation:
$$f(x_j; \vec{\beta}) + \varepsilon_j = y_j$$
where $\varepsilon_j$ is "noise", $E(\varepsilon_j) = 0$.
Least squares approximation

$E(\varepsilon_j) = 0$, $E(\varepsilon_j^2) = \sigma_j^2$, $\sigma_j \approx \mathrm{const}$

$$\chi^2(\vec{\beta}) = \sum_{j=1}^{n} \left[ y_j - f(x_j; \vec{\beta}) \right]^2 \Rightarrow \min$$
Weighted least squares

$E(\varepsilon_j) = 0$, $E(\varepsilon_j^2) = \sigma_j^2$, where the $\sigma_j$ need not be equal: each residual is scaled by its noise level,

$$\chi^2(\vec{\beta}) = \sum_{j=1}^{n} \left[ \frac{y_j - f(x_j; \vec{\beta})}{\sigma_j} \right]^2 \Rightarrow \min$$
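Scaling row $j$ of the design matrix and of $y$ by $1/\sigma_j$ reduces the weighted problem to ordinary least squares. A minimal numpy sketch (the straight-line model, data, and $\sigma_j$ values below are illustrative, not from the slides):

```python
import numpy as np

# Weighted least squares for a straight-line model f(x; beta) = b1 + b2*x.
# Dividing row j of the design matrix and y_j by sigma_j turns the
# weighted problem into an ordinary least-squares problem.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.1, 4.9, 7.0])       # roughly y = 1 + 2x (made-up data)
sigma = np.array([0.1, 0.1, 0.5, 0.5])   # per-point noise levels (assumed)

A = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(A / sigma[:, None], y / sigma, rcond=None)
print(beta)  # close to [1, 2]; the small-sigma points dominate the fit
```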
Least absolute deviations

$$\xi(\vec{\beta}) = \sum_{j=1}^{n} \left| y_j - f(x_j; \vec{\beta}) \right| \Rightarrow \min$$
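A quick illustration of how least absolute deviations differs from least squares (my own example, not from the slides): for the constant model $f(x; \beta) = \beta$, the least-squares minimizer is the mean of the data, while the least-absolute-deviations minimizer is the median, which is robust to outliers.

```python
import numpy as np

# Constant model f(x; beta) = beta fitted to data with one outlier.
y = np.array([1.0, 2.0, 3.0, 4.0, 100.0])  # 100.0 is an outlier

beta_ls = y.mean()       # minimizes sum (y_j - beta)^2
beta_lad = np.median(y)  # minimizes sum |y_j - beta|

print(beta_ls)   # 22.0, dragged toward the outlier
print(beta_lad)  # 3.0, robust to the outlier
```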
Total least squares
Linear least squares

Let the model $f(x; \vec{\beta})$ be a linear function of $\vec{\beta}$: a linear combination of $m$ basis functions $\varphi_k(x)$,

$$f(x; \vec{\beta}) = \sum_{k=1}^{m} \beta_k \varphi_k(x)$$
Linear least squares

$$\xi(\vec{\beta}) = \sum_{j=1}^{n} |z_j|^2$$

where ($j = 1, \dots, n$)

$$z_j = y_j - \left( \beta_1 \varphi_1(x_j) + \beta_2 \varphi_2(x_j) + \cdots + \beta_m \varphi_m(x_j) \right)$$

which is equivalent to

$$\xi(\vec{\beta}) = \left\| y - A\vec{\beta} \right\|_2^2$$
Design matrix

$A$ is the $n \times m$ design matrix with entries $A_{jk} = \varphi_k(x_j)$: the $k$-th column is the $k$-th basis function evaluated at the sample points.
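The design matrix can be assembled column by column. A sketch with a monomial basis $\varphi_k(x) = x^{k-1}$ (the basis choice and data are illustrative only):

```python
import numpy as np

# Assemble the n-by-m design matrix A[j, k] = phi_k(x_j)
# for the monomial basis phi_k(x) = x**(k-1), k = 1..m.
x = np.array([0.0, 1.0, 2.0, 3.0])   # n = 4 sample points (made up)
m = 3                                # number of basis functions

A = np.column_stack([x**k for k in range(m)])   # columns: 1, x, x**2
print(A.shape)  # (4, 3)
```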
Example: straight-line fit

The model is

$$f(x; \vec{\beta}) = \beta_1 + \beta_2 x$$
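For this model the basis functions are $\varphi_1(x) = 1$ and $\varphi_2(x) = x$. A quick numpy check on made-up data lying exactly on $y = 1 + 2x$:

```python
import numpy as np

# Straight-line fit f(x; beta) = beta1 + beta2*x via numpy's
# least-squares solver; data are constructed to lie on y = 1 + 2x.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

A = np.column_stack([np.ones_like(x), x])   # columns: phi_1 = 1, phi_2 = x
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(beta)  # close to [1.0, 2.0]
```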
Linear least squares
Normal equations
Linear least squares: normal equations
Linear least squares: normal equations

Normal equations

$$A^T A \, \vec{\beta} = A^T y$$

give a formal solution of a linear least squares problem. However,

$$\operatorname{cond}\left(A^T A\right) = \left[\operatorname{cond} A\right]^2$$

so that typically the system of normal equations is very poorly conditioned.
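The squaring of the condition number can be checked numerically; the ill-conditioned Vandermonde (monomial-basis) design matrix below is my own example:

```python
import numpy as np

# Illustrate cond(A^T A) = [cond(A)]^2 on an ill-conditioned
# Vandermonde design matrix (monomial basis on [0, 1]).
x = np.linspace(0.0, 1.0, 50)
A = np.vander(x, 8)          # 50-by-8 design matrix, columns x**7 .. x**0

c_A = np.linalg.cond(A)
c_AtA = np.linalg.cond(A.T @ A)
print(c_A, c_AtA)  # c_AtA is approximately c_A**2
```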
Linear least squares
Linear least squares: QR factorization

$$A = QR$$

Since the design matrix is tall and thin ($m < n$), the last $n - m$ rows of $R$ are zero:

$$A = Q \begin{bmatrix} R_1 \\ 0 \end{bmatrix}$$

where $\dim R_1 = m \times m$.
Since $Q$ is orthogonal,

$$\xi(\vec{\beta}) = \left\| Q^T y - \begin{bmatrix} R_1 \\ 0 \end{bmatrix} \vec{\beta} \right\|^2$$

Next, write

$$Q^T y = \begin{bmatrix} f \\ r \end{bmatrix}$$

with $\dim f = m$ and $\dim r = n - m$.
This way,

$$\xi(\vec{\beta}) = \left\| f - R_1 \vec{\beta} \right\|^2 + \| r \|^2$$

and the minimum of $\xi(\vec{\beta})$ satisfies

$$R_1 \vec{\beta} = f$$
Algorithm
▶ Factorize the design matrix: $A = QR$
▶ Rotate the data: $y \to Q^T y$ (only the first $m$ rows are needed $\Rightarrow$ thin QR)
▶ Solve $R_1 \vec{\beta} = f$ by back substitution.
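The three steps above can be sketched with numpy (the straight-line model and data are made up; `np.linalg.solve` stands in for a dedicated back-substitution routine, which is valid since $R_1$ is triangular):

```python
import numpy as np

# QR-based linear least squares for a straight-line model.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])     # roughly y = 1 + 2x (made up)
A = np.column_stack([np.ones_like(x), x])    # n = 5, m = 2

# Step 1: thin QR factorization; Q is n-by-m, R1 is m-by-m upper triangular.
Q, R1 = np.linalg.qr(A, mode='reduced')
# Step 2: rotate the data; only the first m rows of Q^T y are needed.
f = Q.T @ y
# Step 3: solve R1 beta = f (R1 is triangular, so back substitution suffices).
beta = np.linalg.solve(R1, f)

# Sanity check against numpy's own least-squares solver.
beta_ref, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.allclose(beta, beta_ref))  # True
```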