Lec 18 Unidirectional Search
In this Lecture
In this lecture we will discuss two important methods for multi-variable unconstrained optimization:
Multi-dimensional plots, and
Unidirectional search.
Example 4.2
The following function of two variables has two distinct local minimum points and an inflection point:
f(x, y) = (x − 2)² + (x − 2y²)²
Let us draw a mesh-grid graph in three dimensions using MATLAB.
% MATLAB script for a 3-D mesh plot of f(x,y) = (x-2)^2 + (x-2y^2)^2
clear all
[X,Y] = meshgrid(-2:.1:2);
Z = (X - 2.0).^2 + (X - 2.*Y.*Y).^2;
mesh(X, Y, Z)                        % interpolated mesh surface
axis tight; hold on
% mark the points of interest (2, 1) and (2, -1) on the surface
x = [2 2];  y = [1 -1];
z = (x - 2.0).^2 + (x - 2.*y.*y).^2;
plot3(x, y, z, '.', 'MarkerSize', 15)
[Figure: surface (mesh) plot of f(x, y) produced with MATLAB]
The gradient vector and the Hessian matrix of f are:

∇f = [∂f/∂x; ∂f/∂y] = [−4 + 4x − 4y²; −8xy + 16y³]

H = [∂²f/∂x²  ∂²f/∂x∂y; ∂²f/∂y∂x  ∂²f/∂y²] = [4, −8y; −8y, −8x + 48y²]
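These expressions can be cross-checked symbolically. A minimal sketch, assuming the Symbolic Math Toolbox is available:

% symbolic check of the gradient and Hessian of f(x,y) = (x-2)^2 + (x-2y^2)^2
syms x y
f = (x - 2)^2 + (x - 2*y^2)^2;
g = simplify(gradient(f, [x, y]))   % expected: [4x - 4 - 4y^2;  16y^3 - 8xy]
H = simplify(hessian(f, [x, y]))    % expected: [4, -8y; -8y, 48y^2 - 8x]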
Contour Plot using MATLAB
The next step is to draw a contour plot and identify the points of interest. (Here we see (2, 1)ᵀ and (2, −1)ᵀ as the important points.)
% MATLAB script for a contour plot of f(x, y)
clear all
[X,Y] = meshgrid(-2:.05:5);
Z = ( X - 2.0).^2 + (X - 2.*Y.*Y).^2 ;
contour(X,Y,Z, 3000)
Example 4.2
Optimality condition (first order):
∂f/∂x = 2(x − 2) + 2(x − 2y²) = −4 + 4x − 4y² = 0
∂f/∂y = −8y(x − 2y²) = −8xy + 16y³ = 0
Solving these two equations gives the stationary points (1, 0)ᵀ, (2, 1)ᵀ and (2, −1)ᵀ. At (1, 0)ᵀ the Hessian is
H = [4, 0; 0, −8]
which is indefinite (one positive and one negative eigenvalue), so (1, 0)ᵀ is an inflection (saddle) point.
Example 4.2
Let us check the second-order optimality condition at the point (2, −1)ᵀ:
H = [4, 8; 8, 32]
Both leading principal minors are positive (4 > 0, det H = 128 − 64 = 64 > 0), so H is positive definite and (2, −1)ᵀ is a local minimum.
Let us check the second-order optimality condition at the point (2, +1)ᵀ:
H = [4, −8; −8, 32]
Again H is positive definite, so (2, +1)ᵀ is also a local minimum.
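As a quick numerical cross-check of these definiteness claims, one can inspect the eigenvalues of the Hessian at each stationary point. A minimal sketch (the helper Hfun is my own):

% eigenvalues of H(x,y) = [4, -8y; -8y, -8x + 48y^2] at the stationary points
Hfun = @(x, y) [4, -8*y; -8*y, -8*x + 48*y^2];
pts  = [1 0; 2 -1; 2 1];                 % (1,0), (2,-1), (2,1)
for k = 1:size(pts, 1)
    e = eig(Hfun(pts(k, 1), pts(k, 2)));
    fprintf('(%g, %g): eigenvalues %g and %g\n', pts(k, 1), pts(k, 2), e(1), e(2));
end
% mixed signs at (1, 0) -> saddle; all positive at (2, -1) and (2, 1) -> minima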
Example 4.3
The following function of two variables has only one minimum point
but two distinct local maximum points and two inflection points:
f(x, y) = 25x² − 12x⁴ − 6xy + 25y² − 24x²y² − 12y⁴
Example 4.3
The objective function is f(x, y) = 25x² − 12x⁴ − 6xy + 25y² − 24x²y² − 12y⁴, and its first partial derivatives are:
∂f/∂x = 50x − 48x³ − 6y − 48xy²
∂f/∂y = −6x + 50y − 48x²y − 48y³
The Hessian matrix is:
H = [∂²f/∂x²  ∂²f/∂x∂y; ∂²f/∂y∂x  ∂²f/∂y²] = [50 − 144x² − 48y²,  −6 − 96xy;  −6 − 96xy,  50 − 48x² − 144y²]
Example 4.3
Optimality condition (first order):
∂f/∂x = 50x − 48x³ − 6y − 48xy² = 0
∂f/∂y = −6x + 50y − 48x²y − 48y³ = 0
Solving these simultaneously gives five stationary points: (0, 0)ᵀ, (±0.677, ±0.677)ᵀ and (±0.764, ∓0.764)ᵀ. At the points (±0.764, ∓0.764)ᵀ the Hessian is
H = [−62, 50; 50, −62]
which is negative definite, so these two points are local maxima.
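These stationary points can also be located numerically by solving the two first-order equations. A minimal sketch, assuming the Optimization Toolbox's fsolve is available (the starting guesses are hand-picked):

% solve grad f = 0 for f(x,y) = 25x^2 - 12x^4 - 6xy + 25y^2 - 24x^2y^2 - 12y^4
gradf = @(v) [50*v(1) - 48*v(1)^3 - 6*v(2) - 48*v(1)*v(2)^2;
              -6*v(1) + 50*v(2) - 48*v(1)^2*v(2) - 48*v(2)^3];
opts    = optimoptions('fsolve', 'Display', 'off');
guesses = [0 0; 0.5 0.5; -0.5 -0.5; 1 -1; -1 1];
for k = 1:size(guesses, 1)
    v = fsolve(gradf, guesses(k, :)', opts);
    fprintf('stationary point near (%.3f, %.3f)\n', v(1), v(2));
end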
Example 4.3
Let us check the second-order optimality condition at the point (−0.677, −0.677)ᵀ:
H = [−38, −50; −50, −38]
Here det H = 38² − 50² < 0, so H is indefinite and (−0.677, −0.677)ᵀ is an inflection (saddle) point.
At the point (0, 0)ᵀ:
H = [50, −6; −6, 50]
which is positive definite (50 > 0, det H = 2464 > 0), so (0, 0)ᵀ is the local minimum.
Example 4.3
Let us check the second-order optimality condition at the point (+0.677, +0.677)ᵀ:
H = [−38, −50; −50, −38]
which is again indefinite, so (+0.677, +0.677)ᵀ is also an inflection (saddle) point.
Let us check the second-order optimality condition at the point (0.764, −0.764)ᵀ:
H = [−62, 50; 50, −62]
which is negative definite, so (0.764, −0.764)ᵀ is a local maximum.
Example 4.3
The contour plot shows these points labelled as maximum, minimum and inflection points: two maxima, two inflection points, and one minimum.
[Figure: contour plot of f(x, y) with the maximum, inflection and minimum points marked]
Example 3.2.1
Consider the objective function:
Minimize: f(x, y) = (x − 10)² + (y − 10)²
Step 1: First draw the contour plot of this function. Each contour line connects points that have the same function value.
Step 2: Suppose the current point is x(t) = (2, 1)ᵀ, and we wish to examine the function value along a search direction s(t) = (2, 5)ᵀ from this point. Any point on this line can be written as
x(α) = x(t) + α s(t)   (vector equation)
Example 3.2.1
Now we find the equation of the straight line passing through (2, 1)ᵀ with direction (2, 5)ᵀ:
(x − 2)/2 = (y − 1)/5, i.e. y − 1 = (5/2)(x − 2).
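The unidirectional search then reduces to a one-variable minimization in α. A minimal sketch using fminbnd (the bracket [0, 10] for α is an assumption):

% unidirectional search for f(x,y) = (x-10)^2 + (y-10)^2
% from x(t) = (2, 1)' along s(t) = (2, 5)'
f     = @(p) (p(1) - 10)^2 + (p(2) - 10)^2;
xt    = [2; 1];   st = [2; 5];
g     = @(a) f(xt + a*st);          % one-variable function g(alpha)
astar = fminbnd(g, 0, 10);          % alpha* = 122/58, about 2.103
xnew  = xt + astar*st               % best point on the line, about (6.21, 11.52)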
Example 3.3.1
Consider the Himmelblau function:
Minimize: f(x, y) = (x² + y − 11)² + (x + y² − 7)²
in the interval 0 < x, y < 5.
Example 3.3.1
Step 1: We now choose an initial point x(0) = (1, 1)ᵀ, a size reduction parameter Δ = (2, 2)ᵀ, and a termination tolerance ε = 0.001.
Step 2: We create a two-dimensional hypercube (a square) around x(0) such that
x(1) = (0, 0)ᵀ, x(2) = (2, 0)ᵀ, x(3) = (0, 2)ᵀ, x(4) = (2, 2)ᵀ
Step 3: The function values at these points are:
f(0) = 106; f(1) = 170; f(2) = 74; f(3) = 90; f(4) = 26
The minimum is at (2, 2)ᵀ; call this x* = (2, 2)ᵀ.
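These values are easy to reproduce. A minimal sketch evaluating the Himmelblau function at x(0) and the four corner points:

% Himmelblau function values at the first square around x(0) = (1, 1)
f   = @(x, y) (x.^2 + y - 11).^2 + (x + y.^2 - 7).^2;
pts = [1 1; 0 0; 2 0; 0 2; 2 2];        % x(0), x(1), x(2), x(3), x(4)
f(pts(:, 1), pts(:, 2))'                % expected: 106  170  74  90  26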
Example 3.3.1
Step 4: Since x* is not x(0), we now set x(0) = (2, 2)ᵀ and go to step 2.
Step 2: We create a new square around this x(0), again with step size 1, such that
x(1) = (1, 1)ᵀ, x(2) = (3, 1)ᵀ, x(3) = (1, 3)ᵀ, x(4) = (3, 3)ᵀ
Step 3: The function values at these points are:
f(0) = 26; f(1) = 106; f(2) = 10; f(3) = 58; f(4) = 26
The minimum is at x(2) = (3, 1)ᵀ, so x* = (3, 1)ᵀ.
Example 3.3.1
Step 4: Since x* is not x(0), we now set x(0) = (3, 1)ᵀ and go to step 2.
Step 2: We create a new square around this x(0), with step size 1, such that
x(1) = (2, 0)ᵀ, x(2) = (4, 0)ᵀ, x(3) = (2, 2)ᵀ, x(4) = (4, 2)ᵀ
Step 3: The function values at these points are:
f(0) = 10; f(1) = 74; f(2) = 34; f(3) = 26; f(4) = 50
The minimum is at x(0) itself, so x* = x(0) = (3, 1)ᵀ.
Example 3.3.1
Step 4: Since x* is the same as x(0), we keep x(0) = (3, 1)ᵀ and go to step 2 after reducing the step size to 0.5.
Step 2: We create a new square around this x(0) = (3, 1)ᵀ with step size 0.5, such that
x(1) = (2.5, 0.5)ᵀ, x(2) = (3.5, 0.5)ᵀ, x(3) = (2.5, 1.5)ᵀ, x(4) = (3.5, 1.5)ᵀ
Step 3: The function values at these points are computed, and the minimum is at x(4) = (3.5, 1.5)ᵀ.
Example 3.3.1
Step 4: Since x* is not the same as x(0), we now set x(0) = (3.5, 1.5)ᵀ and go to step 2 with step size 0.5.
Step 2: We create a new square around this x(0) = (3.5, 1.5)ᵀ with step size 0.5, such that
x(1) = (3, 1)ᵀ, x(2) = (4, 1)ᵀ, x(3) = (3, 2)ᵀ, x(4) = (4, 2)ᵀ
Step 3: The function values at these points are:
f(0) = 9.125; f(1) = 10; f(2) = 40; f(3) = 0; f(4) = 50
The minimum is at x(3) = (3, 2)ᵀ, which is in fact one of the true minima of the Himmelblau function (f = 0).
Example 3.3.1
It is important to see that although we have found a minimum point, the algorithm does not terminate at this step; we may continue until the square size falls below ε.
It is clear that the convergence depends upon the initial cube size and location and on the chosen size reduction parameter.
Too small a size may lead to premature convergence, while too large a size may never lead to convergence.
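The whole procedure can be collected into a short script. A minimal sketch of this box-type search on the Himmelblau function, under the parameters used above (initial step size 1, i.e. Δ = (2, 2)ᵀ, and ε = 0.001); the variable names are my own:

% box (hypercube) search on the Himmelblau function -- minimal sketch
f    = @(p) (p(1)^2 + p(2) - 11)^2 + (p(1) + p(2)^2 - 7)^2;
x0   = [1; 1];         % initial point x(0)
step = 1;              % corner offset (half the square edge)
eps_ = 0.001;          % termination tolerance
while step > eps_
    off  = step * [0 0; -1 -1; 1 -1; -1 1; 1 1]';   % x(0) plus four corners
    cand = repmat(x0, 1, 5) + off;
    vals = arrayfun(@(k) f(cand(:, k)), 1:5);
    [~, best] = min(vals);
    if best == 1
        step = step/2;            % x* = x(0): shrink the square
    else
        x0 = cand(:, best);       % otherwise move to the best corner
    end
end
x0                                % final point, close to the minimum (3, 2)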
Consider one more unidirectional search, this time along the gradient direction, for the function
f(x, y) = 2xy + 2x − x² − 2y²
The partial derivatives are
∂f/∂x = 2y + 2 − 2x,   ∂f/∂y = 2x − 4y
At the point (−1, 1):
∂f/∂x = 2(1) + 2 − 2(−1) = 6,   ∂f/∂y = 2(−1) − 4(1) = −6
so ∇f(−1, 1) = (6, −6)ᵀ.
Along this direction, g(h) = f(−1 + 6h, 1 − 6h) = ... = −7 + 72h − 180h².
Setting g′(h) = 0 yields
72 − 360h = 0  ⇒  h = 0.2
Since h = 0.2 maximizes g(h), the point x = −1 + 6(0.2) = 0.2, y = 1 − 6(0.2) = −0.2 maximizes f(x, y) along this line.
So, moving along the gradient direction from the point (−1, 1), we reach the optimum along that line (which is our next point) at (0.2, −0.2).
point) at (0.2, -0.2).