Optimization Books
Textbooks:
(i) INTRODUCTION:
•Definition of Optimization problems and techniques.
•Mathematical Models.
•Local and global Extrema (Optima) of a function of one and more than one variable, and inflexion points.
•Types of optimization techniques.
•Derivation of Necessary and Sufficient conditions for an extremum of a function of one and more than one variable.
•Lagrange's Multipliers technique.
(ii) UNCONSTRAINED OPTIMIZATION FOR FUNCTIONS:
• Gradient of a function.
• Quadratic forms of a function.
• Hessian matrix.
• Positive and negative definite matrices, Indefinite matrices.
• Steepest-Descent method.
• Newton’s Method.
• Convergence criteria.
• Variable metric method (Davidon-Fletcher-Powell method).
(b) DIRECT SEARCH METHODS:
•Unimodal function.
•Simplex Method of Nelder & Mead.
•Method of Hooke & Jeeves.
•Fibonacci method.
•Quadratic Interpolation (Powell's method).
(c) UNIVARIATE SEARCH AND POWELL’S METHOD:
1. Introduction
• Optimization is the act of obtaining the best result under given
circumstances.
• Operations research (in the US) or operational research (in the UK), abbreviated OR, is an interdisciplinary branch of mathematics which uses methods such as:
– mathematical modeling
– statistics
– algorithms
to arrive at optimal or good decisions in complex problems, which are concerned with maximizing some objective function (profit, a faster assembly line, greater crop yield, higher bandwidth, etc.) or minimizing it (cost, loss, risk, etc.).
Historical development
• Mathematical optimization problem:
• f0 : Rⁿ → R: objective function
• x = (x1, …, xn): design variables (the unknowns of the problem; they must be linearly independent)
• gi : Rⁿ → R (i = 1, …, m): inequality constraints
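Putting these pieces together, the problem is usually written in the following standard form (a generic statement using the symbols above; equality constraints, if present, are added analogously):

```latex
\begin{aligned}
\text{minimize}   \quad & f_0(x) \\
\text{subject to} \quad & g_i(x) \le 0, \qquad i = 1,\dots,m,
\end{aligned}
\qquad x = (x_1,\dots,x_n) \in \mathbb{R}^n .
```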
• If a point x* corresponds to the minimum value of the function f (x), the
same point also corresponds to the maximum value of the negative of
the function, -f (x). Thus optimization can be taken to mean
minimization since the maximum of a function can be found by seeking
the minimum of the negative of the same function.
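This relation is easy to exercise numerically; below is a minimal Python sketch with a made-up one-variable function f (the function and the search interval are illustrative assumptions):

```python
from scipy.optimize import minimize_scalar

# Hypothetical objective: f(x) = -(x - 2)**2 + 3, which has its maximum at x = 2.
f = lambda x: -(x - 2) ** 2 + 3

# Maximize f by minimizing its negative over an illustrative interval.
res = minimize_scalar(lambda x: -f(x), bounds=(0, 5), method="bounded")

x_star = res.x      # maximizer of f  (approx. 2.0)
f_max = -res.fun    # maximum value of f (approx. 3.0)
print(x_star, f_max)
```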
Constraints
Constraint Surface
• For illustration purposes, consider an optimization problem with only
inequality constraints gj (X) ≤ 0. The set of values of X that satisfy
the equation gj (X) =0 forms a hypersurface in the design space and
is called a constraint surface.
• Note that this is an (n−1)-dimensional hypersurface, where n is the number of design variables. The constraint surface divides the design space into two regions: one in which gj (X) < 0 and the other in which gj (X) > 0.
• Thus the points lying on the hypersurface will satisfy the constraint
gj (X) critically whereas the points lying in the region where gj (X) >0
are infeasible or unacceptable, and the points lying in the region
where gj (X) < 0 are feasible or acceptable.
• Consider a hypothetical two-dimensional design space in which the infeasible region is indicated by hatched lines. A design point that lies on one or more constraint surfaces is called a bound point, and the associated constraint is called an active constraint.
• Design points that do not lie on any constraint surface are known as
free points.
Depending on whether a particular design point lies in the acceptable or unacceptable region, and on whether it is free or bound, it can be identified as one of the following four types:
• Free and acceptable point
• Free and unacceptable point
• Bound and acceptable point
• Bound and unacceptable point
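As a rough illustration of this terminology, the sketch below evaluates a set of hypothetical inequality constraints gj(X) ≤ 0 at a design point X and labels the point accordingly (the constraint functions and the tolerance are assumptions made for the example):

```python
import numpy as np

# Hypothetical constraints g_j(X) <= 0 for a two-variable design space.
constraints = [
    lambda X: X[0] ** 2 + X[1] ** 2 - 4.0,   # inside a circle of radius 2
    lambda X: 1.0 - X[0],                    # x1 >= 1
]

def classify(X, tol=1e-9):
    g = np.array([gj(X) for gj in constraints])
    acceptable = np.all(g <= tol)            # feasible: all g_j(X) <= 0
    bound = np.any(np.abs(g) <= tol)         # lies on at least one constraint surface
    kind = "bound" if bound else "free"
    status = "acceptable" if acceptable else "unacceptable"
    return f"{kind} and {status} point", g

print(classify(np.array([1.5, 0.5])))   # free and acceptable
print(classify(np.array([1.0, 0.0])))   # bound and acceptable (on g_2 = 0)
print(classify(np.array([3.0, 0.0])))   # free and unacceptable
```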
• The conventional design procedures aim at finding an acceptable or
adequate design which merely satisfies the functional and other
requirements of the problem.
• In general, there will be more than one acceptable design, and the
purpose of optimization is to choose the best one of the many
acceptable designs available.
• In civil engineering, the objective is usually taken as the
minimization of the cost.
• The locus of all points satisfying f (X) = c = constant forms a
hypersurface in the design space, and for each value of c there
corresponds a different member of a family of surfaces. These surfaces,
called objective function surfaces, can be visualized in a hypothetical two-dimensional design space.
• Once the objective function surfaces are drawn along with the constraint
surfaces, the optimum point can be determined without much difficulty.
• But the main problem is that, as the number of design variables exceeds two or three, the constraint and objective function surfaces become too complex even to visualize, and the problem has to be solved purely as a mathematical problem.
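A small matplotlib sketch of this graphical picture for a hypothetical two-variable problem (the objective f, the constraint g, and the contour levels are illustrative assumptions):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical two-variable problem.
f = lambda x1, x2: (x1 - 3) ** 2 + (x2 - 2) ** 2   # objective function f(X)
g = lambda x1, x2: x1 + x2 - 4                     # inequality constraint g(X) <= 0

x1, x2 = np.meshgrid(np.linspace(0, 5, 200), np.linspace(0, 5, 200))

# Objective function surfaces: contours f(X) = c for several values of c.
cs = plt.contour(x1, x2, f(x1, x2), levels=[0.5, 1, 2, 4, 8], colors="gray")
plt.clabel(cs, inline=True, fontsize=8)

# Constraint surface g(X) = 0; points with g(X) > 0 are infeasible.
plt.contour(x1, x2, g(x1, x2), levels=[0], colors="red")

plt.xlabel("x1")
plt.ylabel("x2")
plt.title("Objective function surfaces and constraint surface")
plt.show()
```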
Classification of optimization problems
• Constraints
– Constrained optimization problem
– Unconstrained optimization problem
Multiobjective Programming Problem
• A multiobjective programming problem can be stated as follows:
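A generic statement of the problem, using k objective functions and the same inequality-constraint notation as before (a sketch of the standard form rather than a specific model):

```latex
\begin{aligned}
\text{find } x = (x_1,\dots,x_n) \ \text{which minimizes} \quad & f_1(x),\, f_2(x),\, \dots,\, f_k(x) \\
\text{subject to} \quad & g_j(x) \le 0, \qquad j = 1,\dots,m .
\end{aligned}
```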
Review of mathematics
Concepts from linear algebra:
Positive definiteness
• Test 1: A matrix A will be positive definite if all its eigenvalues are positive; that is, if all the values of λ that satisfy the determinantal equation |A − λI| = 0 are positive.
• Test 2: Another test that can be used to check the positive definiteness of a matrix A of order n involves evaluating the leading principal minors (determinants) A1, A2, …, An of A.
• The matrix A will be positive definite if and only if all the values A1, A2, …, An are positive.
• The matrix A will be negative definite if and only if the sign of Aj is (−1)^j for j = 1, 2, …, n.
• If some of the Aj are positive and the remaining Aj are zero, the matrix A will be positive semidefinite.
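Both tests are easy to carry out numerically. The sketch below applies them to a small illustrative symmetric matrix (the matrix itself is an arbitrary example):

```python
import numpy as np

# Illustrative symmetric matrix (arbitrary example).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Test 1: all eigenvalues of A must be positive.
eigvals = np.linalg.eigvalsh(A)          # eigenvalues of a symmetric matrix
test1 = np.all(eigvals > 0)

# Test 2: all leading principal minors A1, A2, ..., An must be positive.
minors = [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]
test2 = all(m > 0 for m in minors)

print("eigenvalues:", eigvals, "-> positive definite:", test1)
print("leading principal minors:", minors, "-> positive definite:", test2)
```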
Solutions of a linear problem
Minimize f(x) = cᵀx
Subject to g(x): Ax = b
Side constraints: x ≥ 0
• The existence of a solution to this problem depends on the ranks of A and of the augmented matrix A*, which is formed by appending the column b to A. According to the theorems of linear algebra:
• If the augmented matrix A* and the matrix of coefficients A have the same rank r, and r is less than the number of design variables n (r < n), then there are infinitely many solutions.
• If the augmented matrix A* and the matrix of coefficients A do not have the same rank, a solution does not exist.
• If the augmented matrix A* and the matrix of coefficients A have the same rank r = n, equal to the number of design variables, then there is a unique solution.
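A minimal numpy sketch of this rank test on a hypothetical system Ax = b (the particular A and b are assumptions chosen for illustration):

```python
import numpy as np

# Hypothetical coefficient matrix and right-hand side.
A = np.array([[1.0, 2.0, 1.0],
              [2.0, 1.0, 3.0]])
b = np.array([[4.0],
              [5.0]])

A_aug = np.hstack([A, b])            # augmented matrix A* = [A | b]
r = np.linalg.matrix_rank(A)
r_aug = np.linalg.matrix_rank(A_aug)
n = A.shape[1]                       # number of design variables

if r != r_aug:
    print("no solution")             # ranks differ: inconsistent system
elif r == n:
    print("unique solution")         # rank equals number of unknowns
else:
    print(f"infinitely many solutions ({n - r} degree(s) of freedom)")
```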
In the example
In this example, the number of degrees of freedom is 1 (i.e., 3 − 2 = 1). For instance, x3 can be assigned a value of 1, in which case x1 = 0.5 and x2 = 1.5.
Homework
What is the solution of the system given below?
Hint: Determine the rank of the matrix of the coefficients and
the augmented matrix.
2. Classical optimization techniques
Single variable optimization
• Since some of the practical problems involve objective functions that are
not continuous and/or differentiable, the classical optimization techniques
have limited scope in practical applications.
• A function f (x) is said to have a local or relative minimum at x = x* if f (x*) ≤ f (x) for all values of x sufficiently close to x*.
• A function f (x) is said to have a global or absolute minimum
at x* if f (x*) ≤ f (x) for all x, and not just for all x close to
x*, in the domain over which f (x) is defined.
Necessary condition
• If a function f (x) has a relative minimum at x = x* and the derivative f’(x*) exists, then f’(x*) = 0.
• In the proof, the difference quotient [f (x* + h) − f (x*)]/h tends to the right-hand or left-hand derivative of f at x*, depending on whether h approaches zero through positive or negative values, respectively. Unless these two one-sided limits are equal, the derivative f’(x*) does not exist. If f’(x*) does not exist, the theorem is not applicable.
Sufficient condition
• Let f’(x*) = f’’(x*) = … = f^(n−1)(x*) = 0, and let f^(n)(x*) ≠ 0 be the first non-vanishing derivative at x*. Then f (x*) is a minimum if f^(n)(x*) > 0 and n is even; a maximum if f^(n)(x*) < 0 and n is even; and neither a minimum nor a maximum (a point of inflexion) if n is odd.
Example
Determine the maximum and minimum values of the function:
Solution cont’d:
At x=0, f’’(x)=0 and hence we must investigate the next derivative.
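A short sympy sketch of this higher-order derivative test at a stationary point, using a made-up function with f’(0) = f’’(0) = f’’’(0) = 0 (the function is an assumption for illustration, not the one from the example above):

```python
import sympy as sp

x = sp.symbols('x')
f = x**4 + 1          # hypothetical function: first non-vanishing derivative at 0 is f''''(0) = 24

x_star = 0
n, d = 1, sp.diff(f, x)
# Differentiate until the first non-vanishing derivative at x* is found.
while d.subs(x, x_star) == 0:
    n += 1
    d = sp.diff(d, x)

val = d.subs(x, x_star)
if n % 2 == 0:
    print("minimum" if val > 0 else "maximum", f"(n = {n}, f^({n})(x*) = {val})")
else:
    print(f"point of inflexion (n = {n}, f^({n})(x*) = {val})")
```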