Introduction To Optimization
Nildem Tayşi
University of Gaziantep
Optimization Methods
Textbook: S.S. Rao, Engineering Optimization (Theory and Practice),
Wiley, 1996.
References:
Arora, J.S., Introduction to Optimum Design, Second Edition, Elsevier
Academic Press, San Diego, CA, 2004.
Bhatti, M. A., Practical Optimization Methods with Mathematica
Applications, Springer-Verlag, 2000.
Belegundu, A.D. and Chandrupatla, T.R., Optimization Concepts and
Applications in Engineering, Prentice Hall, New Jersey, 1999.
http://nptel.iitm.ac.in/courses/Webcourse-contents/IISc-
BANG/OPTIMIZATION%20METHODS/New_index1.html
Goals:
The goal of the course is to teach students the basic concepts of
optimization of engineering systems. Formulation of a problem as
an optimization problem is covered. Basic concepts of
optimization and optimality conditions are covered in class with
simple examples. Simplex method for linear optimization
problems is covered. Post-optimality analysis is discussed.
Numerical methods for solving unconstrained and constrained
optimization problems are presented and illustrated. Throughout
the course, students work on practical design optimization
projects.
Learning Objectives:
1. Introduction to the process of designing new systems or
improving existing systems.
2. Formulation of a problem as an optimization problem –
examples.
3. Graphical solution of optimization problems to illustrate basic
concepts.
4. Basic principles of optimum design with application to simple
problems: Optimality conditions.
5. Optimization of linear systems: Linear programming.
6. Optimization of nonlinear systems: Nonlinear programming.
7. Solution of optimization problems using computer programs.
Introduction
Optimization is defined as the process of finding the conditions that
give the minimum or maximum value of a function, where the
function represents the effort required or the desired benefit.
Optimization: the act of obtaining the best result under the
given circumstances.
Design, construction and maintenance of engineering systems
involve decision making both at the managerial and the
technological level.
Goals of such decisions:
to minimize the effort required or
to maximize the desired benefit
Historical Development
The existence of optimization methods can be traced to the days of
Newton, Lagrange, and Cauchy.
Milestones
Development of the simplex method by Dantzig in 1947 for linear
programming problems.
Milestones (contd.)
The contributions of Zoutendijk and Rosen to nonlinear programming
during the early 1960s.
Milestones (contd.)
The desire to optimize more than one objective or goal while
satisfying the physical limitations led to the development of
multi-objective programming methods, e.g., goal programming.
Engineering applications of optimization.
Design of structural units in construction, machinery, and in space
vehicles.
Maximizing benefit/minimizing product costs in various
manufacturing and construction processes.
Optimal path finding in road networks/freight handling processes.
Optimal production planning, controlling and scheduling.
Optimal allocation of resources or services among several activities to
maximize the benefit.
Design of civil engineering structures such as frames, foundations,
bridges, towers, chimneys and dams for minimum cost.
Design of minimum weight structures for earthquake, wind and other
types of random loading.
Optimal plastic design of frame structures (e.g., to determine the
ultimate moment capacity for minimum weight of the frame).
Design of water resources systems for obtaining maximum benefit.
Design of optimum pipeline networks for process industry.
Design of aircraft and aerospace structures for minimum weight.
Finding the optimal trajectories of space vehicles.
Optimum design of linkages, cams, gears, machine tools, and other
mechanical components.
Selection of machining conditions in metal-cutting processes for
minimizing the product cost.
Design of material handling equipment such as conveyors, trucks and
cranes for minimizing cost.
Design of pumps, turbines and heat transfer equipment for maximum efficiency.
Art of Modeling: Model Building
Development of an optimization model can be divided
into five major phases.
Collection of data
Problem definition and formulation
Model development
Model validation and evaluation of performance
Model application and interpretation of results
Data collection
may be time consuming but is the fundamental basis of
the model-building process
extremely important phase of the model-building
process
the availability and accuracy of data can have
considerable effect on the accuracy of the model and on
the ability to evaluate the model.
Problem Definition
Problem definition and formulation, steps involved:
identification of the decision variables;
formulation of the model objective(s);
formulation of the model constraints.
In performing these steps one must consider the following.
Identify the important elements that the problem
consists of.
Determine the number of independent variables, the
number of equations required to describe the system,
and the number of unknown parameters.
Evaluate the structure and complexity of the model.
Select the degree of accuracy required of the model.
Model development
Model development includes:
the mathematical description,
parameter estimation,
input development, and
software development
The model development phase is an iterative process
that may require returning to the model definition and
formulation phase.
Model Validation and Evaluation
This phase checks the model as a whole.
Model validation consists of validation of the assumptions and parameters of
the model.
The performance of the model is to be evaluated using standard performance
measures such as the root mean squared error and the R2 value.
Sensitivity analysis is carried out to test the model inputs and parameters.
This phase also is an iterative process and may require returning to the model
definition and formulation phase.
One important aspect of this process is that in most cases data used in the
formulation process should be different from that used in validation.
Modeling Techniques
Different modeling techniques have been developed to meet the
requirements of different types of optimization problems. Major
categories of modeling approaches are:
classical optimization techniques,
linear programming,
nonlinear programming,
geometric programming,
dynamic programming,
integer programming,
stochastic programming,
evolutionary algorithms, etc.
These approaches will be discussed in the subsequent modules.
Basic components of an optimization problem
An objective function expresses the main aim of the model
which is either to be minimized or maximized.
A set of unknowns or variables which control the value of
the objective function.
A set of constraints that allow the unknowns to take on
certain values but exclude others.
Objective Function
As already defined, the objective function is the mathematical function one wants to
maximize or minimize, subject to certain constraints. Many optimization problems
have a single objective function (when they don't, they can often be reformulated so
that they do). The two interesting exceptions are:
No objective function. The user does not particularly want to optimize anything, so there is
no reason to define an objective function. This is usually called a feasibility problem.
Multiple objective functions. Conflicting objectives are often recast as a single overall
objective, e.g., a weighted combination, as discussed later in these notes.
Statement of an optimization problem
Find $X = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}$ which maximizes $f(X)$

subject to the constraints

$g_i(X) \le 0, \quad i = 1, 2, \ldots, m$

$l_j(X) = 0, \quad j = 1, 2, \ldots, p$
Statement of an optimization problem
where
X is an n-dimensional vector called the design vector
f(X) is called the objective function, and
gi(X) and lj(X) are known as inequality and equality constraints,
respectively.
This type of problem is called a constrained optimization problem.
Optimization problems can be defined without any constraints as
well. Such problems are called unconstrained optimization
problems.
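As an aside not on the original slides, this standard constrained statement maps directly onto a numerical solver. A minimal sketch, assuming SciPy is available, with a made-up objective and constraints (solvers conventionally minimize, per the convention discussed later in these notes):

```python
# A minimal sketch (assumes SciPy): "find X minimizing f(X) subject to
# g(X) <= 0 and l(X) = 0", written for a made-up example problem.
import numpy as np
from scipy.optimize import minimize

f = lambda X: (X[0] - 1)**2 + (X[1] - 2)**2          # example objective
cons = [
    # SciPy's "ineq" means fun(X) >= 0, so g(X) <= 0 is passed as -g(X) >= 0
    {"type": "ineq", "fun": lambda X: -(X[0] + X[1] - 2)},  # g(X) = x1 + x2 - 2 <= 0
    {"type": "eq",   "fun": lambda X: X[0] - X[1]},         # l(X) = x1 - x2 = 0
]
res = minimize(f, x0=np.zeros(2), constraints=cons, method="SLSQP")
print(res.x)  # -> approx. [1, 1]
```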
Objective Function Surface
If the locus of all points satisfying f(X) = a constant c is considered, it
can form a family of surfaces in the design space called the objective
function surfaces.
When drawn with the constraint surfaces as shown in the figure we can
identify the optimum point (maxima).
This is possible graphically only when the number of design variables is
two.
When we have three or more design variables, because of the complexity of
the objective function surface, we have to solve the problem purely as a
mathematical problem, and this visualization is not possible.
Objective function surfaces to find the
optimum point (maxima)
Stationary points
For a continuous and differentiable function f(x), a
stationary point x* is a point at which the derivative of the function
vanishes, i.e. f'(x) = 0 at x = x*, where x* belongs to the domain
of definition of f.
A stationary point may be a minimum, a maximum or an
inflection point.
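As an illustration beyond the slides, a minimal sketch (assuming SymPy is available) that finds the stationary points of a made-up example function and classifies them with the second derivative:

```python
# A minimal sketch (assumes SymPy): find points where f'(x) = 0 and
# classify each by the sign of f''(x). The function is a made-up example.
import sympy as sp

x = sp.symbols('x')
f = x**3 - 3*x                            # example: stationary points at x = -1, 1

for xs in sp.solve(sp.diff(f, x), x):     # solve f'(x) = 0
    second = sp.diff(f, x, 2).subs(x, xs)
    if second > 0:
        kind = "minimum"
    elif second < 0:
        kind = "maximum"
    else:
        kind = "possible inflection point"   # a higher-order test is needed
    print(f"x* = {xs}: {kind}")
```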
Stationary points
Figure showing the three types of stationary points (a) inflection point
(b) minimum (c) maximum
Relative and Global Optimum
• A function is said to have a relative or local minimum at x = x* if
$f(x^*) \le f(x^* + h)$ for all sufficiently small positive and negative
values of h, i.e. in the near vicinity of the point x*.
• Similarly, a point x* is called a relative or local maximum if
$f(x^*) \ge f(x^* + h)$ for all values of h sufficiently close to zero.
• A function is said to have a global or absolute minimum at x = x* if
$f(x^*) \le f(x)$ for all x in the domain over which f(x) is defined.
• Similarly, a function is said to have a global or absolute maximum at x
= x* if $f(x^*) \ge f(x)$ for all x in the domain over which f(x) is
defined.
Relative and Global Optimum
…contd.
Fig. 2: Relative and global optima of a function f(x) over an interval [a, b].
A1, A2, A3 = Relative maxima; A2 = Global maximum.
B1, B2 = Relative minima; B1 = Global minimum.
In the second panel of the figure, the single relative minimum is also the global optimum.
Functions of two variables
The concepts discussed for functions of one variable may be
easily extended to functions of multiple variables.
Functions of two variables are best illustrated by contour
maps, analogous to geographical maps.
A contour is a line representing a constant value of f(x) as
shown in the following figure. From this we can identify
maxima, minima and points of inflection.
A contour plot
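A contour map like the one referenced above can be generated directly. A minimal sketch, assuming NumPy and Matplotlib, with Himmelblau's function as a stand-in example:

```python
# A minimal sketch (assumes NumPy and Matplotlib): contour map of a
# two-variable function; minima show up as closed, nested contours.
import numpy as np
import matplotlib.pyplot as plt

x1, x2 = np.meshgrid(np.linspace(-5, 5, 400), np.linspace(-5, 5, 400))
f = (x1**2 + x2 - 11)**2 + (x1 + x2**2 - 7)**2    # Himmelblau's function

cs = plt.contour(x1, x2, f, levels=30)
plt.clabel(cs, inline=True, fontsize=7)           # label contour values
plt.xlabel("x1"); plt.ylabel("x2")
plt.title("Contours of f(x1, x2)")
plt.show()
```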
Necessary conditions
As can be seen in the above contour map, perturbations from
points of local minima in any direction result in an increase in
the response function f(x); hence
the slope of the function is zero at such a point of local
minimum.
Similarly, at maxima and points of inflection the slope is zero,
and thus the first derivatives of the function with respect to the
variables are zero.
Necessary conditions …contd.
which gives us $\frac{\partial f}{\partial x_1} = 0; \quad \frac{\partial f}{\partial x_2} = 0$ at the stationary points, i.e. the gradient of f vanishes.
Sufficient conditions
Consider the following second-order derivatives:

$\frac{\partial^2 f}{\partial x_1^2}; \quad \frac{\partial^2 f}{\partial x_2^2}; \quad \frac{\partial^2 f}{\partial x_1 \partial x_2}$

The Hessian matrix H is made up of the above second-order derivatives:

$H = \begin{bmatrix} \dfrac{\partial^2 f}{\partial x_1^2} & \dfrac{\partial^2 f}{\partial x_1 \partial x_2} \\[1ex] \dfrac{\partial^2 f}{\partial x_1 \partial x_2} & \dfrac{\partial^2 f}{\partial x_2^2} \end{bmatrix}_{[x_1, x_2]}$
Sufficient conditions …contd.
The definiteness of H is then checked:
if H is positive definite, then the point X = [x1, x2] is a
point of local minimum;
if H is negative definite, then the point X = [x1, x2] is a
point of local maximum;
if H is indefinite, then the point X = [x1, x2] is neither a
point of maximum nor minimum (a saddle point).
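A minimal numerical version of this test, assuming NumPy and using an assumed example (for f = x1² + 2x2² the Hessian is constant everywhere):

```python
# A minimal sketch (assumes NumPy): classify a stationary point from the
# eigenvalues of the Hessian H evaluated there.
import numpy as np

# Hessian of the example f(x1, x2) = x1**2 + 2*x2**2 (constant everywhere)
H = np.array([[2.0, 0.0],
              [0.0, 4.0]])

eig = np.linalg.eigvalsh(H)          # real eigenvalues of a symmetric matrix
if np.all(eig > 0):
    print("positive definite: local minimum")
elif np.all(eig < 0):
    print("negative definite: local maximum")
else:
    print("indefinite: neither a minimum nor a maximum (saddle point)")
```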
Variables and Constraints
Variables
These are essential. If there are no variables, we cannot define the
objective function and the problem constraints.
Constraints
Even though constraints are not essential, it has been argued that
almost all problems really do have constraints.
In many practical problems, one cannot choose the design variables
arbitrarily. Design constraints are restrictions that must be satisfied to
produce an acceptable design.
Constraints (contd.)
Constraints can be broadly classified as behaviour (functional) constraints,
which represent limitations on the behaviour or performance of the system, and
geometric (side) constraints, which represent physical limitations on the design
variables.
Constraint Surfaces
Consider the optimization problem presented earlier with only
inequality constraints $g_i(X) \le 0$. The set of values of X that satisfy the
equation $g_i(X) = 0$ forms a boundary surface in the design space
called a constraint surface.
The constraint surface divides the design space into two regions:
one with $g_i(X) < 0$ (feasible region) and the other in which $g_i(X) >
0$ (infeasible region). The points lying on the hypersurface
satisfy $g_i(X) = 0$.
The figure shows a hypothetical two-dimensional design space
where the feasible region is denoted by hatched lines.
Figure labels: behavior constraints g1 = 0 and g2 = 0; side constraint g3 ≥ 0;
feasible and infeasible regions; a bound acceptable point lies on a constraint
surface.
Formulation of design problems as mathematical
programming problems
3. Develop via mathematical expressions a valid process model that relates the
input-output variables of the process and associated coefficients.
a) Include both equality and inequality constraints
b) Use well known physical principles
c) Identify the independent and dependent variables to get the number of
degrees of freedom
4. If the problem formulation is too large in scope:
a) break it up into manageable parts/ or
b) simplify the objective function and the model
5. Apply a suitable optimization technique to the mathematical statement of the
problem.
6. Examine the sensitivity of the result to changes in the coefficients in the
problem and the assumptions.
1. Introduction
• Mathematical optimization problem:

minimize $f_0(x)$
subject to $g_i(x) \le b_i, \quad i = 1, \ldots, m$

• $f_0 : \mathbb{R}^n \to \mathbb{R}$: objective function
• $x = (x_1, \ldots, x_n)$: design variables (unknowns of the problem; they
must be linearly independent)
• $g_i : \mathbb{R}^n \to \mathbb{R},\ i = 1, \ldots, m$: inequality constraints
1. Introduction
• If a point x* corresponds to the minimum value of the function f (x), the
same point also corresponds to the maximum value of the negative of
the function, -f (x). Thus optimization can be taken to mean
minimization since the maximum of a function can be found by seeking
the minimum of the negative of the same function.
1. Introduction
Constraint Surface
• For illustration purposes, consider an optimization problem with only
inequality constraints $g_j(X) \le 0$. The set of values of X that satisfy
the equation $g_j(X) = 0$ forms a hypersurface in the design space and
is called a constraint surface.
1. Introduction
Constraint Surface
• Note that this is an (n-1)-dimensional subspace, where n is the
number of design variables. The constraint surface divides the
design space into two regions: one in which $g_j(X) < 0$ and the other
in which $g_j(X) > 0$.
1. Introduction
Constraint Surface
• Thus the points lying on the hypersurface will satisfy the constraint
gj (X) critically whereas the points lying in the region where gj (X) >0
are infeasible or unacceptable, and the points lying in the region
where gj (X) < 0 are feasible or acceptable.
1. Introduction
Constraint Surface
• In the figure below, a hypothetical two-dimensional design space is
depicted where the infeasible region is indicated by hatched lines. A
design point that lies on one or more than one constraint surface is
called a bound point, and the associated constraint is called an
active constraint.
1. Introduction
Constraint Surface
• Design points that do not lie on any constraint surface are known as
free points.
1. Introduction
Constraint Surface
Depending on whether a particular design point belongs to the acceptable
or unacceptable region, it can be identified as one of the following four
types:
• free and acceptable point
• free and unacceptable point
• bound and acceptable point
• bound and unacceptable point
1. Introduction
• The conventional design procedures aim at finding an acceptable or
adequate design which merely satisfies the functional and other
requirements of the problem.
• In general, there will be more than one acceptable design, and the
purpose of optimization is to choose the best one of the many
acceptable designs available.
1. Introduction
• In civil engineering, the objective is usually taken as the
minimization of the cost.
1. Introduction
• With multiple objectives there arises a possibility of conflict, and one
simple way to handle the problem is to construct an overall objective
function as a linear combination of the conflicting multiple objective
functions.
• Thus, if f1(X) and f2(X) denote two objective functions, construct a new
(overall) objective function for optimization as:

$f(X) = \alpha_1 f_1(X) + \alpha_2 f_2(X)$

where $\alpha_1$ and $\alpha_2$ are constants whose values indicate the relative
importance of one objective function relative to the other.
1. Introduction
• The locus of all points satisfying f (X) = c = constant forms a
hypersurface in the design space, and for each value of c there
corresponds a different member of a family of surfaces. These surfaces,
called objective function surfaces, are shown in a hypothetical two-
dimensional design space in the figure below.
1. Introduction
• Once the objective function surfaces are drawn along with the constraint
surfaces, the optimum point can be determined without much difficulty.
• But the main problem is that as the number of design variables exceeds
two or three, the constraint and objective function surfaces become
complex even for visualization and the problem has to be solved purely
as a mathematical problem.
Example
Design a uniform column of tubular section to carry a compressive load P = 2500 kgf
for minimum cost. The column is made of a material that has a yield stress (σy) of
500 kgf/cm2, a modulus of elasticity (E) of 0.85×10^6 kgf/cm2, and a density (ρ) of
0.0025 kgf/cm3.
The length of the column is 250 cm. The stress induced in this column should be less
than the buckling stress as well as the yield stress. The mean diameter of the column
is restricted to lie between 2 and 14 cm, and columns with thicknesses outside the
range 0.2 to 0.8 cm are not available in the market. The cost of the column includes
material and construction costs and can be taken as 5W + 2d, where W is the weight
in kilograms force and d is the mean diameter of the column in centimeters.
Example
Taking the mean diameter d and the thickness t of the tube as the design variables
x1 and x2, the induced stress is

$\sigma_i = \frac{P}{\pi d t} = \frac{2500}{\pi x_1 x_2}$
Example
• The buckling stress for a pin-connected column is given by $\sigma_b = \frac{\pi^2 E I}{l^2 A}$, where
the second moment of area of the tube cross-section is

$I = \frac{\pi}{64}\left(d_o^4 - d_i^4\right) = \frac{\pi}{64}\left(d_o^2 + d_i^2\right)\left(d_o + d_i\right)\left(d_o - d_i\right)$

$= \frac{\pi}{64}\left[(d+t)^2 + (d-t)^2\right]\left[(d+t) + (d-t)\right]\left[(d+t) - (d-t)\right]$

$= \frac{\pi}{8}\, d t \left(d^2 + t^2\right) = \frac{\pi}{8}\, x_1 x_2 \left(x_1^2 + x_2^2\right)$
Example
• Thus, the behaviour constraints can be restated as:

$g_1(X) = \frac{2500}{\pi x_1 x_2} - 500 \le 0$

$g_2(X) = \frac{2500}{\pi x_1 x_2} - \frac{\pi^2 (0.85 \times 10^6)(x_1^2 + x_2^2)}{8(250)^2} \le 0$

and the side constraints are $2 \le d \le 14$ and $0.2 \le t \le 0.8$.
Example
• The side constraints can be expressed in standard form as:

$g_3(X) = -x_1 + 2 \le 0$
$g_4(X) = x_1 - 14 \le 0$
$g_5(X) = -x_2 + 0.2 \le 0$
$g_6(X) = x_2 - 0.8 \le 0$
Example
• For a graphical solution, the constraint surfaces are to be
plotted in a two dimensional design space where the two axes
represent the two design variables x1 and x2. To plot the first
constraint surface, we have:
$g_1(X) = \frac{2500}{\pi x_1 x_2} - 500 = 0 \;\Rightarrow\; x_1 x_2 = 1.593$
• Thus the curve x1x2=1.593 represents the constraint surface
g1(X)=0. This curve can be plotted by finding several points on
the curve. The points on the curve can be found by giving a
series of values to x1 and finding the corresponding values of x2
that satisfy the relation x1x2=1.593 as shown in the Table below:
x1: 2       4       6       8       10      12      14
x2: 0.7965  0.3983  0.2655  0.1990  0.1593  0.1328  0.1140
Example
• The infeasible region represented by g1(X)>0 or x1x2< 1.593 is
shown by hatched lines. These points are plotted and a curve P1Q1
passing through all these points is drawn as shown:
Example
• Similarly, the second constraint g2(X) ≤ 0 can be expressed as:

$x_1 x_2 \left(x_1^2 + x_2^2\right) \ge 47.3$
Example
• Next, the contours of the objective function are to be plotted before
finding the optimum point. For this, we plot the curves given by:

$f(X) = 9.82\, x_1 x_2 + 2 x_1 = c = \text{constant}$
Example
• For $f(X) = 9.82\, x_1 x_2 + 2 x_1 = 50.0$, i.e. $x_1 = 50/(9.82 x_2 + 2)$:

x2: 0.1    0.2    0.3    0.4   0.5   0.6   0.7
x1: 16.77  12.61  10.11  8.43  7.24  6.34  5.63
Example
• These contours are shown in the
figure below and it can be seen
that the objective function cannot
be reduced below a value of 26.53
(corresponding to point B) without
violating some of the constraints.
Thus, the optimum solution is
given by point B with d*=x1*=5.44
cm and t*=x2*=0.293 cm with
fmin=26.53.
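Not part of the original slides, but the same optimum can be checked numerically. A hedged sketch assuming SciPy, restating the objective, constraints and bounds derived above:

```python
# A minimal numerical check of the tubular-column example (assumes SciPy):
# minimize f = 9.82*x1*x2 + 2*x1 subject to the stress/buckling constraints
# and the side bounds on d and t.
import numpy as np
from scipy.optimize import minimize

f = lambda x: 9.82 * x[0] * x[1] + 2 * x[0]

cons = [
    # g1 <= 0, rewritten for SciPy as  500 - 2500/(pi*x1*x2) >= 0
    {"type": "ineq", "fun": lambda x: 500 - 2500 / (np.pi * x[0] * x[1])},
    # g2 <= 0, rewritten as  (buckling stress) - (induced stress) >= 0
    {"type": "ineq", "fun": lambda x:
        np.pi**2 * 0.85e6 * (x[0]**2 + x[1]**2) / (8 * 250**2)
        - 2500 / (np.pi * x[0] * x[1])},
]
bounds = [(2, 14), (0.2, 0.8)]        # side constraints g3..g6

res = minimize(f, x0=[5.0, 0.3], bounds=bounds, constraints=cons,
               method="SLSQP")
print(res.x, res.fun)  # expect roughly d* = 5.44, t* = 0.293, f* = 26.53
```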
Classification of Optimization
Problems
Optimization problems can be classified based on the
type of constraints, nature of design variables,
physical structure of the problem, nature of the
equations involved, deterministic nature of the
variables, permissible value of the design variables,
separability of the functions and number of objective
functions. These classifications are briefly discussed
in this lecture.
Classification based on existence of constraints
Under this classification, optimization problems are classified as constrained
optimization problems (subject to one or more constraints) and unconstrained
optimization problems (with no constraints).
Classification based on the nature of the design
variables
The problem can be defined as follows.
The length of the footing (l), the loads P1 and P2, and the distance between the loads
are assumed to be constant; the required optimization is achieved by varying b and d.
Such problems are called parameter or static optimization problems.
Classification based on the nature of the design variables (contd.)
If, instead, the cross-sectional dimensions b and d are allowed to vary along the
length l (i.e., the design variables are functions of a parameter rather than fixed
values), the optimization is carried out over functions.
Such problems are called trajectory or dynamic optimization problems.
Classification based on the physical structure of the
problem
Based on the physical structure, optimization problems are classified
as optimal control and non-optimal control problems.
(i) An optimal control (OC) problem is a mathematical
programming problem involving a number of stages, where
each stage evolves from the preceding stage in a prescribed
manner.
It is defined by two types of variables: the control or design
variables and state variables.
The problem is to find a set of control or design variables such that the total
objective function (also known as the performance index, PI) over all stages
is minimized subject to a set of constraints on the control and state
variables. An OC problem can be stated as follows:

Find X which minimizes $f(X) = \sum_{i=1}^{l} f_i(x_i, y_i)$

subject to

$q_i(x_i, y_i) + y_i = y_{i+1}, \quad i = 1, 2, \ldots, l$
$g_j(x_j) \le 0, \quad j = 1, 2, \ldots, l$
$h_k(y_k) \le 0, \quad k = 1, 2, \ldots, l$

where xi is the ith control variable, yi is the ith state variable, and fi is the
contribution of the ith stage to the total objective function. gj, hk, and qi are
functions of xj, yj; xk, yk; and xi, yi, respectively, and l is the total
number of stages. The control and state variables xi and yi can be vectors in
some cases.
(ii) The problems which are not optimal control problems are
called non-optimal control problems.
Classification based on the nature of the equations
involved
Classification based on the nature of the equations
involved (contd.)
(i) Linear programming problem
If the objective function and all the constraints are linear functions of the design
variables, the mathematical programming problem is called a linear programming
(LP) problem. It is often stated in the standard form:

Find $X = (x_1, x_2, \ldots, x_n)^T$ which minimizes $f(X) = \sum_{i=1}^{n} c_i x_i$

subject to

$\sum_{i=1}^{n} a_{ij} x_i = b_j, \quad j = 1, 2, \ldots, m$
$x_i \ge 0, \quad i = 1, 2, \ldots, n$

where $c_i$, $a_{ij}$, and $b_j$ are known constants.
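A minimal sketch, assuming SciPy and using made-up coefficients c, A, and b, of an LP in the standard form above:

```python
# A minimal sketch (assumes SciPy): minimize c^T x subject to A x = b, x >= 0.
# The coefficients below are made-up example data.
from scipy.optimize import linprog

c = [1, 2]                        # objective coefficients
A_eq = [[1, 1]]                   # equality-constraint matrix
b_eq = [4]
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None), (0, None)])
print(res.x, res.fun)             # -> [4, 0], 4.0
```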
Classification based on the nature of the equations
involved (contd.)
(ii) Nonlinear programming problem
If any of the functions among the objectives and constraint functions is
nonlinear, the problem is called a nonlinear programming (NLP) problem.
This is the most general form of a programming problem.
Classification based on the nature of the equations
involved (contd.)
(iii) Geometric programming problem
A geometric programming (GMP) problem is one in which the objective
function and constraints are expressed as posynomials in X (polynomials whose
terms all have positive coefficients). A posynomial with N terms can be expressed as

$h(X) = \sum_{i=1}^{N} c_i\, x_1^{a_{1i}} x_2^{a_{2i}} \cdots x_n^{a_{ni}}, \qquad c_i > 0,\ x_j > 0$
Classification based on the nature of the equations
involved (contd.)
The GMP problem is then to minimize a posynomial objective subject to posynomial
constraints, where N0 and Nk denote the number of terms in the objective and kth
constraint function, respectively.
(iv) Quadratic programming problem
A quadratic programming problem is the best-behaved nonlinear programming
problem, with a quadratic objective function and linear constraints, and is concave
(for maximization problems). It is usually formulated as follows:

$F(X) = c + \sum_{i=1}^{n} q_i x_i + \sum_{i=1}^{n} \sum_{j=1}^{n} Q_{ij} x_i x_j$

subject to

$\sum_{i=1}^{n} a_{ij} x_i = b_j, \quad j = 1, 2, \ldots, m; \qquad x_i \ge 0, \quad i = 1, 2, \ldots, n$

where $c$, $q_i$, $Q_{ij}$, $a_{ij}$, and $b_j$ are constants.
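A minimal sketch of a small convex QP, solved here with SciPy's general-purpose SLSQP routine; the data Q and q and the single constraint are assumptions for illustration:

```python
# A minimal sketch (assumes SciPy): quadratic objective, linear constraints.
import numpy as np
from scipy.optimize import minimize

Q = np.array([[2.0, 0.0],
              [0.0, 2.0]])                 # positive definite -> convex QP
q = np.array([-2.0, -5.0])

f = lambda x: 0.5 * x @ Q @ x + q @ x
cons = [{"type": "ineq", "fun": lambda x: 3 - x[0] - x[1]}]  # x1 + x2 <= 3
res = minimize(f, x0=np.zeros(2), bounds=[(0, None), (0, None)],
               constraints=cons, method="SLSQP")
print(res.x)   # -> approx. [0.75, 2.25]
```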
Classification based on the permissible values of the decision variables
Under this classification, optimization problems are classified as integer
programming problems (some or all of the design variables are restricted to take
only integer values) and real-valued programming problems.
Classification based on the deterministic nature of the variables
Under this classification, optimization problems are classified as deterministic
programming problems and stochastic programming problems, in which some or
all of the parameters are random variables.
Classification based on separability of the functions
A function is said to be separable if it can be expressed as the sum of single-variable
functions. A separable programming problem can be stated in standard form as:

Find X which minimizes $f(X) = \sum_{i=1}^{n} f_i(x_i)$

subject to

$g_j(X) = \sum_{i=1}^{n} g_{ij}(x_i) \le b_j, \quad j = 1, 2, \ldots, m$

where bj is a constant.
Classification based on the number of objective
functions
Under this classification objective functions can be classified as
single and multiobjective programming problems.
(i) Single-objective programming problem, in which there is only a single
objective function.
(ii) Multi-objective programming problem.
A multiobjective programming problem can be stated as follows:

Find X which minimizes $f_1(X), f_2(X), \ldots, f_k(X)$

subject to

$g_j(X) \le 0, \quad j = 1, 2, \ldots, m$

where f1, f2, ..., fk denote the objective functions to be minimized simultaneously.
Classical and Advanced
Techniques for
Optimization
Classical Optimization Techniques
The classical optimization techniques are useful in finding the
optimum solution or unconstrained maxima or minima of continuous
and differentiable functions.
These are analytical methods and make use of differential calculus in
locating the optimum solution.
The classical methods have limited scope in practical applications, as many
practical problems involve objective functions that are not continuous
and/or differentiable.
Yet, the study of these classical techniques of optimization forms a basis
for developing most of the numerical techniques that have evolved into
advanced techniques more suitable to today's practical problems.
Classical Optimization Techniques (contd.)
These methods assume that the function is twice differentiable with respect
to the design variables and that the derivatives are continuous.
Numerical Methods of Optimization
Numerical Methods of Optimization (contd.)
Stochastic programming: studies the case in which some of the
constraints depend on random variables.
Dynamic programming: studies the case in which the optimization
strategy is based on splitting the problem into smaller sub-problems.
Combinatorial optimization: is concerned with problems where the set
of feasible solutions is discrete or can be reduced to a discrete one.
Infinite-dimensional optimization: studies the case when the set of
feasible solutions is a subset of an infinite-dimensional space, such as a
space of functions.
Constraint satisfaction: studies the case in which the objective function
f is constant (this is used in artificial intelligence, particularly in
automated reasoning).
Advanced Optimization Techniques
Hill climbing: it is a graph search algorithm where the current path is
extended with a successor node which is closer to the solution than the
end of the current path.
In simple hill climbing, the first closer node is chosen whereas in
steepest ascent hill climbing all successors are compared and the
closest to the solution is chosen. Both forms fail if there is no closer node.
This may happen if there are local maxima in the search space which are
not solutions.
Hill climbing is used widely in artificial intelligence fields, for reaching a
goal state from a starting node. Choice of next node/ starting node can be
varied to give a number of related algorithms.
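A minimal sketch of the two variants described above, on an assumed one-dimensional integer search space:

```python
# A minimal sketch: simple hill climbing takes the first improving
# neighbor; steepest-ascent would take the best one (shown in a comment).
def hill_climb(f, start, neighbors):
    current = start
    while True:
        better = [n for n in neighbors(current) if f(n) > f(current)]
        if not better:
            return current        # no closer node: stop (possibly a local maximum)
        current = better[0]       # simple variant: first improving neighbor
        # steepest ascent: current = max(better, key=f)

f = lambda x: -(x - 7)**2         # single peak at x = 7 (made-up example)
neighbors = lambda x: [x - 1, x + 1]
print(hill_climb(f, start=0, neighbors=neighbors))   # -> 7
```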
Simulated annealing
The name and inspiration come from the annealing process in metallurgy,
a technique involving heating and controlled cooling of a material to
increase the size of its crystals and reduce their defects.
The heat causes the atoms to become unstuck from their initial positions (a
local minimum of the internal energy) and wander randomly through
states of higher energy;
the slow cooling gives them more chances of finding configurations with
lower internal energy than the initial one.
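The analogy translates into a short algorithm. A minimal sketch for minimizing f(x), where the cooling schedule, neighbor move and temperature values are all illustrative assumptions:

```python
# A minimal sketch of simulated annealing: accept worse moves with
# probability exp(-delta/T), and cool the temperature T slowly.
import math
import random

def anneal(f, x0, T0=1.0, Tmin=1e-4, alpha=0.95, steps=100):
    x, T = x0, T0
    while T > Tmin:
        for _ in range(steps):
            x_new = x + random.uniform(-1.0, 1.0)      # random neighbor
            delta = f(x_new) - f(x)
            if delta < 0 or random.random() < math.exp(-delta / T):
                x = x_new                              # accept the move
        T *= alpha                                     # slow cooling
    return x

f = lambda x: (x - 3)**2 + math.sin(5 * x)             # multimodal example
print(anneal(f, x0=-10.0))                             # near the global minimum
```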
Genetic algorithms
Genetic algorithms are typically implemented as a computer simulation
in which a population of abstract representations (called chromosomes)
of candidate solutions (called individuals) to an optimization problem
evolves toward better solutions.
The new population is then used in the next iteration of the algorithm.
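A minimal sketch of that loop (selection, one-point crossover, mutation) on the classic OneMax problem; population size, mutation rate and representation are illustrative assumptions:

```python
# A minimal sketch of a genetic algorithm: evolve bit-string chromosomes
# toward higher fitness (here OneMax: the count of 1-bits).
import random

def ga(fitness, n_genes=10, pop_size=30, generations=50, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]              # selection: fitter half survives
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_genes)     # one-point crossover
            child = [1 - g if random.random() < p_mut else g   # mutation
                     for g in a[:cut] + b[cut:]]
            children.append(child)
        pop = parents + children                   # the new population
    return max(pop, key=fitness)

print(ga(fitness=sum))   # -> usually the all-ones chromosome
```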
Ant colony optimization
In the real world, ants (initially) wander randomly, and upon finding food return to
their colony while laying down pheromone trails. If other ants find such a path,
they are likely not to keep traveling at random, but instead follow the trail laid by
earlier ants, returning and reinforcing it if they eventually find food.
Over time, however, the pheromone trail starts to evaporate, thus reducing its
attractive strength. The more time it takes for an ant to travel down the path and
back again, the more time the pheromones have to evaporate.
A short path, by comparison, gets marched over faster, and thus the pheromone
density remains high.
Ant colony optimization (contd.)
Thus, when one ant finds a good (short) path from the colony to a food source,
other ants are more likely to follow that path, and such positive feedback
eventually leaves all the ants following a single path.
The idea of the ant colony algorithm is to mimic this behavior with "simulated
ants" walking around the search space representing the problem to be solved.