
Genetic Algorithm

The document discusses solving optimization problems. It defines optimization as finding the optimum or minimum/maximum value of a function. It then defines an optimization problem formally and describes different types of optimization problems such as constrained, unconstrained, linear, nonlinear, integer programming problems. Finally, it discusses traditional approaches used to solve optimization problems, including analytical methods, numerical methods and specialized algorithms like linear programming, dynamic programming etc. It provides an example of using analytical methods to find optimum points of a polynomial function.


Solving Optimization Problems

Debasis Samanta

IIT Kharagpur
dsamanta@sit.iitkgp.ernet.in

06.03.2018

Debasis Samanta (IIT Kharagpur) Soft Computing Applications (IT60108) 06.03.2018 1 / 22


Introduction to Solving Optimization Problems

Today’s Topics
Concept of optimization problem
Defining an optimization problem
Various types of optimization problems
Traditional approaches to solve optimization problems
Limitations of the traditional approaches



Concept of optimization problem

Optimization: finding the optimum value, that is, either the minimum or the maximum, of a function

y = F(x)

Example:
2x − 6y = 11
or, equivalently,
y = (2x − 11) ÷ 6
Can we determine an optimum value for y?
Similarly, consider the following case:

3x + 4y ≥ 56

These are really not optimization problems!



Defining an optimization problem
Suppose we are to design an optimal pointer made of some material with
density ρ. The pointer should be as large as possible, with no mechanical
breakage, and the deflection at the pointing end should be negligible.
The task is to select the best pointer out of all possible pointers.

Diameter: d
Length: l
Strength of the pointer: s

Mass of the stick (a cone of base diameter d and length l):
M = (1/3) π (d/2)² · l · ρ = (1/12) π d² · l · ρ
Deflection: δ = f1(d, l, ρ)
Strength: s = f2(d, l, ρ)
Defining an optimization problem

The problem can be stated as:

Objective function
Minimize M = (1/12) π d² · l · ρ
Subject to
δ ≤ δth, where δth = allowable deflection
s ≥ sth, where sth = required strength
and
dmin ≤ d ≤ dmax
lmin ≤ l ≤ lmax
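The constrained formulation above can be sketched in code. This is a minimal illustration, not the slides' method: the deflection and strength functions standing in for f1 and f2 below are hypothetical, as are the threshold names.

```python
import math

# Hypothetical stand-ins for f1 and f2 in the slides; the real functions
# depend on the material and the mechanics of the pointer.
def deflection(d, l, rho):
    return l**3 / (d**4 * 1e6)   # assumed form, for illustration only

def strength(d, l, rho):
    return d**2 * 1e3 / l        # assumed form, for illustration only

def mass(d, l, rho):
    # M = (1/12) * pi * d^2 * l * rho  (objective function from the slide)
    return math.pi * d**2 * l * rho / 12.0

def feasible(d, l, rho, delta_th, s_th, d_bounds, l_bounds):
    """Check every constraint of the pointer design problem."""
    return (deflection(d, l, rho) <= delta_th
            and strength(d, l, rho) >= s_th
            and d_bounds[0] <= d <= d_bounds[1]
            and l_bounds[0] <= l <= l_bounds[1])
```

An optimizer would then search over the feasible (d, l) pairs for the one minimizing `mass`.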



Defining Optimization Problem

An optimization problem can be formally defined as follows:

Maximize (or Minimize)
yi = fi(x1, x2, · · · , xn), where i = 1, 2, · · · , k and k ≥ 1
Subject to
gi(x1, x2, · · · , xn) ROPi ci, where i = 1, 2, · · · , j and j ≥ 0
Here, ROPi denotes some relational operator and the ci (i = 1, 2, · · · , j) are constants.
and
xi ROPi di, for all i = 1, 2, · · · , n (n ≥ 1)
Here, xi denotes a design parameter and di is some constant.



Some Benchmark Optimization Problems

Exercises: Mathematically define the following optimization problems.


Traveling Salesman Problem
Knapsack Problem
Graph Coloring Problem
Job Machine Assignment Problem
Coin Change Problem
Binary search tree construction problem



Types of Optimization Problem

Unconstrained optimization problem


Problem is without any functional constraint.

Example:
Minimize y = f(x1, x2) = (x1 − 5)² + (x2 − 3)³
where x1 , x2 ≥ 0

Note: Here, gj = NULL



Types of Optimization Problem

Constrained optimization problem


An optimization problem with one or more functional constraints.

Example:
Maximize y = f (x1 , x2 , · · · , xn )

Subject to
gi(x1, x2, · · · , xn) ≥ ci
where i = 1, 2, · · · , k and k > 0
and
x1 , x2 , · · · , xn are design parameters.



Types of Optimization Problem

Integer Programming problem


If all the design variables take some integer values.

Example:
Minimize y = f (x1 , x2 ) = 2x1 + x2
Subject to
x1 + x2 ≤ 3
5x1 + 2x2 ≤ 9
and
x1 , x2 are integer variables.
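A brute-force sketch of this integer program, assuming (since the slide gives no explicit bounds) that x1 and x2 are non-negative:

```python
# Enumerate all feasible non-negative integer points of the program:
#   minimize 2*x1 + x2  subject to  x1 + x2 <= 3  and  5*x1 + 2*x2 <= 9.
# Non-negativity is an assumption; the slide does not state bounds.
def solve_ip():
    best = None
    for x1 in range(0, 10):          # 5*x1 <= 9 already caps x1 at 1
        for x2 in range(0, 10):      # x1 + x2 <= 3 caps x2 at 3
            if x1 + x2 <= 3 and 5 * x1 + 2 * x2 <= 9:
                y = 2 * x1 + x2
                if best is None or y < best[0]:
                    best = (y, x1, x2)
    return best                      # (objective, x1, x2)
```

Under that assumption the minimum of 2x1 + x2 is trivially 0 at (0, 0); without lower bounds the problem would be unbounded, which is why the assumption matters.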



Types of Optimization Problem

Real-valued problem
If all the design variables are bound to take real values.

Mixed-integer programming problem


Some of the design variables are integers and the rest of the variables take
real values.



Types of Optimization Problem

Linear optimization problem


Both the objective function and all constraints are linear functions of the
design variables.
Example:
Maximize y = f (x1 , x2 ) = 2x1 + x2
Subject to
x1 + x2 ≤ 3
5x1 + 2x2 ≤ 10
and
x1 , x2 ≥ 0
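Because a linear program attains its optimum (if one exists) at a vertex of the feasible region, this example can be checked by enumerating the intersections of constraint pairs. A small sketch, not from the slides:

```python
from itertools import combinations

# Constraints written as a*x1 + b*x2 <= c, including x1 >= 0 and x2 >= 0.
constraints = [(1, 1, 3), (5, 2, 10), (-1, 0, 0), (0, -1, 0)]

def is_feasible(x1, x2, eps=1e-9):
    return all(a * x1 + b * x2 <= c + eps for a, b, c in constraints)

def solve_lp():
    """Enumerate vertices (pairwise constraint intersections) and keep
    the feasible one maximizing 2*x1 + x2."""
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue                      # parallel lines: no vertex
        x1 = (c1 * b2 - c2 * b1) / det    # Cramer's rule
        x2 = (a1 * c2 - a2 * c1) / det
        if is_feasible(x1, x2):
            y = 2 * x1 + x2
            if best is None or y > best[0]:
                best = (y, x1, x2)
    return best
```

The enumeration reports the optimum y = 13/3 at (x1, x2) = (4/3, 5/3), the intersection of x1 + x2 = 3 with 5x1 + 2x2 = 10.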



Types of Optimization Problem

Non-linear optimization problem

If either the objective function or any of the functional constraints is a
non-linear function of the design variables.
Example:
Maximize y = f(x1, x2) = x1² + 5x2³
Subject to
x1⁴ + 3x2² ≤ 629
2x1³ + 4x2³ ≤ 133
and
x1, x2 ≥ 0



Traditional approaches to solve optimization
problems

Optimization Methods

Linear Programming Methods
    Graphical Method
    Simplex Method

Non-linear Programming Methods
    Single-variable methods
        Numerical (Elimination) Methods: Unrestricted search, Exhaustive search,
        Dichotomous Search, Fibonacci method, Golden Section method
        Analytical (Interpolation) Methods: Quadratic, Cubic, Direct root
    Multi-variable methods
        Constrained Optimization: Lagrangian method
        Unconstrained Optimization: Random Walk, Univariate Method,
        Pattern Search, Steepest Descent, Conjugate Gradient,
        Quasi-Newton, Variable Metric

Specialized Algorithms
    Dynamic Programming
    Branch & Bound
    Greedy Method
    Divide & Conquer


Example : Analytical Method

Suppose the objective function is y = f(x), where f(x) is a polynomial of
degree m (m > 0).

If y′ = f′(x) = 0 at some x = x∗, then x∗ is a candidate optimum point:
y may be optimum (either a minimum or a maximum) at x = x∗.
If y′ = f′(x) ≠ 0 at x = x∗, then there is no optimum at x = x∗.
A stationary point that turns out to be neither a minimum nor a maximum is
an inflection point, also called a saddle point.



Example : Analytical Method

Note:
An inflection point is a point that is neither a maximum nor a minimum.
The following figure illustrates minimum, maximum and saddle points.

[Figure: a curve with a maximum, saddle points, and a minimum, with
stationary points marked at x1∗ and x2∗ along the x-axis]


Example : Analytical Method

Let us generalize the concept of the analytical method.

If y = f(x) is a polynomial of degree m, then there are at most m − 1
candidate points (the roots of f′(x) = 0) to be checked for optimum or
saddle points.
Let yⁿ denote the n-th derivative of y.
To investigate the nature of a candidate point x∗, determine the first
non-zero higher-order derivative
yⁿ = fⁿ(x = x∗)
There are two cases.
Case 1:
If the first non-zero derivative yⁿ occurs at an odd n, then x∗ is an inflection point.
Case 2:
If it occurs at an even n, then there exists an optimum point at x∗.
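For polynomials, this first-non-zero-derivative test is easy to mechanize. A minimal sketch; the coefficient-list representation and function names are illustrative choices, not from the slides:

```python
# A polynomial is a list of coefficients, lowest degree first:
# [c0, c1, c2, ...] represents c0 + c1*x + c2*x^2 + ...

def deriv(poly):
    # d/dx of sum(c_k x^k) = sum(k*c_k x^(k-1))
    return [k * c for k, c in enumerate(poly)][1:]

def evaluate(poly, x):
    return sum(c * x**k for k, c in enumerate(poly))

def classify(poly, x_star, eps=1e-9):
    """Classify a stationary point x_star by the first non-zero derivative."""
    p = deriv(poly)
    assert abs(evaluate(p, x_star)) < eps, "x_star must satisfy f'(x) = 0"
    n = 1
    while True:
        p = deriv(p)
        n += 1
        if not p:
            return "flat"                # every higher derivative vanishes
        v = evaluate(p, x_star)
        if abs(v) > eps:
            if n % 2 == 1:
                return "inflection"      # first non-zero derivative at odd order
            return "minimum" if v > 0 else "maximum"
```

For example, f(x) = x⁴ has f′ = f″ = f‴ = 0 at x = 0 but f⁗(0) = 24 > 0, so x = 0 is a minimum, while f(x) = x³ gives an inflection point at x = 0.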



Example : Analytical Method

To decide whether the point x∗ is a minimum or a maximum, we examine the
sign of that first non-zero even-order derivative yⁿ = fⁿ(x = x∗).
There are two sub-cases:
Case 2.1:
If yⁿ = fⁿ(x = x∗) is positive, then x∗ is a local minimum point.
Case 2.2:
If yⁿ = fⁿ(x = x∗) is negative, then x∗ is a local maximum point.

[Figure: alternating minima x1∗, x2∗, x3∗, x4∗ and maxima z1∗, z2∗, z3∗, z4∗
along the x-axis]

If the derivative at x∗ is zero, we repeat the test with the next
higher-order derivative.
Question
y = f(x)
d²y/dx² = +ve ⇒ x = x1∗
d⁴y/dx⁴ = −ve ⇒ x = x2∗
d⁶y/dx⁶ = ±ve ⇒ x = x3∗

[Figure: optimal solution at x = x2∗, with x1∗ and x3∗ also marked on the x-axis]

Does the analytical method solve optimization problems with multiple input
variables?
If “Yes”, then how?
If “No”, then why not?
Exercise

Determine the minimum, maximum or saddle points, if any, of the following
single-variable function:

f(x) = x²/2 + 125/x

for real values of x.
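One way to check an answer to this exercise: f′(x) = x − 125/x², which vanishes when x³ = 125, i.e. at x = 5; and f″(5) = 1 + 250/125 = 3 > 0, so x = 5 is a local minimum. A quick numeric check:

```python
def f(x):  return x**2 / 2 + 125 / x
def f1(x): return x - 125 / x**2        # f'(x)
def f2(x): return 1 + 250 / x**3        # f''(x)

# f'(x) = 0  =>  x^3 = 125  =>  x = 5 (for real x != 0)
x_star = 5.0
assert abs(f1(x_star)) < 1e-12          # stationary point
# f''(5) = 3 > 0, so x = 5 is a local minimum, with f(5) = 37.5
```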



Duality Principle

Principle
A minimization (maximization) problem is said to have a dual problem if it
is converted into a maximization (minimization) problem.
The usual conversions from maximization ⇔ minimization are:
y = f(x) ⇔ y∗ = −f(x)
y = f(x) ⇔ y∗ = 1/f(x)
The second conversion is applicable only when f(x) > 0.

[Figure: a maximization problem y = f(x) and the corresponding minimization
problem y∗]
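The negation form of the duality principle is easy to demonstrate: maximizing f over any search space gives the same point as minimizing −f. A toy sketch; the function and grid are illustrative:

```python
# Duality in code: the argmax of f equals the argmin of -f.
def argmax(f, grid):
    return max(grid, key=f)

def argmin(f, grid):
    return min(grid, key=f)

f = lambda x: -(x - 2) ** 2 + 5            # unique peak at x = 2
grid = [i / 10 for i in range(-100, 101)]  # a simple stand-in search space

assert argmax(f, grid) == argmin(lambda x: -f(x), grid)
```

The 1/f(x) form would behave the same way, but only where f(x) > 0.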



Limitations of the traditional optimization approach

Computationally expensive.
For a discontinuous objective function, the methods may fail.
The methods may not be suitable for parallel computing.
Discrete (integer) variables are difficult to handle.
The methods are not necessarily adaptive.

Soft computing techniques have evolved to address the above-mentioned
limitations of solving optimization problems with traditional approaches.



Evolutionary Algorithms

These are algorithms that follow some biological or physical behaviors:

Biological behaviors:
Genetics and Evolution → Genetic Algorithms (GA)
Behavior of ant colonies → Ant Colony Optimization (ACO)
Human nervous system → Artificial Neural Networks (ANN)

In addition, there are algorithms inspired by physical behaviors:
Annealing process → Simulated Annealing (SA)
Swarming of particles → Particle Swarm Optimization (PSO)
Learning → Fuzzy Logic (FL)



Genetic Algorithm

It is a subset of the evolutionary algorithms, which also include:

Ant Colony Optimization
Particle Swarm Optimization

It models biological processes:

Genetics
Evolution

to optimize highly complex objective functions that:
are very difficult to model mathematically
are NP-hard (also called combinatorial optimization) problems, which are
computationally very expensive
involve a large number of parameters (discrete and/or continuous)



Background of Genetic Algorithm

First introduced by Prof. John Holland (University of Michigan, USA, 1965).
However, the first article on GA was published in 1975.

The principles of GA are based on two fundamental biological processes:

Genetics: Gregor Johann Mendel (1865)
Evolution: Charles Darwin (1875)



A brief account on genetics
The basic building blocks of living bodies are cells. Each cell carries
the basic unit of heredity, called a gene.

[Figure: a cell showing the nucleus, chromosomes and other cell bodies]

For a particular species, the number of chromosomes is fixed.

Examples
Mosquito: 6
Frog: 26
Human: 46
Goldfish: 94
etc.
A brief account on genetics
Genetic code

The spiral double-helix molecule that carries genetic instructions is called DNA.

For a species, the DNA code is unique, that is, it varies from one
individual to another.
The DNA code (which passes some characteristics from one generation to the
next) is used as a biometric trait.
A brief account on genetics

Reproduction

[Figure: two gametes x and y combine through cell division into a diploid
organism cell]

A reproductive cell (gamete) is haploid: it has half the number of
chromosomes. In reproduction, the chromosomes from both haploid gametes are
combined, so the organism's diploid cell has the full number of chromosomes.


A brief account on genetics
Crossing over

[Figure: crossing over between chromosomes, joined at the kinetochore]

Information from the body cells of two different organisms is combined, so
that diversity in information is possible.
Random crossover points make practically unlimited diversity possible.


A brief account on evolution
Evolution : Natural Selection

Four primary premises:

1 Information propagation: An offspring has many of the characteristics of
its parents (i.e. information passes from parent to offspring). [Heredity]
2 Population diversity: There is variation in characteristics in the next
generation. [Diversity]
3 Survival for existence: Only a small percentage of the offspring produced
survive to adulthood. [Selection]
4 Survival of the best: Whether an offspring survives depends on its
inherited characteristics. [Ranking]



A brief account on evolution

Mutation:

Mutation forcefully keeps the process dynamic when the variation in the
population is about to become stable.



Biological process : A quick overview

[Figure: Genetics — an overview of the biological process]



Working of Genetic Algorithm

Definition of GA:
A genetic algorithm is a population-based probabilistic search and
optimization technique that works based on the mechanisms of natural
genetics and natural evolution.



Framework of GA

[Flowchart: Start → Initial Population → Converge? If No: Selection →
Reproduction → back to the convergence check. If Yes: Stop.]

Note:
An individual in the population corresponds to a possible solution.


Working of Genetic Algorithm

Note:

1 GA is an iterative process.
2 It is a searching technique.
3 The working cycle runs with or without convergence.
4 A solution is not necessarily guaranteed. Usually, the algorithm is
terminated with a local optimum.



Framework of GA: A detailed view

[Flowchart:
Start
Define parameters
Parameter representation
Create (initialize) the population
Apply the cost function to each member of the population
Evaluate the fitness
Converge? If No:
    Selection
    Select mates
    Reproduction: Crossover, Mutation, Inversion
    then re-evaluate the fitness and check convergence again
If Yes: Stop.]


Optimization problem solving with GA

For the optimization problem, identify the following:

Objective function(s)

Constraint(s)

Input parameters

Fitness evaluation (it may be algorithm or mathematical formula)

Encoding

Decoding
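Encoding and decoding are often realized with fixed-length bit strings. A common sketch, assuming a real parameter confined to a range [lo, hi]; the names and ranges here are illustrative, not from the slides:

```python
# Map a real parameter x in [lo, hi] to an n-bit string and back.
# Quantization error is (hi - lo) / (2^n_bits - 1) at worst.
def encode(x, lo, hi, n_bits):
    levels = 2 ** n_bits - 1
    k = round((x - lo) / (hi - lo) * levels)
    return format(k, f"0{n_bits}b")

def decode(bits, lo, hi):
    levels = 2 ** len(bits) - 1
    return lo + int(bits, 2) / levels * (hi - lo)
```

For example, `decode(encode(3.0, 0.0, 10.0, 16), 0.0, 10.0)` recovers a value within about 1.5e-4 of 3.0; more bits give finer resolution.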



GA Operators

In fact, a GA implementation involves the realization of the following
operations:

1 Encoding: How to represent a solution to fit the GA framework.
2 Convergence: How to decide the termination criterion.
3 Mating pool: How to generate the next solutions.
4 Fitness evaluation: How to evaluate a solution.
5 Crossover: How to make a diverse set of next solutions.
6 Mutation: How to explore other solutions.
7 Inversion: How to move from one optimum to another.



Different GA Strategies

Simple Genetic Algorithm (SGA)

Steady State Genetic Algorithm (SSGA)

Messy Genetic Algorithm (MGA)



Simple GA

[Flowchart:
Start
Create an initial population of size N
Evaluate each individual
Convergence criteria met?
If Yes: return the individual(s) with the best fitness value, then Stop.
If No:
    Select Np individuals (with repetition)
    Create the mating pool randomly (pairs of parents for generating new
    offspring)
    Reproduction:
        Perform crossover and create new offspring
        Mutate the offspring
        Perform inversion on the offspring
    Replace all individuals in the last generation with the newly created
    offspring, then re-evaluate and check convergence again.]
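The flow above can be condensed into a minimal Simple GA sketch. Everything here (the fitness function, chromosome length, rates, and tournament selection as the selection scheme) is an illustrative choice, not prescribed by the slides; inversion is omitted for brevity:

```python
import random

random.seed(0)
N, BITS, GENERATIONS, P_MUT = 40, 16, 60, 0.02

def fitness(bits):
    # Maximize f(x) = x * (1 - x) for x in [0, 1]; the optimum is at x = 0.5.
    x = int(bits, 2) / (2 ** BITS - 1)
    return x * (1 - x)

def random_individual():
    return "".join(random.choice("01") for _ in range(BITS))

def select(pop):
    # Tournament selection of size 2 -- one of several valid schemes.
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    cut = random.randrange(1, BITS)         # single-point crossover
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def mutate(bits):
    # Flip each bit independently with probability P_MUT.
    return "".join(b if random.random() > P_MUT else "10"[int(b)] for b in bits)

def run():
    pop = [random_individual() for _ in range(N)]
    for _ in range(GENERATIONS):
        nxt = []
        while len(nxt) < N:
            c1, c2 = crossover(select(pop), select(pop))
            nxt += [mutate(c1), mutate(c2)]
        pop = nxt[:N]                       # replace the whole generation
    return max(pop, key=fitness)

best = run()
# best should decode to an x near 0.5, where the fitness peaks at 0.25
```

Note the Simple-GA hallmark in `run()`: the entire population is replaced every generation, in contrast to the steady-state variant described later.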



Important parameters involved in Simple GA

SGA Parameters

Initial population size: N
Size of the mating pool: Np = p% of N
Convergence threshold: δ
Mutation rate: μ
Inversion rate: η
Crossover rate: ρ



Salient features in SGA

Simple GA features:

Non-overlapping generations (the entire previous generation is replaced by
the newly created offspring).

Computationally expensive.

Good when the initial population size is large.

In general, it gives better results.

Selection is biased toward more highly fit individuals; hence, the average
fitness of the overall population is expected to increase over successive
generations.
The best individual may appear in any generation.



Steady State Genetic Algorithm (SSGA)
[Flowchart:
Start
Generate an initial population of size N
Evaluate each individual
Repeat:
    Select two individuals without repetition
    Apply crossover, mutation and inversion
    Reject any offspring that duplicates an existing individual
    Evaluate the offspring
    If the offspring are better than the worst individuals, replace the
    worst individuals with the offspring
until the convergence criterion is met
Return the solutions
Stop]
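One iteration of the steady-state loop above might be sketched as follows; the bit-string encoding and the ones-counting fitness are illustrative stand-ins:

```python
import random

def fitness(bits):
    # Illustrative fitness: count of 1-bits in the chromosome.
    return bits.count("1")

def steady_state_step(pop, p_mut=0.05):
    """One SSGA iteration: two offspring, duplicate rejection, and
    replacement of the worst individuals when the offspring are better."""
    p1, p2 = random.sample(pop, 2)              # select without repetition
    cut = random.randrange(1, len(p1))          # single-point crossover
    kids = [p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]]
    kids = ["".join(b if random.random() > p_mut else "10"[int(b)]
                    for b in k) for k in kids]  # bitwise mutation
    kids = [k for k in kids if k not in pop]    # reject duplicates
    pop = sorted(pop, key=fitness)              # worst individual first
    for k in kids:
        if fitness(k) > fitness(pop[0]):        # better than current worst?
            pop[0] = k
            pop = sorted(pop, key=fitness)
    return pop
```

Because only strictly better offspring displace the worst member, the population's total fitness never decreases across iterations, which is the small "generation gap" the next slide describes.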



Salient features in Steady-state GA

SSGA Features:

The generation gap is small: only two offspring are produced in one
generation.

It is applicable when:

the population size is small
chromosomes are of longer length
the evaluation operation is computationally less expensive (compared to
duplicate checking)



Salient features in Steady-state GA

Limitations of SSGA:

There is a chance of getting stuck at a local optimum if
crossover/mutation/inversion is not strong enough to diversify the
population.

Premature convergence may result.

It is susceptible to stagnation: inferior individuals are neglected or
removed, and the search keeps making trials for a very long period of time
without any gain (i.e. a long period of localized search).



***

Any Questions??

