Genetic Algorithms - Knapsack Problem
Genetic Algorithms
• An algorithm is a set of instructions that is repeated to
solve a problem.
• A genetic algorithm conceptually follows steps inspired
by the biological processes of evolution.
• Genetic algorithms follow the idea of SURVIVAL OF THE
FITTEST: better and better solutions evolve from
previous generations until a near-optimal solution is
obtained.
• Also known as evolutionary algorithms, genetic
algorithms demonstrate self-organization and adaptation
similar to the way the fittest biological organisms
survive and reproduce.
Genetic Algorithms
• A genetic algorithm is an iterative procedure that
represents its candidate solutions as strings of genes
called chromosomes.
• Genetic Algorithms are often used to improve the
performance of other AI methods such as expert
systems or neural networks.
• The method learns by producing offspring that are
better and better as measured by a fitness function,
which is a measure of the objective to be obtained
(maximum or minimum).
How offspring are produced
• Reproduction - Through reproduction, genetic
algorithms produce new generations of improved
solutions by selecting parents with higher fitness
ratings, or by giving such parents a greater probability
of being chosen as contributors, combined with an
element of random selection.
• Crossover - Many genetic algorithms use strings of
binary symbols for chromosomes, as in our Knapsack
example, to represent solutions. Crossover means
choosing a random position in the string (say, after 2
digits) and exchanging the segments to the right (or
left) of this point with those of another string
partitioned in the same way, producing two new offspring.
Crossover Example
• Parent A: 011011
• Parent B: 101100
• Mate the parents by splitting each string between the
second and third digits (the position is randomly selected):
• 01*1011   10*1100
• Exchanging the segments after the crossover point
produces the offspring 011100 and 101011.
Crossover Example
Parent 1:  1 0 1 0 1 1 1
Parent 2:  1 1 0 0 0 1 1
Child 1:   1 0 1 0 0 1 1
Child 2:   1 1 0 0 1 1 0   (last gene flipped by mutation)
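A minimal Python sketch of these two operators, assuming chromosomes are lists of 0s and 1s; the crossover point and mutation rate below are illustrative choices, not values taken from the slides:

    import random

    def crossover(parent_a, parent_b, point=None):
        # One-point crossover: swap the tails of the two parents after 'point'.
        if point is None:
            point = random.randint(1, len(parent_a) - 1)  # random cut position
        child_1 = parent_a[:point] + parent_b[point:]
        child_2 = parent_b[:point] + parent_a[point:]
        return child_1, child_2

    def mutate(chromosome, rate=0.05):
        # Bit-flip mutation: flip each gene independently with probability 'rate'.
        return [1 - g if random.random() < rate else g for g in chromosome]

    # Reproduce the Parent A / Parent B example (cut after the second digit):
    a = [0, 1, 1, 0, 1, 1]
    b = [1, 0, 1, 1, 0, 0]
    c1, c2 = crossover(a, b, point=2)  # c1 -> 011100, c2 -> 101011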
Outline of the Basic Genetic Algorithm
1. [Start] Generate a random population of n
chromosomes (suitable solutions for the problem).
2. [Fitness] Evaluate the fitness f(x) of each
chromosome x in the population.
3. [New population] Create a new population by
repeating the following steps until the new population
is complete.
4. [Selection] Select two parent chromosomes
from the population according to their fitness (the
better the fitness, the bigger the chance of being selected).
The idea is to choose the better parents.
Outline of the Basic Genetic Algorithm
5. [Crossover] With a crossover probability, cross over the
parents to form new offspring (children). If no crossover
is performed, the offspring are exact copies of the parents.
6. [Mutation] With a mutation probability, mutate the new
offspring at each locus (position in the chromosome).
7. [Accepting] Place the new offspring in the new
population.
8. [Replace] Use the newly generated population for a further
run of the algorithm.
9. [Test] If the end condition is satisfied, stop and return
the best solution in the current population.
10. [Loop] Go to step 2.
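A minimal Python sketch of this loop, assuming binary chromosomes, roulette-wheel (fitness-proportionate) selection, a user-supplied fitness function, and a fixed generation count as the end condition; the population size and probabilities are illustrative:

    import random

    def run_ga(fitness, n_genes, pop_size=20, generations=100,
               p_crossover=0.8, p_mutation=0.02):
        # 1. Start: random population of binary chromosomes
        pop = [[random.randint(0, 1) for _ in range(n_genes)]
               for _ in range(pop_size)]
        for _ in range(generations):                        # 10. Loop
            scores = [fitness(c) for c in pop]              # 2. Fitness
            weights = [max(s, 1e-6) for s in scores]        # keep weights positive
            new_pop = []
            while len(new_pop) < pop_size:                  # 3. New population
                p1, p2 = random.choices(pop, weights=weights, k=2)  # 4. Selection
                if random.random() < p_crossover:           # 5. Crossover
                    point = random.randint(1, n_genes - 1)
                    c1 = p1[:point] + p2[point:]
                    c2 = p2[:point] + p1[point:]
                else:
                    c1, c2 = p1[:], p2[:]
                for child in (c1, c2):
                    for i in range(n_genes):                # 6. Mutation at each locus
                        if random.random() < p_mutation:
                            child[i] = 1 - child[i]
                    new_pop.append(child)                   # 7. Accepting
            pop = new_pop[:pop_size]                        # 8. Replace
        return max(pop, key=fitness)                        # 9. Test: return the best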
Flow Diagram of the Genetic Algorithm Process
[Flow diagram: Describe Problem → Generate Initial Solutions → ... (loop back on "No" until the end condition is satisfied)]
Example: The Knapsack Problem
• Item: 1 2 3 4 5 6 7
• Benefit: 5 8 3 2 7 9 4
• Weight: 7 8 4 10 4 6 4
• Knapsack holds a maximum of 22 pounds
• Fill it to get the maximum benefit
• Solutions take the form of a string of 1s and 0s
• Solutions are also known as strings of genes, called
chromosomes:
– 1. 0101010
– 2. 1100100
– 3. 0100111
Example: The Knapsack Problem
• We represent a solution as a string of seven 1s and
0s and the fitness function as the total benefit, which
is the sum of the gene values in a string solution
times their representative benefit coefficient.
• The method generates a set of random solutions
(initial parents), uses total benefit as the fitness
function and selects the parents randomly to create
generations of offspring by crossover and mutation.
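A sketch of the chromosome representation and fitness function for this example; treating an overweight string as having zero fitness is one common choice, assumed here rather than stated on the slides:

    BENEFIT = [5, 8, 3, 2, 7, 9, 4]
    WEIGHT  = [7, 8, 4, 10, 4, 6, 4]
    CAPACITY = 22  # maximum of 22 pounds

    def fitness(chromosome):
        # Total benefit of the selected items; 0 if the weight limit is exceeded.
        total_benefit = sum(b for gene, b in zip(chromosome, BENEFIT) if gene)
        total_weight  = sum(w for gene, w in zip(chromosome, WEIGHT) if gene)
        return total_benefit if total_weight <= CAPACITY else 0

Plugged into a GA loop such as the run_ga sketch shown earlier (a hypothetical helper, not part of the slides), this fitness function drives the population toward high-benefit, feasible strings such as 0100111.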
Knapsack Example
• Typically, a string of 1s and 0s can represent a
solution.
• Possible solutions generated by the system
using Reproduction, Crossover, or Mutations
– 1. 0101010
– 2. 1100100
– 3. 0100111
Knapsack Example
Solution 1
Item 1 2 3 4 5 6 7
Solution 0 1 0 1 0 1 0
Benefit 5 8 3 2 7 9 4
Weight 7 8 4 10 4 6 4
• Benefit 8 + 2 + 9 = 19
• Weight 8 + 10 + 6 = 24
Knapsack Example
Solution 2
Item 1 2 3 4 5 6 7
Solution 1 1 0 0 1 0 0
Benefit 5 8 3 2 7 9 4
Weight 7 8 4 10 4 6 4
• Benefit 5 + 8 + 7 = 20
• Weight 7 + 8 + 4 = 19
Knapsack Example
Solution 3
Item 1 2 3 4 5 6 7
Solution 0 1 0 0 1 1 1
Benefit 5 8 3 2 7 9 4
Weight 7 8 4 10 4 6 4
• Benefit 8 + 7 + 9 + 4 = 28
• Weight 8 + 4 + 6 + 4 = 22
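The benefit and weight figures above can be reproduced with a short script:

    BENEFIT = [5, 8, 3, 2, 7, 9, 4]
    WEIGHT  = [7, 8, 4, 10, 4, 6, 4]

    for label, bits in [("Solution 1", "0101010"),
                        ("Solution 2", "1100100"),
                        ("Solution 3", "0100111")]:
        genes = [int(b) for b in bits]
        benefit = sum(g * b for g, b in zip(genes, BENEFIT))
        weight  = sum(g * w for g, w in zip(genes, WEIGHT))
        print(label, "benefit =", benefit, "weight =", weight)
    # Prints 19/24, 20/19, and 28/22: only Solutions 2 and 3 stay within the
    # 22-pound limit, and Solution 3 gives the highest benefit.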
Knapsack Example
• Solution 3 is clearly the best of the three candidate
solutions and meets our weight condition; therefore,
items 2, 5, 6, and 7 will be taken on the hiking trip.
We get the most benefit from these items while keeping
the weight at exactly 22 pounds.
• This is a simple example illustrating a genetic
algorithm approach.
Genetic Algorithms
• Genetic Algorithms are a type of machine learning
for representing and solving complex problems.
• They provide a set of efficient, domain-independent
search heuristics for a broad spectrum of
applications.
• A genetic algorithm interprets information that
enables it to reject inferior solutions and accumulate
good ones, and thus it learns about its universe.
Genetic Algorithm Application Areas
Application in Machine Learning
• Machine learning is a key application field of GAs.
• A major area of interest is artificial neural networks (NNs).
• GAs work with NNs in 3 basic ways:
1. Adjust the parameters, i.e., act as the learning algorithm
2. Determine the structure of the neural net, i.e., the number
of neurons in each layer
3. Automatically adjust the parameters of a prototype
learning equation
Two such uses are described next: GA as an alternative to BP
learning, and adaptation of learning rules / control laws by GA.
GA as an Alternative to BP Learning
• BP adjusts the weights.
• It may sometimes get trapped in a local minimum.
• In a GA, mutation helps overcome this.
• Inputs: x1, x2, x3
• Outputs: y1, y2
• First-layer weights: W1, ..., W6
• Second-layer weights: G1, ..., G4
• Desired outputs: (d1, d2)
• The selection (fitness) function is to minimize Z
• Z = [(d1 − y1)^2 + (d2 − y2)^2]^(1/2)
• The chromosome has 10 fields (W1, ..., W6, G1, ..., G4)
• Typical crossover and mutation can be used to evolve the
weights
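A sketch of how the 10-field chromosome could be evaluated. The 3-2-2 network shape is inferred from the weight counts (6 first-layer and 4 second-layer weights), and the sigmoid activation is an assumption, since the slides do not specify one:

    import math, random

    def forward(chrom, x):
        # chrom = [W1..W6, G1..G4]; x = (x1, x2, x3); returns (y1, y2).
        W, G = chrom[:6], chrom[6:]
        sig = lambda v: 1.0 / (1.0 + math.exp(-v))
        h1 = sig(W[0]*x[0] + W[1]*x[1] + W[2]*x[2])  # hidden node 1
        h2 = sig(W[3]*x[0] + W[4]*x[1] + W[5]*x[2])  # hidden node 2
        y1 = sig(G[0]*h1 + G[1]*h2)                  # output node 1
        y2 = sig(G[2]*h1 + G[3]*h2)                  # output node 2
        return y1, y2

    def selection_value(chrom, x, d):
        # Z = [(d1 - y1)^2 + (d2 - y2)^2]^(1/2); the GA minimizes Z.
        y1, y2 = forward(chrom, x)
        return math.sqrt((d[0] - y1)**2 + (d[1] - y2)**2)

    def mutate(chrom, rate=0.1, sigma=0.3):
        # Real-valued chromosome: perturb genes with Gaussian noise.
        return [g + random.gauss(0, sigma) if random.random() < rate else g
                for g in chrom]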
Adaptation of Learning Rules
• Supervised learning must generate the desired output
for a given input.
• Assume a multilayered feed-forward NN is used as the learning
agent.
• Let Ok be the kth output node and It the tth input node,
and let wij be the weight connecting node i to node j.
• The learning rule in general can be written as ∆wij = f(It, Oj, wij)
• wij(t+1) = wij(t) + ∆wij
• Let f(It, Oj, wij) be a function of coefficients a, b, c, d, e
• The fitness of a chromosome (encoding a, b, c, d, e) is measured
by the error signal at the output layer
• After a number of generations, near-optimal values of a, b, c, d, e
can be found, i.e., a new learning rule is framed
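One possible concrete form of the parameterized rule, sketched for illustration; the slides only state that f depends on a, b, c, d, e, so the linear (Hebbian-style) combination below is an assumption:

    def delta_w(I_t, O_j, w_ij, coeffs):
        # Assumed example form of  ∆wij = f(It, Oj, wij)  with coefficients a..e.
        a, b, c, d, e = coeffs
        return a * I_t * O_j + b * I_t + c * O_j + d * w_ij + e

    def update(w_ij, I_t, O_j, coeffs):
        # wij(t+1) = wij(t) + ∆wij
        return w_ij + delta_w(I_t, O_j, w_ij, coeffs)

    # The GA evolves the 5-field chromosome (a, b, c, d, e); a chromosome's
    # fitness is derived from the output-layer error of a network trained
    # with the rule it encodes (lower error -> higher fitness).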
Issues for GA Practitioners