
Genetic and Evolutionary Computation:

Who, What, Where, When, and Why

Stefano Cagnoni
Department of Computer Engineering
University of Parma, Italy
cagnoni@ce.unipr.it

Riccardo Poli
Department of Computer Science
University of Essex, UK
rpoli@essex.ac.uk

Abstract

In this paper, we start by providing a gentle introduction to the field of genetic and evolutionary computation, particularly focusing on genetic algorithms, but also touching upon other areas. We then move on to briefly analyse the geographic distribution of research excellence in this field, focusing our attention specifically on Italian researchers. We then present our own interpretation of where and how genetic and evolutionary computation fits in the broader landscape of artificial intelligence research. We conclude by making a prediction of the future impact of this technology in the short term.

Introduction

Darwinian evolution is probably the most intriguing and powerful mechanism of nature mankind has ever discovered. Its power is evident in the impressive level of adaptation reached by all species of animals and plants in nature. It is intriguing because, despite its simplicity and randomness, it produces incredible complexity in a way that appears to be very directed, almost purposeful. As with other powerful natural phenomena, it is no surprise then that several decades ago a few brilliant researchers in engineering and computer science started wondering whether they could steal the secrets behind Darwinian evolution and use them to solve problems of practical interest in a variety of application domains. These people were pioneers of a new field, which, more than 30 years after its inception, is now big and well established and goes under the name of Genetic and Evolutionary Computation (GEC).

An almost endless number of results and applications of evolutionary algorithms have been reported in the literature that show that the ideas of these pioneers were indeed right. Nowadays evolutionary techniques can routinely solve problems in domains such as automatic design, optimisation, pattern recognition, control and many others.

Copyright © 2006, American Association for Artificial Intelligence (www.aaai.org). All rights reserved.

What is Genetic and Evolutionary Computation

What were the main secrets behind Darwinian evolution that the pioneers of GEC stole to make them the propelling fuel of evolutionary computation processes?

Inheritance: individuals have a genetic representation (in nature, the chromosomes and the DNA) such that it is possible for the offspring of an individual to inherit some of the features of its parent.

Variation: the offspring are not exact copies of the parents; instead, reproduction involves mechanisms that create innovation as new generations are born.

Natural Selection: individuals best adapted to the environment have longer lives and higher chances of mating and spreading their genetic makeup.

Clearly, there is a lot more to natural evolution than these forces. However, as for many other nature-inspired techniques, not all the details are necessary to obtain working models of a natural system. The three ingredients listed above are in fact sufficient to obtain artificial systems showing the main characteristic of natural evolution: the ability to search for highly fit individuals.

For all these ingredients (representation, variation, selection) one can focus on different realisations. For example, in nature variation is produced both through mutations of the genome and through the effect of sexually recombining the genetic material coming from the parents when obtaining the offspring's chromosomes (crossover). This is why many different classes of evolutionary algorithms have been proposed over the years. So, depending on the structures undergoing evolution, on the reproduction strategies and the variation (or genetic) operators adopted, and so on, evolutionary algorithms can be grouped into: Genetic Algorithms (GAs) (Holland, 1975), Genetic Programming (GP) (Koza, 1992), Evolution Strategies (ESs) (Rechenberg, 1973; Schwefel, 1981), etc.

The inventors of these different evolutionary algorithms (or EAs for brevity) have all had to make choices
Table 1: Nature-to-computer mapping at the basis of evolutionary algorithms.

    Nature              Computer
    Individual          Solution to a problem
    Population          Set of solutions
    Fitness             Quality of a solution
    Chromosome          Representation for a solution (e.g. set of parameters)
    Gene                Part of the representation of a solution
                        (e.g. parameter or degree of freedom)
    Crossover,
    Mutation            Search operators
    Natural Selection   Promoting the reuse of good (sub-)solutions

Algorithm 1: Generic evolutionary algorithm.

    1:  Initialise population
    2:  Evaluate the fitness of each population member
    3:  loop
    4:    Select sub-population for reproduction on the basis of fitness (Selection)
    5:    Copy some of the selected individuals without change (Cloning or Reproduction)
    6:    Recombine the genes of selected parents (Recombination or Crossover)
    7:    Mutate the offspring population stochastically (Mutation)
    8:    Evaluate the fitness of the new population
    9:    Select the survivors on the basis of their fitness
    10:   If stopping criterion is satisfied then exit loop
    11: end loop

as to which bits of nature have a corresponding component in their algorithms. These choices are summarised in the nature-to-computer mapping shown in Table 1. That is, the notion of individual in nature corresponds to a tentative solution to a problem of interest in an EA. The fitness (ability to reproduce and have fertile offspring that reach the age of reproduction) of natural individuals corresponds to the objective function used to evaluate the quality of the tentative solutions in the computer. The genetic variation processes of mutation and recombination are seen as mechanisms (search operators) to generate new tentative solutions to the problem. Finally, natural selection is interpreted as a mechanism to promote the diffusion and mixing of the genetic material of individuals representing good-quality solutions and, therefore, having the potential to create even fitter individuals (better solutions).

Despite their differences, most EAs have the general form shown in Algorithm 1, although not all the steps in Algorithm 1 are present in all evolutionary algorithms. For example, in modern GAs (Mitchell, 1996) and in GP phase (a) is part of phases (b) and (c), while phase (f) is absent. This algorithm is said to be generational because there is no overlap between generations (i.e. the offspring population always replaces the parent population). In generational EAs cloning is used to simulate the survival of parents for more than one generation.

In the following we will analyse the various components of an EA in more detail, mainly concentrating on the genetic algorithm, although most of what we will say also applies to other paradigms.

Representations

Traditionally, in GAs, solutions are encoded as binary strings. Typically an adult individual (a solution for a problem) takes the form of a vector of numbers. These are often interpreted as parameters (for a plant, for a design, etc.), but in combinatorial optimisation problems these numbers can actually represent configurations, choices, schedules, paths and so on. Anything that can be represented on a digital computer can also be represented in a GA using a binary representation. This is why, at least in principle, GAs have a really broad applicability. However, other, non-binary representations are available, which may be more suitable, e.g., for problems with real-valued parameters.

Because the user of a GA normally has no idea as to what constitutes a good initial set of choices/parameters for adult individuals (tentative solutions to a problem), the chromosomes to be manipulated by the GA are normally initialised in an entirely random manner. That is, the initial population is a set of random binary strings or of random real-valued vectors.

Selection in GAs

Selection is the operation by which individuals (i.e. their chromosomes) are selected for mating or cloning. To emulate natural selection, individuals with a higher fitness should be selected with higher probability. There are many models of selection. We briefly describe three of the most frequently used ones below.

Fitness proportionate selection, besides being the most direct translation into the computational model of the principles of evolution, is probably the most widely used selection scheme. This works as follows. Let N be the population size, f_i the fitness of individual i, and f̄ = (1/N) Σ_j f_j the average population fitness. Then, in fitness proportionate selection, individual i is selected for reproduction with a probability

    p_i = f_i / Σ_j f_j = f_i / (N f̄).

In normal GAs populations are not allowed to grow or shrink, so N individuals have to be selected for reproduction. Therefore, the expected number of selected copies of each individual is

    N_i = p_i N = f_i / f̄.

So, individuals with an above-average quality (f_i > f̄) tend to be selected more than once for mating or cloning, while individuals below the average tend not to be used.
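In code, fitness proportionate selection is often pictured as a roulette wheel on which each individual owns a slice proportional to its fitness. The following sketch is illustrative (the function name is our own) and assumes non-negative fitnesses:

```python
import random

def fitness_proportionate_select(population, fitnesses):
    """Return one individual, chosen with probability f_i / sum_j f_j."""
    total = sum(fitnesses)           # sum_j f_j = N * average fitness
    r = random.uniform(0, total)     # a random point on the "wheel"
    cumulative = 0.0
    for individual, f in zip(population, fitnesses):
        cumulative += f
        if cumulative > r:           # r fell inside this individual's slice
            return individual
    return population[-1]            # guard against floating-point round-off
```

Calling this routine N times yields, on average, f_i / f̄ copies of individual i, matching the expectation given above.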
Tournament selection, instead, works as follows. To select an individual, first a group of T (T ≥ 2) random individuals is created. Then the individual with the highest fitness in the group is selected and the others are discarded (hence the name "tournament").
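A tournament of size T is straightforward to implement. This hypothetical helper draws T distinct individuals and returns the fittest; larger values of T produce stronger selection pressure:

```python
import random

def tournament_select(population, fitnesses, T=2):
    """Draw T distinct individuals at random and return the fittest one."""
    contenders = random.sample(range(len(population)), T)
    winner = max(contenders, key=lambda i: fitnesses[i])
    return population[winner]
```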
Another alternative is rank selection, where individuals are first sorted (ranked) on the ground of their fitness, so that if an individual i has fitness f_i > f_j then its rank is i < j. Then each individual is assigned a probability of being selected p_i taken from a given distribution (typically a monotonically decreasing function), with the constraint that Σ_i p_i = 1.

Operators

Evolutionary algorithms work well only if their genetic operators allow an efficient and effective search of the space of tentative solutions.

One desirable property of recombination operators is to guarantee that two parents sharing a useful common characteristic always transmit such a characteristic to their offspring. Another important property is to also guarantee that different characteristics distinguishing two parents may all be inherited by their offspring. For binary GAs there are many crossover operators with these properties.

One-point crossover, for example, aligns the two parent chromosomes (bit strings), then cuts them at a randomly chosen common point and exchanges the right-hand-side (or left-hand-side) sub-chromosomes (see Figure 1(a)). In two-point crossover chromosomes are cut at two randomly chosen crossover points and their ends are swapped (see Figure 1(b)). A more modern operator, uniform crossover, builds the offspring one bit at a time, by selecting randomly one of the corresponding bits from the parents (see Figure 1(c)).

Normally, crossover is applied to the individuals of a population with a constant probability p_c (often p_c ∈ [0.5, 0.8]). Cloning is then applied with a probability 1 − p_c to keep the number of individuals in each generation constant.

[Figure 1: Three crossover operators for binary GAs: (a) one-point crossover, (b) two-point crossover, (c) uniform crossover.]
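One-point and uniform crossover can be sketched in a few lines. This is an illustrative sketch (helper names are our own), with chromosomes represented as lists of bits:

```python
import random

def one_point_crossover(p1, p2):
    """Cut both parents at a common random point; swap the right-hand tails."""
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def uniform_crossover(p1, p2):
    """Build each offspring one bit at a time from randomly chosen parent bits."""
    o1, o2 = [], []
    for a, b in zip(p1, p2):
        if random.random() < 0.5:
            o1.append(a); o2.append(b)
        else:
            o1.append(b); o2.append(a)
    return o1, o2
```

Note that both operators preserve every bit on which the parents agree, which is the first desirable property mentioned above.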
Mutation is the second main genetic operator used in GAs. A variety of mutation operators exist. Mutation typically consists of making (usually small) alterations to the values of one or more genes in a chromosome. Often mutation is applied to the individuals produced by crossover and cloning before they are added to the new population. In binary chromosomes mutation often consists of inverting random bits of the genotypes (see Figure 2). The main goal with which mutation is applied is the preservation of population diversity, since diversity prevents the evolutionary search from stagnating. However, due to its random nature, mutation may have disruptive effects on evolution if it occurs too often. Therefore, in GAs, mutation is usually applied to genes with a very low probability.

[Figure 2: Bitwise mutation in binary GAs.]
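Putting selection, crossover, cloning and bitwise mutation together gives the generational cycle of Algorithm 1. The following minimal, self-contained sketch runs on the classic OneMax toy problem (maximise the number of 1-bits); all parameter values are illustrative choices of ours, not prescriptions:

```python
import random

def evolve(pop_size=20, length=20, generations=60, pc=0.7, pm=0.02, seed=1):
    """Minimal generational GA for OneMax (fitness = number of 1-bits)."""
    rng = random.Random(seed)
    fitness = lambda chrom: sum(chrom)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]

    def tournament():                                  # selection (T = 2)
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < pc:                      # recombination
                cut = rng.randint(1, length - 1)
                o1, o2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:                                      # cloning
                o1, o2 = p1[:], p2[:]
            for o in (o1, o2):                         # bitwise mutation
                for i in range(length):
                    if rng.random() < pm:
                        o[i] = 1 - o[i]
                new_pop.append(o)
        pop = new_pop[:pop_size]                       # offspring replace parents
    return max(pop, key=fitness)
```

With settings of this order, the best individual typically approaches the optimum of `length` within a few dozen generations: selection supplies the pressure, while the low mutation rate maintains diversity.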
In real-valued GAs chromosomes have the form x = ⟨x_1, ..., x_ℓ⟩, where each gene x_i is represented by a floating-point number. In these GAs crossover is often seen as an interpolation process in a multi-dimensional Euclidean space. So, the components of the offspring o are calculated from the corresponding components of the parents p′ and p″ as follows:

    o_i = p′_i + r (p″_i − p′_i),

where r is a random number in the interval [0, 1] (see Figure 3(a)). Alternatively, crossover can be seen as the exploration of a multi-dimensional hyper-parallelepiped defined by the parents (see Figure 3(b)); that is, the components o_i are chosen uniformly at random within the intervals [min(p′_i, p″_i), max(p′_i, p″_i)]. Mutation is often seen as the addition of a small random variation (e.g. Gaussian noise) to a point in a multi-dimensional space (see Figure 3(c)).

Other GEC Paradigms

As mentioned before, the principles on which GAs are based are also shared by many other EAs. However, the use of different representations and operators has led to the development of a number of paradigms, each having its own peculiarities. With no pretence of being exhaustive, in the following we will briefly mention two important paradigms other than GAs.

Genetic programming (Koza, 1992; Langdon and Poli, 2002) is a variant of GAs in which the individuals being evolved are syntax trees, typically representing computer programs. The trees are created using user-defined primitive sets, which typically include input variables, constants and a variety of functions or instructions. The syntax trees are manipulated by specialised forms of crossover and mutation that guarantee the syntactic validity of the offspring. The fitness of the individual trees in the population is evaluated by running the corresponding programs (typically multiple times, for different values of their input variables).

Evolution strategies (Rechenberg, 1973; Schwefel, 1981) are real-valued EAs where mutation is the key variation operator (unlike GAs, where crossover plays that role). Mutation typically consists of adding zero-mean Gaussian deviates to the individuals being optimised, with the mutation's standard deviation being varied dynamically so as to maximise the performance of the algorithm.
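The real-valued operators described above can be sketched directly. In this illustrative sketch (names and the default sigma are our own choices), `blend_crossover` implements the hyper-parallelepiped view, drawing an independent r per component, and `gaussian_mutation` is the ES-style perturbation:

```python
import random

def blend_crossover(p1, p2):
    """o_i drawn uniformly from [min(p'_i, p''_i), max(p'_i, p''_i)]."""
    return [a + random.random() * (b - a) for a, b in zip(p1, p2)]

def gaussian_mutation(x, sigma=0.1):
    """Add zero-mean Gaussian noise of standard deviation sigma to each gene."""
    return [xi + random.gauss(0.0, sigma) for xi in x]
```

In a full evolution strategy, sigma itself would be adapted during the run, as the text notes.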

[Figure 3: Crossover operators (a), (b) and mutation (c) for real-valued GAs.]

State of the art

Popularity of the Field
Among AI and AI-related disciplines, GEC is presently
one of the most active. This is testified, for example,
by:
the numerous dedicated conferences and workshops
(around 20 annual or biannual events) including the
Genetic and Evolutionary Computation Conference
(GECCO), the largest conference in the field, with
around 600 attendees, organised by ACM SIGEVO,
and the large IEEE Congress on Evolutionary Computation (CEC);

the several dedicated journals, including Evolutionary Computation from MIT Press (the oldest journal in the field), the journal Genetic Programming and Evolvable Machines from Kluwer (which is specialised in Genetic Programming and evolvable hardware), and the IEEE Transactions on Evolutionary Computation;

the numerous large bibliographies of GEC literature, including, for example, the large collection of AI bibliographies in The Collection of Computer Science Bibliographies available at http://liinwww.ira.uka.de/bibliography/Ai/, where the Genetic Programming bibliography by itself, which covers only papers on that specific GEC paradigm, is the third largest among the ones still updated on a regular basis (with 4919 papers at the time of writing), and fifth overall;

the constant stream of new books and doctoral theses on the subject.

International Situation

It is very difficult to say precisely when and where the field of Genetic and Evolutionary Computation originated. Indeed, especially for the oldest evolutionary paradigms, as often happens, seminal ideas can be found in papers by authors who were not the ones who finally popularised them (see (Fogel, 1998) for a comprehensive collection). Considering the latter as the actual fathers of GEC, Genetic Algorithms (Holland, 1975), as well as Genetic Programming (Koza, 1992), were first studied and developed as independent research topics in the United States, while Evolution Strategies (Rechenberg, 1973; Schwefel, 1981), developed in Europe, already contain ideas which would later be used in GAs and GP. So, we can say that Germany and the United States are the countries where it all started. However, GEC research is today performed in many, many countries worldwide. For example, the two traditional major actors in GEC (USA and Germany) have now been joined at the top by the United Kingdom, where research in AI and related fields has always been extremely active.

From the point of view of research topics, after the typical pioneering times of rapid development, in which any empirical study or application of any paradigm to any problem was considered interesting and publishable in its own right, research in GEC has become more mature and structured into well-defined topics. Such a classification comprises a set of theoretical topics regarding basic studies on the different GEC paradigms (Genetic Algorithms, Genetic Programming, Evolution Strategies, Evolutionary Programming, Particle Swarm Optimisers, Classifier Systems, Ant Algorithms, Artificial Immune Systems, etc.). Other research areas span more than one of these fundamental paradigms (Coevolution, Evolutionary Multi-Objective Optimisation, Evolutionary Combinatorial Optimisation, Hybridisation, Evolutionary Meta-heuristics, Memetic Algorithms, etc.), and these are also hot theoretical research fields. Finally, a large set of more application-oriented topics are popular research areas, such as Evolvable Hardware, Evolutionary Robotics, Evolutionary Image Analysis and Signal Processing and, more generally, Real-World Applications at large. Some of these areas have recently grown significantly and can be considered independent GEC subfields in their own right.

GEC research in Italy

On a national basis we have, as is unfortunately not uncommon, an anomalous situation in which the main Italian researchers in the field work for foreign institutions. For example, Riccardo Poli (UK) is a world leader in the field of Genetic Programming (being second only to John Koza for number of publications in this area), Marco Dorigo (Belgium) is the inventor and world leader of Ant Colony Optimisation, Marco Tomassini (Switzerland) is a leader on evolutionary algorithms and complex systems, etc.

However, this situation has progressively been balanced by an increasingly active national community, in which more than ten groups are specifically active in the field (at the Universities or Polytechnic Schools of Milan, Turin, Parma, Venice, Naples, Salerno, Calabria, Catania, to name a few), which made it possible to organise a successful first edition of GSICE (the Italian Workshop on Evolutionary Computation) in 2005 in Milan, which will be followed by the upcoming editions in Siena (2006) and Catania (2007).

The main research topics on which the activity of Italian researchers in GEC is focused are: genetic programming theory (R. Poli, Tomassini, Vanneschi), learning classifier systems (Lanzi), ant algorithms (Dorigo, Gambardella), particle swarm optimisation (R. Poli), evolutionary models in artificial life (Nicosia, I. Poli), hybrid systems (Tettamanzi), coevolution (Cagnoni, Vanneschi), evolutionary robotics (Floreano, Nolfi), and evolvable hardware (Squillero). The main application fields are biology (Marchiori), game strategy (Squillero), finance (Tettamanzi), and computer vision and pattern recognition (Cagnoni, Cordella, De Falco, R. Poli).

Open problems and research directions

Despite its increasing degree of maturity, there are still many partially unanswered questions facing researchers in GEC. Here we limit ourselves to mentioning only a few of the main open challenges:

How do we classify GEC algorithms? When do we expect the behaviour and performance of two different evolutionary systems to be qualitatively (and maybe at some point quantitatively) similar, and why?

How do we classify problems? When do we expect the performance of a particular algorithm on two different problems to be substantially the same, and why?

Although the development of mathematical models of evolutionary equations has been rapid, there remains a disquieting lack of tools with which to obtain solutions to evolution equations. In addition, these models have immense numbers of degrees of freedom, which makes them hard to simulate even for the most powerful computers. So, mathematical models can shed only some light on EA dynamics.

How can we develop models of EAs that can provide theoretically sound recipes for practitioners, such as which operators, fitness function, search algorithm, population size, number of generations, number of runs, crossover probability, etc. one should use for a given problem or a given class of problems?

Interactions with other AI disciplines and other research areas

GEC is intrinsically a transversal field of research since, on the one side, its techniques are based on biological models and, on the other side, it provides a set of tools which can be effectively applied to other disciplines. Quite naturally there are several examples of synergetic applications (Tettamanzi and Tomassini, 2001) of GEC techniques along with other techniques comprised in the set of disciplines (Neural Networks, Fuzzy Sets, Probabilistic Networks) usually termed Computational Intelligence (CI).

CI is considered by many the new modern AI. However, although some GEC researchers might object to this viewpoint, we believe that most of the work currently going on in GEC can be seen as AI. In particular, when EAs are successfully used in practical applications, in many cases a lot of the credit for the success goes to the specialised representation, operators and fitness function designed to tackle the problem, rather than to the fact that this or that EA was used to perform the search. That is, the search is often successful thanks to the good knowledge-engineering efforts of the users of the EA. This should not come as a surprise: after all, AI people have always known that a good representation, good expansion operators and good heuristics can tame search and avoid the problems inherent in exponentially large search spaces. GEC researchers, however, have started accepting these good-old-fashioned AI guidelines after painfully digesting a now-famous result that goes under the name of the No Free Lunch Theorem for search (Wolpert and Macready, 1997).

Applications

Getting machines to produce human-like results is the reason for the existence of AI and machine learning. However, it has always been very difficult to assess how much progress these fields have made towards their ultimate goal. Turing understood the need to evaluate objectively the behaviour exhibited by machines, to avoid human biases when assessing their intelligence. This led him to propose an imitation game, now known as the Turing test for machine intelligence. Unfortunately, the Turing test is not usable in practice, and so it has become clear that there is a need for more workable objective tests of progress.

John Koza (Koza et al., 1999) recently proposed to shift the attention from the notion of intelligence to the notion of human competitiveness. An automatically-created result is considered human-competitive if it satisfies at least one of the eight criteria below:

1. The result was patented as an invention in the past, is an improvement over a patented invention, or would qualify today as a patentable new invention.

2. The result is equal to or better than a result that was accepted as a new scientific result at the time when it was published in a peer-reviewed scientific journal.

3. The result is equal to or better than a result that was placed into a database or archive of results maintained by an internationally recognised panel of scientific experts.

4. The result is publishable in its own right as a new scientific result, independent of the fact that the result was mechanically created.

5. The result is equal to or better than the most recent human-created solution to a long-standing problem for which there has been a succession of increasingly better human-created solutions.

6. The result is equal to or better than a result that was considered an achievement in its field at the time it was first discovered.

7. The result solves a problem of indisputable difficulty in its field.

8. The result holds its own or wins a regulated competition involving human contestants (in the form of either live human players or human-written computer programs).

Over the years, tens of results have passed the human-competitiveness test (see (Koza and Poli, 2005) for a recent list). Also, since 2004 a competition has been held annually at GECCO (termed the Human-Competitive Awards, or "Humies"). The prize ($10,000) is awarded to automatically-created applications which have produced results that are equivalent to, or better than, human achievements. The Gold Prizes in 2004 and 2005 were awarded to applications of GEC to high-tech fields such as an antenna for deployment on NASA's Space Technology 5 Mission, automatic quantum computer programming, two-dimensional photonic crystal design, applications to attosecond dynamics of high-harmonic generation, and shaped-pulse optimisation of coherent soft X-rays. The 2006 competition is still to be held at the time of writing.

Some pre-2004 human-competitive results include:
Creation of a better-than-classical quantum algorithm for Grover's database search problem

Creation of a quantum algorithm for the depth-two AND/OR query problem that is better than any previously published result

Creation of a soccer-playing program that won its first two games in the RoboCup 1997 competition

Creation of four different algorithms for the transmembrane segment identification problem for proteins

Creation of a sorting network for seven items using only 16 steps

Synthesis of 60 and 96 decibel amplifiers

Synthesis of analog computational circuits for squaring, cubing, square root, cube root, logarithm, and Gaussian functions

Synthesis of a real-time analog circuit for time-optimal control of a robot

Synthesis of an electronic thermometer

Creation of a cellular automata rule for the majority classification problem that is better than the Gacs-Kurdyumov-Levin (GKL) rule and all other known rules written by humans

Synthesis of topology for a PID-D2 (proportional, integrative, derivative, and second derivative) controller

Synthesis of a NAND circuit

Simultaneous synthesis of topology, sizing, placement, and routing of analog electrical circuits

Synthesis of topology for a PID (proportional, integrative, and derivative) controller

Synthesis of a voltage-current conversion circuit

Creation of PID tuning rules that outperform the Ziegler-Nichols and Astrom-Hagglund tuning rules

Creation of three non-PID controllers that outperform a PID controller that uses the Ziegler-Nichols or Astrom-Hagglund tuning rules

Even from the short list provided, which is far from exhaustive and insufficient to fully describe their potential, the use of GEC techniques as invention machines appears to be effective, and subject to further increases in competitiveness in a future which the following closing section tries to forecast.

Conclusions

Let us try to look a bit ahead into a not-too-distant future, say 2012, and make some predictions.

We start from some simple back-of-an-envelope calculations. In 2012 CPUs will be 25 times faster than today. A Beowulf desktop computer (12 CPUs) will be 300 times faster than current PCs. A Beowulf tower computer (96 CPUs) will be 2400 times faster than current PCs and approximately 10 times faster than the 1000-node supercomputer Koza used to obtain his human-competitive results (new inventions). These were produced in approximately one week of computer time.

It follows from this that in 2012 it will be possible to produce patentable new inventions in a day on a 96-node Beowulf workstation! So, by 2012 it is not unthinkable that EAs will be used routinely as invention machines, design machines, optimisers and problem solvers.

So, GEC has effectively started fulfilling the AI dream by providing us with a systematic method, based on Darwinian evolution, for getting computers to automatically solve difficult problems for us. To do so, EAs simply require a high-level statement of what needs to be done (and enough computing power). Today GEC certainly cannot produce computers that would pass the full Turing test for machine intelligence, but GEC has been able to solve tens of difficult problems with human-competitive results, and we should expect to see this trend accelerate.

These are small steps towards fulfilling the AI founders' dreams, but they are also early signs of things to come. By 2012, EAs and AI techniques will be able to routinely and competently solve important problems for us in a variety of specific domains of application, becoming essential collaborators in many human activities.

This will be a remarkable step forward towards achieving true, human-competitive machine intelligence.

References

Fogel, D. B., editor (1998). Evolutionary Computation: The Fossil Record. Selected Readings on the History of Evolutionary Computation. IEEE Press.

Holland, J. (1975). Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor, USA.

Koza, J. R. (1992). Genetic Programming: On the Programming of Computers by Means of Natural Selection. MIT Press, Cambridge, MA, USA.

Koza, J. R., Bennett III, F. H., and Stiffelman, O. (1999). Genetic programming as a Darwinian invention machine. In Poli, R., Nordin, P., Langdon, W. B., and Fogarty, T. C., editors, Genetic Programming, Proceedings of EuroGP'99, volume 1598 of LNCS, pages 93-108, Goteborg, Sweden. Springer-Verlag.

Koza, J. R. and Poli, R. (2005). Genetic programming. In Burke, E. K. and Kendall, G., editors, Search Methodologies: Introductory Tutorials in Optimization and Decision Support Techniques, chapter 5. Springer.

Langdon, W. B. and Poli, R. (2002). Foundations of Genetic Programming. Springer-Verlag.

Mitchell, M. (1996). An Introduction to Genetic Algorithms. MIT Press, Cambridge, MA.

Rechenberg, I. (1973). Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution. Frommann-Holzboog, Stuttgart.
Schwefel, H.-P. (1981). Numerical Optimization of Computer Models. Wiley, Chichester.

Tettamanzi, A. and Tomassini, M. (2001). Soft Computing: Integrating Evolutionary, Neural and Fuzzy Systems. Springer, Berlin, Heidelberg, New York.

Wolpert, D. H. and Macready, W. G. (1997). No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1):67-82.

Pointers to Further Reading in GEC

David E. Goldberg. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, Reading, Massachusetts, 1989.
A classic book on genetic algorithms and classifier systems.

David E. Goldberg. The Design of Innovation: Lessons from and for Competent Genetic Algorithms. Kluwer Academic Publishers, Boston, 2002.
An excellent, long-awaited follow-up to Goldberg's first book.

Melanie Mitchell. An Introduction to Genetic Algorithms. A Bradford Book, MIT Press, Cambridge, MA, 1996.
A more modern introduction to genetic algorithms.

John H. Holland. Adaptation in Natural and Artificial Systems, second edition. A Bradford Book, MIT Press, Cambridge, MA, 1992.
Second edition of a classic from the inventor of genetic algorithms.

Thomas Back and Hans-Paul Schwefel. An overview of evolutionary algorithms for parameter optimization. Evolutionary Computation, 1(1):1-23, 1993.
A good introduction to parameter optimisation using EAs.

T. Back, D. B. Fogel and T. Michalewicz. Evolutionary Computation 1: Basic Algorithms and Operators. Institute of Physics Publishing, 2000.
A modern introduction to evolutionary algorithms. Good both for novices and more expert readers.

John R. Koza. Genetic Programming: On the Programming of Computers by Means of Natural Selection. MIT Press, 1992.
The bible of genetic programming by the founder of the field. Followed by GP II (1994), GP III (1999) and GP IV (forthcoming).

Wolfgang Banzhaf, Peter Nordin, Robert E. Keller and Frank D. Francone. Genetic Programming: An Introduction; On the Automatic Evolution of Computer Programs and its Applications. Morgan Kaufmann, 1998.
An excellent textbook on GP.

W. B. Langdon and Riccardo Poli. Foundations of Genetic Programming. Springer, Feb 2002.
The only book entirely devoted to the theory of GP and its relations with GA theory.

Proceedings of the Genetic and Evolutionary Computation Conference.
Born in 1999 from the recombination of the International Conference on Genetic Algorithms and the Genetic Programming Conference, GECCO is the largest conference in the field.

Proceedings of the Foundations of Genetic Algorithms (FOGA) workshop.
FOGA is a biannual, small but very prestigious and highly selective workshop. It is mainly devoted to the theoretical foundations of EAs.

Proceedings of the Congress on Evolutionary Computation (CEC).
CEC is a large conference under the patronage of the IEEE.

Proceedings of Parallel Problem Solving from Nature (PPSN).
This is a large biannual European conference, probably the oldest of its kind in Europe.

Proceedings of the European Conference on Genetic Programming.
EuroGP was the first European event entirely devoted to Genetic Programming. Run as a workshop in 1998 and 1999, it became a conference in 2000. It has now reached its tenth edition.

Some Web Resources

http://www.genetic-programming.com/ and http://www.genetic-programming.org/
Genetic Programming Inc.

http://www.cs.bham.ac.uk/~wbl/biblio/README.html and http://liinwww.ira.uka.de/bibliography/Ai/genetic.programming.html
GP bibliography maintained by W. B. Langdon

http://www.geneticprogramming.com/
The Genetic Programming Notebook

http://evonet.lri.fr/CIRCUS2/node.php?node=1
The EvoNet online tutorial
