
NAIS: A Calibrated Immune Inspired Algorithm to Solve Binary Constraint Satisfaction Problems*

Marcos Zuñiga¹, María-Cristina Riff² and Elizabeth Montero²

¹ Projet ORION, INRIA Sophia-Antipolis, Nice, France
e-mail: Marcos.Zuniga@sophia.inria.fr
² Department of Computer Science, Universidad Técnica Federico Santa María, Valparaíso, Chile
e-mail: Maria-Cristina.Riff@inf.utfsm.cl, Elizabeth.Montero@inf.utfsm.cl

Abstract. We propose in this paper an artificial immune system to solve CSPs. The algorithm has been designed following the framework proposed by de Castro and Timmis. We have calibrated our algorithm using Relevance Estimation and Value Calibration (REVAC), a technique recently introduced to find good parameter values for evolutionary algorithms. The tests were carried out on randomly generated binary constraint satisfaction problems in the phase transition, where the hardest problems are located. The algorithm proved able to find good quality solutions quickly.

1 Introduction

Constraint satisfaction problems (CSPs) involve finding values for problem variables subject to constraints on which combinations are acceptable. Over the past few years, many algorithms and heuristics have been developed to find solutions of CSPs. Following these trends from the constraint research community, the bio-inspired computation community has also proposed approaches that tackle CSPs with success, such as evolutionary algorithms [4], [8], [10], [12], [11], [14] and ant algorithms [13]. Given that recent publications indicate that artificial immune systems offer advantages in solving complex problems [1], [3], our goal here is to propose an efficient immune inspired algorithm which can solve CSPs. Artificial immune systems, like evolutionary algorithms, are very sensitive to the values of their parameters. Garrett [18] proposed a parameter-free clonal selection using adaptive changes.
In this paper, we focus our attention on a new tuning method, one that uses statistical properties to determine the best set of parameter values for an evolutionary algorithm.
The contributions of this paper are:

– An immune inspired algorithm which can solve hard CSPs,
– A new application of the tuning method Relevance Estimation and Value Calibration (REVAC), originally proposed for evolutionary algorithms [15].

* Supported by Fondecyt Project 1060377
The paper is structured as follows. In the next section, we define the Constraint Satisfaction Problem. In Section 3 we introduce our new approach, NAIS. The results of tests and a comparison with other incomplete methods are given in Section 4. In our summary, we give some conclusions and outline future work.

2 Binary Constraint Satisfaction Problems


For simplicity we restrict our attention here to binary CSPs, where the constraints involve two variables. Binary constraints are binary relations. If a variable i has a domain of potential values Di and a variable j has a domain of potential values Dj, the constraint on i and j, Rij, is a subset of the Cartesian product of Di and Dj. A pair of values (a, b) is called consistent if (a, b) satisfies the constraint Rij between i and j. The variables i and j are the relevant variables for Rij. The constraint network is composed of the variables, the domains and the constraints. Thus, the problem is [9], [12]: given a set of variables, a domain of possible values for each variable, and a conjunction of constraints, find a consistent assignment of values to the variables so that all the constraints are satisfied simultaneously. CSPs are, in general, NP-complete problems and some are NP-hard [7]. Thus, a general algorithm designed to solve any CSP will necessarily require exponential time in problem size in the worst case.

3 NAIS: Network Artificial Immune System


We call our algorithm NAIS, which stands for Network Artificial Immune System. The algorithm uses three immune components: antigen, antibody and B-cells. Basically, the antigen represents the information for each variable given by the constraint graph. This information is related to the number of connections of each variable, that is, the number of constraints where each variable is relevant. Thus, this information is fixed and does not depend on the state of the search. On the contrary, the antibody strongly depends on the state of the search. It carries two kinds of information: the variable values and the constraints violated under this instantiation. Finally, a B-cell holds all the antibody information the algorithm requires for its evolution.

3.1 Immune Components for CSP


The immune components in our approach are defined as follows:

Definition 1. (Antigen)
For a CSP and its constraint graph we define the antigen Ag as the n-tuple (Ag1, . . . , Agn), such that Agi is the number of constraints where Xi is a relevant variable, ∀i, i = 1, . . . , n.

The algorithm needs to know, for each pre-solution, its variable values and the constraints satisfied under this instantiation. For this reason, the antibody has two segments: a structural and a conflicting segment.
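As an illustration, the antigen of Definition 1 can be computed directly from the constraint graph. The following is a minimal Python sketch (the paper's implementation is in C; all names here are illustrative):

```python
def build_antigen(n, constraints):
    """Ag[i] = number of constraints in which variable i is relevant
    (Definition 1). `constraints` is a list of variable index pairs."""
    ag = [0] * n
    for i, j in constraints:
        ag[i] += 1
        ag[j] += 1
    return ag

# Example: 4 variables with constraints on (0,1), (1,2) and (1,3):
# variable 1 is relevant in three constraints, the others in one each.
print(build_antigen(4, [(0, 1), (1, 2), (1, 3)]))  # [1, 3, 1, 1]
```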

Definition 2. (Structural antibody)
A structural antibody Abs is a mapping from an n-tuple of variables (X1, . . . , Xn) → D1 × . . . × Dn, such that it assigns to each variable in V a value from its domain.
Remark: The structural segment corresponds to an instantiation I of the CSP.

Definition 3. (Conflicting antibody)
For a CSP and its constraint graph we define the conflicting antibody Abc as the n-tuple (Abc1, . . . , Abcn), such that Abci is the number of violated constraints where Xi is a relevant variable, ∀i, i = 1, . . . , n.

A solution consists of a structural antibody which does not violate any con-
straint, that is, whose conflicting antibody complements the antigen.
Before defining the B-cell we need to introduce the idea of affinity in the context
of our problem.

3.2 Affinity measure


In our approach we are interested in two kinds of affinity. The affinity between
the antigen and a conflicting antibody, and the affinity between two structural
antibodies.

– Affinity between the antigen and a conflicting antibody


It is an estimation of how far an antibody is from being a CSP solution. It is related to the number of constraints satisfied by an antibody. The key idea is that a solution of the CSP corresponds to the biggest value of the affinity function between Abc and Ag, which occurs when all the constraints are satisfied. We define the function Acsp to measure this affinity as the Euclidean distance:

Acsp(Ag, Abc) = sqrt( Σ_{i=1}^{n} (Agi − Abci)² )    (1)

The function Acsp prefers pre-solutions with a minimal number of violated constraints, as is usual for guiding the search of incomplete techniques.
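Equation (1) can be sketched as follows (an illustrative Python version; the paper's code is in C):

```python
import math

def a_csp(ag, abc):
    """Affinity between antigen and conflicting antibody: the Euclidean
    distance of Eq. (1). Larger values mean fewer violated constraints."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(ag, abc)))

ag = [1, 3, 1, 1]
print(a_csp(ag, ag))            # 0.0 -> every constraint violated, worst affinity
print(a_csp(ag, [0, 0, 0, 0]))  # sqrt(12), no violations: maximal affinity
```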

– Affinity between two structural antibodies

Two structural antibodies have a high affinity level when they are quite similar in terms of the values of their variables. The idea of this measure, named HAs, is to quantify how similar two pre-solutions are. To compute this interaction our algorithm uses the Hamming distance. The algorithm uses this measure to maintain a diversity of pre-solutions.
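The HAs measure reduces to counting differing positions; a minimal Python sketch (illustrative names, not the paper's implementation):

```python
def hamming(abs1, abs2):
    """HAs: number of variables assigned different values in two
    structural antibodies."""
    return sum(1 for a, b in zip(abs1, abs2) if a != b)

print(hamming([1, 2, 3, 4], [1, 2, 3, 4]))  # 0 -> identical pre-solutions
print(hamming([1, 2, 3, 4], [1, 0, 3, 0]))  # 2 -> differ in half the variables
```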
3.3 B-cell Representation

A B-cell is a structure with the following components:

– An antibody Ab = (Abc, Abs).
– The number of clones of Ab to be generated by the clonal expansion procedure. This number is directly proportional to the Acsp value.
– The hypermutation ratio used in the affinity maturation step. This ratio is inversely proportional to the Acsp value.

3.4 The Algorithm - Network Artificial Immune System

The algorithm NAIS is shown in Figure 1. NAIS works with a set of B-cells, following an iterative maturation process. Some of these B-cells are selected by clonal selection, preferring those with higher affinity values Acsp, that is, those that satisfy a greater number of constraints. It uses roulette wheel selection [17]. The algorithm then generates a number of clones of the selected B-cells in the clonal expansion procedure. These clones undergo a hypermutation process in the affinity maturation step. The new set of B-cells is composed of a selected set of hypermutated B-cells.
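Roulette wheel selection proportional to Acsp can be sketched as follows (a hedged Python illustration; [17] describes the operator in the genetic-algorithm setting, and these names are ours):

```python
import random

def roulette_wheel(cells, affinities, k, rng=random.Random(42)):
    """Select k cells with probability proportional to their Acsp value."""
    total = sum(affinities)
    chosen = []
    for _ in range(k):
        r = rng.uniform(0, total)
        acc = 0.0
        for cell, aff in zip(cells, affinities):
            acc += aff
            if acc >= r:  # the wheel pointer lands in this cell's sector
                chosen.append(cell)
                break
    return chosen

picked = roulette_wheel(["A", "B", "C"], [1.0, 5.0, 1.0], k=4)
print(picked)  # four picks, biased toward "B"
```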

Algorithm NAIS(CSP) returns memory B-cells
Begin
  Ag ← Determine_constraint_graph_connections(CSP, n)
  Initialize B-cells
  For i ← 1 to B_CELLS_NUM do
    Compute affinity value Acsp(B-cells[i])
  End For
  j ← 1
  While (j ≤ MAX_ITER) and (not solution) do
    Select a set of B-cells by roulette wheel
    Generate clones of the selected B-cells
    Hypermutate clones
    For k ← 1 to CLONES_NUM do
      Compute affinity value Acsp(Clones[k])
    End For
    B-cells ← build_network(Clones)
    B-cells ← metadynamics(B-cells)
    j ← j + 1
  End While
  Return B-cells
End

Figure 1. NAIS pseudocode

This selection is done in build_network using the HAs values in order to maintain a diversity of B-cells. A hypermutated B-cell can belong to the new set of B-cells if it is sufficiently different from the B-cells in memory. This difference is measured using the Hamming distance, and the minimal degree of required diversity is controlled by the ε parameter value. Thus, a B-cell is accepted as a new memory member when (1 − HAs/n) > ε. The ε value is known as the threshold of cross reactivity.


Finally, the algorithm adds new B-cells randomly generated in the metadynam-
ics procedure to this set of B-cells. This procedure allows the algorithm to do
more exploration of the search space.
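The memory-acceptance test can be sketched as below. This is an interpretation: following the calibration discussion in Section 4.2 (ε = 0.40 means a cell must differ in at least 60% of its variables), we take the criterion to be that a candidate differs from every memory cell in at least a (1 − ε) fraction of positions. The function names and the exact inequality direction are our reading of the text, not the paper's code:

```python
def accept_into_memory(candidate, memory, epsilon):
    """Cross-reactivity test sketch: a hypermutated cell joins memory only
    if it differs from every memory cell in at least a (1 - epsilon)
    fraction of its variables (interpretation of Sect. 4.2)."""
    n = len(candidate)
    def diff_fraction(a, b):
        return sum(x != y for x, y in zip(a, b)) / n
    return all(diff_fraction(candidate, m) >= 1 - epsilon for m in memory)

# With epsilon = 0.40, a candidate must differ in at least 60% of positions.
print(accept_into_memory([1, 2, 3, 4, 5], [[9, 9, 9, 4, 5]], 0.40))  # True, 60% differ
print(accept_into_memory([1, 2, 3, 4, 5], [[1, 2, 3, 9, 9]], 0.40))  # False, 40% differ
```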

Hypermutation procedure: The hypermutation is a hill-climbing procedure that repairs the conflicts recorded in the conflicting antibody. This procedure is inspired by the min-conflicts algorithm proposed in [16]. Figure 2 shows the hypermutation procedure.

Hypermutation(B-cell)
Begin
  Repeat
    V ← Select randomly a variable to be changed
    If Abc(V) > 0 then
      Repeat
        Choose v, a new value for V from its domain
        NAbc(V) ← Number of conflicts for V using v
        If NAbc(V) < Abc(V) then
          Abs(V) ← v
          Re-compute Abc
        End If
      Until (NAbc(V) < Abc(V)) or (Max_tries)
    End If
  Until Num_hyper
  Return B-cell
End

Figure 2. Hypermutation procedure


Given a B-cell, a variable of its structural antibody is randomly selected to be changed. If the selected variable does not participate in any constraint violation (i.e. Abc(V) = 0), the procedure tries to modify another variable that could be involved in a conflict. The value of this variable is modified so as to reduce the number of conflicts recorded in the corresponding conflicting antibody. Thus, this procedure changes the structural antibody and, as a consequence, also the conflicting antibody.
The procedure Re-compute(Abc) does a partial evaluation of the conflicting antibody, considering only the variables related to the variable V, that is, just those variables which are relevant together with V for a specific constraint. This kind of partial evaluation is widely used in the constraint research community to reduce the computational time spent evaluating constraint satisfaction for a new instantiation of the variables.
The hypermutation procedure uses two parameters: Max_tries and Num_hyper. The parameter Max_tries is the maximum number of values to be tried for a given variable V. The parameter Num_hyper is the maximum number of B-cell variables that may be mutated.
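A min-conflicts-style repair in the spirit of Figure 2 could look like the sketch below. The constraint representation (a dict mapping a variable pair to its set of allowed value pairs) and all helper names are illustrative assumptions, not the paper's C implementation:

```python
import random

def hypermutate(assignment, domains, constraints, num_hyper=5, max_tries=10,
                rng=random.Random(0)):
    """Min-conflicts-style repair sketch (cf. Figure 2). `constraints` maps
    a variable pair (i, j) to the set of allowed value pairs."""
    def conflicts(var, value, assign):
        count = 0
        for (i, j), allowed in constraints.items():
            if i == var and (value, assign[j]) not in allowed:
                count += 1
            elif j == var and (assign[i], value) not in allowed:
                count += 1
        return count

    assignment = list(assignment)
    for _ in range(num_hyper):
        var = rng.randrange(len(assignment))
        current = conflicts(var, assignment[var], assignment)
        if current == 0:
            continue  # variable not in conflict; pick another one next round
        for _ in range(max_tries):
            v = rng.choice(domains[var])
            if conflicts(var, v, assignment) < current:
                assignment[var] = v  # accept improving value, stop trying
                break
    return assignment

# Example: two variables that must differ; the initial assignment [0, 0]
# violates the single constraint, and repair can only lower the conflict count.
constraints = {(0, 1): {(0, 1), (1, 0)}}
print(hypermutate([0, 0], [[0, 1], [0, 1]], constraints, num_hyper=50))
```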

4 Tests

The goal of the following benchmarks is to evaluate the performance of NAIS in solving CSPs. The algorithm has been tested with randomly generated binary CSPs [5]. The tests evaluate the behaviour of NAIS when it is calibrated using the REVAC tuning technique. We compare calibrated NAIS with GSA [4], a sophisticated evolutionary algorithm for CSPs that makes heavy use of knowledge from the constraint research community. We also compare NAIS with SAW [14]. SAW has been compared with well-known complete and incomplete algorithms for CSPs, obtaining better results in most of the cases tested.
The hardware platform for the experiments was a PC Pentium IV Dual Core, 3.4 GHz with 512 MB RAM, under the Mandriva 2006 operating system. The algorithm has been implemented in C. The code for NAIS is available on a web page¹.

4.1 Problems tested on the hard zone

The idea of these tests is to study the behavior of the algorithm when solving hard problems. We use two models to generate binary CSPs, because GSA has been reported using model B, proposed in [5], while SAW has been reported using model E [14].

Model B: The binary CSPs belonging to the hard zone are randomly generated using the model proposed by B. Smith in [5]. This model considers four parameters to obtain a CSP: the number of variables (n), the domain size of each variable (m), the probability that a constraint exists between two variables (p1), and the probability of compatible value pairs (p2). This model determines exactly the number of constraints and the number of consistent instantiations for the variables that are relevant for a given constraint. Thus, for each set of randomly generated problems the number of constraints is p1 n(n−1)/2, and for a given constraint the number of consistent instantiations is m²p2. Given (n, m, p1), B. Smith defines a function to compute critical p2 values, the values that produce CSPs in the phase transition, that is, the problems that are hardest to solve.
¹ http://www-sop.inria.fr/orion/personnel/Marcos.Zuniga/CSPsolver.zip
p̂2^crit(n, m, p1) = m^(−2/((n−1) p1))    (2)
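Equation (2) gives the critical p2 directly; a sketch assuming the form p2 = m^(−2/((n−1)p1)):

```python
def p2_crit(n, m, p1):
    """Critical p2 from Eq. (2), assuming the form m**(-2/((n-1)*p1))."""
    return m ** (-2.0 / ((n - 1) * p1))

# For n = m = 10 and p1 = 0.5 this places the phase transition near p2 ~ 0.36.
print(round(p2_crit(10, 10, 0.5), 3))  # 0.359
```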

Model E: This model also considers four parameters (n, m, p, k). The parameters n and m have the same interpretation as in model B. For binary CSPs, whose constraints have two relevant variables, k = 2 in model E. The higher the value of p, the more difficult, on average, the problem instances are.
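Model B's exact-count construction can be sketched as follows (a hedged illustration: helper names are ours, and the generator actually used in the experiments is the one from [5]):

```python
import itertools
import random

def model_b(n, m, p1, p2, rng=random.Random(1)):
    """Model B sketch: pick exactly round(p1*n*(n-1)/2) constrained variable
    pairs, and for each keep exactly round(p2*m*m) consistent value pairs."""
    pairs = list(itertools.combinations(range(n), 2))
    n_constraints = round(p1 * n * (n - 1) / 2)
    n_consistent = round(p2 * m * m)
    constraints = {}
    for (i, j) in rng.sample(pairs, n_constraints):
        values = list(itertools.product(range(m), repeat=2))
        constraints[(i, j)] = set(rng.sample(values, n_consistent))
    return constraints

csp = model_b(10, 10, 0.5, 0.5)
print(len(csp), all(len(s) == 50 for s in csp.values()))
```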

4.2 REVAC
The Relevance Estimation and Value Calibration method has recently been proposed in [15]. The goal of this algorithm is to determine the parameter values for evolutionary algorithms. It is also able to identify which operators are not relevant for the algorithm. Roughly speaking, REVAC is a steady-state evolutionary algorithm that uses a real-valued representation, where each value corresponds to a parameter value of the algorithm to be tuned. Each chromosome in REVAC is evaluated by the performance that the algorithm being tuned obtains using its parameter values. A new individual is randomly created, but the value for each variable is drawn only from the values present in the population for that variable. REVAC performs 1000 evaluations.
In order to apply REVAC to calibrate NAIS, we selected 14 problems, two from each category <10, 10, p1, p2> using model B. The performance of each chromosome is computed as the number of constraints satisfied by the solution obtained by NAIS using those parameter values. The parameter values found by this tuning procedure were:

– n1 = 0.3, rate of cells to be expanded,
– n2 = 0.4, rate of cells to be incorporated into the memory,
– ε = 0.40, cross reactivity threshold between clones,
– B-cells = 5,
– Number of clones = 100.
This means that NAIS requires more exploration than it does with a hand-made calibration. In the hand-made calibration the hypermutated cell is accepted if it differs by at least 54% (ε = 0.46) from the memory cells. Now, it must differ by at least 60% to be accepted. Furthermore, the rate of cells to be expanded has been reduced by 0.2. The procedure required around 14 hours of computational time to determine these parameter values.

4.3 Tests with Calibration


Comparison between NAIS and GSA using Model B: Because the reported results of GSA [4] were obtained on problems from the hardest zone as generated in [5], we ran the calibrated NAIS on problems generated using the parameters (n, m, p1, p2). We consider problems with n = m, where n = 10. The p1 and p2 values are those belonging to the hardest zone. In Figure 3, c0.3 t0.7 means p1 = 0.3 and p2 = 0.7. The following table shows the percentage of problems solved and the time required by GSA, and those obtained by NAIS considering 10,000, 50,000 and 75,000 evaluations.

            GSA                  NAIS
Category    50,000 ev. time [s]  10,000 ev. time [s]  50,000 ev. time [s]  75,000 ev. time [s]
c0.3 t0.7    93.33      2.18      87.5       0.68      93.75      1.85      97.5       1.98
c0.5 t0.5    84.4       2.82      91         0.61      98.33      1.08      99.33      0.98
c0.5 t0.7   100         2.76      84         0.61      83.33      2.79      84.33      3.91
c0.7 t0.5    16.7      10.35      87         0.77      87         2.09      90.67      3.34
c0.7 t0.7   100         2.51      80.67      0.66      80.33      2.15      82         4.52
c0.9 t0.5     3.3      13.77      83.67      0.52      86.33      2.15      87.33      4.19
c0.9 t0.7    99.0       1.58      75.33      0.82      75.33      3.92      75.67      6.03

Figure 3. Success rate and CPU time for NAIS and GSA

NAIS has a higher satisfaction rate than GSA, and it converges very quickly to good solutions. Considering just 10,000 evaluations, the average success rate for NAIS was around 83% versus 72% for GSA. However, in some categories GSA outperforms NAIS.

Comparison between NAIS and SAW using Model E: We have generated 250 problem instances in the hard zone using model E. Figure 4 shows the success rate and the time in seconds for both algorithms.

       SAW                   NAIS
p      100,000 ev.           10,000 it.           50,000 it.           75,000 it.
       success  time [s]     success  time [s]    success  time [s]    success  time [s]
0.24   100       0.74        100       0.32       100       0.33       100       0.3
0.25   100       2.33        100       0.44       100       0.43       100       0.4
0.26    97       6.83        100       0.56       100       0.6        100       0.61
0.27    60      11.39        100       1.2        100       1.1        100       1
0.28    25      18.66         98.4     2.06       100       2.26       100       2.05
0.29    17      20.57         84       4.11        99.6     5.24       100       5.41
0.30     5      22.27         47.6     6.99        84.4    17.17        90      21.02
0.31     1      22.47         16.8     8.62        38.4    35.1         46.8    48.91
0.32     0      22.39         24       8.22        59.6    27.64        63.6    37.95
0.33     1      22.38         24.4     8.3         59.6    27.4         68.4    34.85

Figure 4. Success rate and CPU time for NAIS and SAW

We can observe that NAIS outperforms SAW in both time and success rate. Moreover, considering just 10,000 iterations, the average success rate for SAW is 40.6% versus 69.2% for NAIS. For this number of iterations NAIS required, on average, just 4.1 seconds.
Figure 5 shows the results for NAIS and SAW. For NAIS we can observe the phase transition at p = 0.31.

[Figure 5 plots the percentage of successful runs versus p (from 0.23 to 0.34) for NAIS at 10,000, 50,000 and 75,000 iterations and for SAW at 100,000 evaluations.]

Figure 5. Different problems tested, comparison of % successful runs

5 Conclusions

Artificial immune systems have some interesting characteristics from the computational point of view: pattern recognition, affinity evaluation, immune networks and diversity. All of these characteristics have been included in our algorithm. The B-cell structure is useful both to determine the solution of a problem and to identify conflicts. The conflicting antibody is used by the algorithm to guide the repair of the solutions (hypermutation process), giving higher priority to the variables involved in a larger number of conflicts. For the problems in the hardest zone, NAIS using just 10,000 iterations (avg. 4.1 seconds) solved, on average, 28% more problems than SAW, one of the best known evolutionary algorithms. The calibrated NAIS solved more problems than GSA, a sophisticated genetic algorithm which incorporates many concepts from constraint research to solve CSPs. Artificial immune systems are a promising technique for solving constrained combinatorial problems.

6 Future Work

A promising research direction is to incorporate parameter control strategies into the algorithm, since the tuning process used to define the parameter values for NAIS has been a time consuming task.
References
[1] de Castro L.N. and Timmis J., Artificial Immune Systems: A New Computational
Intelligence Approach, Ed. Springer, 2002.
[2] de Castro L.N. and von Zuben F.J., Learning and Optimization Using the Clonal
Selection Principle, IEEE Transactions on Evolutionary Computation, Vol. 6, No.
3, pp. 239-251, June 2002.
[3] Dasgupta D., Artificial Immune Systems And Their Applications, Springer-Verlag,
2000.
[4] Dozier G., Bowen J. and Homaifar A., Solving Constraint Satisfaction Problems
Using Hybrid Evolutionary Search, IEEE Transactions on Evolutionary Computa-
tion, Vol. 2, No. 1, pp. 23-33, April 1998.
[5] Smith B.M. and Dyer M.E., Locating the phase transition in constraint satis-
faction problems, Artificial Intelligence, 81, pp. 155-181, 1996.
[6] Timmis J. and Neal M., Investigating the Evolution and Stability of a Resource
Limited Artificial Immune System, Proceedings of the IEEE Brazilian Symposium
on Artificial Neural Networks, pp. 84-89, 2000.
[7] Cheeseman P.,Kanefsky B. and Taylor W., Where the Really Hard Problems Are.
Proceedings of IJCAI-91, pp. 163-169, 1991.
[8] Eiben A.E., van Hemert J.I., Marchiori E. and Steenbeek A.G., Solving Binary
Constraint Satisfaction Problems using Evolutionary Algorithms with an Adap-
tive Fitness Function. Proceedings of the 5th International Conference on Parallel
Problem Solving from Nature ( PPSN-V), LNCS 1498, pp. 196-205, 1998.
[9] Mackworth A.K., Consistency in networks of relations. Artificial Intelligence, 8:99-
118, 1977.
[10] Marchiori E., Combining Constraint Processing and Genetic Algorithms for Con-
straint Satisfaction Problems. Proceedings of the 7th International Conference on
Genetic Algorithms ( ICGA97), 1997.
[11] Riff M.-C., A network-based adaptive evolutionary algorithm for CSP, In the book
“Metaheuristics: Advances and Trends in Local Search Paradigms for Optimisa-
tion”, Kluwer Academic Publisher, Chapter 22, pp. 325-339, 1998.
[12] Tsang, E.P.K., Wang, C.J., Davenport, A., Voudouris, C. and Lau,T.L., A family
of stochastic methods for constraint satisfaction and optimization, Proceedings
of the 1st International Conference on The Practical Application of Constraint
Technologies and Logic Programming (PACLP), London, pp. 359-383, 1999.
[13] Solnon C., Ants can solve Constraint Satisfaction Problems, IEEE Transactions
on Evolutionary Computation, 6(4), pages 347-357, 2002.
[14] Craenen B., Eiben A.E., and van Hemert J.I., Comparing Evolutionary Algorithms
on Binary Constraint Satisfaction Problems, IEEE Transactions on Evolutionary
Computation, 7(5):424-444, 2003.
[15] Nannen V. and Eiben A.E., Relevance Estimation and Value Calibration of Evolu-
tionary Algorithm Parameters. Proceedings of the Joint International Conference
for Artificial Intelligence (IJCAI), pp. 975-980, 2007.
[16] Minton S., Johnston M., Philips A. and Laird P., Minimizing conflicts: a heuris-
tic repair method for constraint satisfaction and scheduling problems, Artificial
Intelligence, Vol. 58, pp. 161-205, 1992.
[17] Michalewicz Z., Genetic Algorithms + Data Structures = Evolution Programs,
Springer, 1992.
[18] Garrett S., Parameter-free adaptive clonal selection, IEEE Congress on Evolution-
ary Computation, 2004.
