Review
Particle Swarm Optimization: A Survey of Historical
and Recent Developments with
Hybridization Perspectives
Saptarshi Sengupta * , Sanchita Basak and Richard Alan Peters II
Department of Electrical Engineering and Computer Science, Vanderbilt University, 2201 West End Ave,
Nashville, TN 37235, USA; sanchita.basak@vanderbilt.edu (S.B.); alan.peters@vanderbilt.edu (R.A.P.)
* Correspondence: saptarshi.sengupta@vanderbilt.edu; Tel.: +1-615-678-3419
Received: 1 September 2018; Accepted: 4 October 2018; Published: 10 October 2018
Abstract: Particle Swarm Optimization (PSO) is a metaheuristic global optimization paradigm that
has gained prominence in the last two decades due to its ease of application in unsupervised,
complex multidimensional problems that cannot be solved using traditional deterministic algorithms.
The canonical particle swarm optimizer is based on the flocking behavior and social co-operation
of birds and fish schools and draws heavily from the evolutionary behavior of these organisms.
This paper serves to provide a thorough survey of the PSO algorithm with special emphasis on the
development, deployment, and improvements of its most basic as well as some of the very recent
state-of-the-art implementations. Concepts and directions on choosing the inertia weight, constriction
factor, cognition and social weights and perspectives on convergence, parallelization, elitism, niching
and discrete optimization as well as neighborhood topologies are outlined. Hybridization attempts
with other evolutionary and swarm paradigms in selected applications are covered and an up-to-date
review is put forward for the interested reader.
1. Introduction
The last two decades have seen unprecedented development in the field of Computational
Intelligence with the advent of parallel processing capabilities and the introduction of several powerful
optimization algorithms that make little or no assumption about the nature of the problem. Particle
Swarm Optimization (PSO) is one among many such techniques and has been widely used in
treating ill-structured continuous/discrete, constrained as well as unconstrained function optimization
problems [1]. Much like popular Evolutionary Computing paradigms such as Genetic Algorithms [2]
and Differential Evolution [3], the inner workings of the PSO make sufficient use of probabilistic
transition rules to make parallel searches of the solution hyperspace without explicit assumption
of derivative information. The underlying physical model upon which the transition rules are
based is one of emergent collective behavior arising out of social interaction of flocks of birds
and schools of fish. Since its inception in 1995, PSO has found use in an ever-increasing array of
complex, real-world optimization problems where conventional approaches either fail or render
limited usefulness. Its intuitively simple representation and relatively small number of adjustable
parameters make it a popular choice for many problems that require approximate solutions up to a
certain degree of accuracy. There are, however, several major shortcomings of the basic PSO that
introduce failure modes such as stagnation and convergence to local optima, which have led to
extensive studies (such as [4,5]) aimed at their mitigation and resolution. In this review, the foundations and
frontiers of advances in PSO have been reported with a thrust on significant developments over the
last decade. The remainder of the paper is organized sequentially as follows: Section 2 provides a
historical overview and motivation for the Particle Swarm Optimization algorithm, Section 3 outlines
the working mechanism of PSO, and Section 4 details perspectives on historical and recent advances
along with a broad survey of hybridization approaches with other well-known evolutionary algorithms.
Section 5 reviews niche formation and multi-objective optimization discussing formation of niches in
PSO and niching in dynamic environments. This is followed in Section 6 by an informative review
of the applications of PSO in discrete optimization problems and in Section 7 by notes on ensemble
optimizers. Section 8 presents notes on benchmark solution quality and performance comparison
practices and finally, Section 9 outlines future directions and concludes the paper.
contingent on the current position being less than the best position and vice-versa (Shi and Eberhart,
1998) [8].
The random number generator was originally multiplied by 2 in [1] so that particles would
overshoot the target in the search space half of the time. These values of the constants,
known as the cognition and social acceleration coefficients, were found to produce superior performance
compared with previous versions. Since its introduction in 1995, the PSO algorithm has undergone numerous
improvements and extensions aimed at guaranteeing convergence, preserving and improving diversity
as well as offsetting the inherent shortcomings by hybridizing with parallel EC paradigms.
4. Perspectives on Development
ω_t = (ω_max − ω_min) × (t_max − t)/t_max + ω_min    (2)
where t_max is the maximum number of iterations, t is the current iteration and ω_t is the value of the
inertia weight in the t-th iteration.
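As a concrete illustration, Equation (2) translates into a few lines of Python; the bounds ω_max = 0.9 and ω_min = 0.4 are common choices from the literature rather than values mandated by the equation:

```python
def inertia_weight(t, t_max, w_max=0.9, w_min=0.4):
    """Linearly decreasing inertia weight of Equation (2)."""
    return (w_max - w_min) * (t_max - t) / t_max + w_min

# The weight decays from w_max at t = 0 to w_min at t = t_max.
print([round(inertia_weight(t, 100), 3) for t in (0, 50, 100)])  # [0.9, 0.65, 0.4]
```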
There are some implementations that examine the effects of increasing the inertia weight from an
initial low value to a high value; the interested reader should refer to [13,14].
ω_{t+1} = (ω_t − 0.4) × (t_max − t)/t_max + 0.4    (3)
where ω_{t=0} = 0.9 is the initial choice of ω. Clerc introduced the concept of relative improvement of
the swarm in developing an adaptive inertia weight [16]. The change in the inertia of the swarm is in
proportion to the relative improvement of the swarm. The relative improvement κ_i^t is estimated by:
κ_i^t = (f(lbest_i^t) − f(x_i^t)) / (f(lbest_i^t) + f(x_i^t))    (4)
ω_{t+1} = ω_0 + (ω_{t_max} − ω_0) × (e^{m_i(t)} − 1)/(e^{m_i(t)} + 1)    (5)
where ω_{t_max} = 0.5 and ω_0 < 1. Each particle has a unique inertia depending on its distance from the
local best position.
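A minimal sketch of this adaptive rule follows; it assumes that the exponent m_i(t) in Equation (5) is the relative improvement κ_i^t of Equation (4), which is how the two equations read together here:

```python
import math

def relative_improvement(f_lbest, f_x):
    """Relative improvement kappa_i^t of Equation (4)."""
    return (f_lbest - f_x) / (f_lbest + f_x)

def adaptive_inertia(m, w0=0.9, w_tmax=0.5):
    """Per-particle inertia of Equation (5); w_tmax = 0.5 and w0 < 1 per the text."""
    return w0 + (w_tmax - w0) * (math.exp(m) - 1.0) / (math.exp(m) + 1.0)

# A particle showing little improvement relative to its local best keeps a higher inertia.
print(round(adaptive_inertia(relative_improvement(10.0, 9.0)), 3))
```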
v_x[][] = χ × (v_x[][] + Ω_1 × rand() × (pBest[][] − present_x[][]) + Ω_2 × rand() × (pBest[][gbest] − present_x[][]))    (6)

where the constriction coefficient χ takes the standard Clerc-Kennedy form

χ = 2ν / |2 − Ω − √(Ω² − 4Ω)|    (7)

Ω = Ω_1 + Ω_2    (8)
Ω_1 and Ω_2 can be split into products of the cognitive and social acceleration coefficients c_1 and c_2
and random numbers r_1 and r_2. Under the operating constraint that Ω ≥ 4 and ν ∈ [0,1], swarm convergence
is guaranteed, with particles decelerating as the iteration count increases. The parameter ν controls the
local or global search scope of the swarm. For example, when ν is set close to 1, particles traverse the
search space with a predominant emphasis on exploration. This leads to slow convergence and a high
degree of accuracy in finding the optimum solution, as opposed to when ν is close to zero in which
case the convergence is fast but the solution quality may vary vastly. This approach of constricting the
velocities is equivalent in significance to the inertia weight variation, given its impact on determining
solution quality across neighborhoods in the search space. Empirical studies in [21] demonstrated that
faster convergence rates are achieved when velocity constriction is used in conjunction with clamping.
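A one-dimensional sketch of the constricted update is given below; the Clerc-Kennedy form assumed for Equation (7) and the optional clamping step (cf. [21]) are noted in the comments:

```python
import math
import random

def constriction_coefficient(omega, nu=1.0):
    """Constriction chi of Equation (7); valid for omega >= 4 and nu in [0, 1]."""
    return 2.0 * nu / abs(2.0 - omega - math.sqrt(omega * omega - 4.0 * omega))

def constricted_velocity(v, x, pbest, gbest, c1=2.05, c2=2.05, v_max=None):
    """One-dimensional velocity update of Equation (6); Omega is split as c1*r1 + c2*r2."""
    chi = constriction_coefficient(c1 + c2)          # Equation (8): Omega = Omega1 + Omega2
    v_new = chi * (v + c1 * random.random() * (pbest - x)
                     + c2 * random.random() * (gbest - x))
    if v_max is not None:                            # optional clamping, cf. [21]
        v_new = max(-v_max, min(v_max, v_new))
    return v_new

# With c1 = c2 = 2.05 and nu = 1, chi is roughly 0.7298, the widely quoted value.
print(round(constriction_coefficient(4.1), 4))
```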
The cognition model performs a local search in the region where the swarm members are
initialized and tends to report suboptimal solutions if the acceleration component and upper bounds
on velocity are small. Due to its weak exploratory ability, it is also slow in convergence. This was
reported by Kennedy [22] and subsequently the subpar performance of the model was confirmed by
the works of Carlisle and Dozier [23]. The social model, on the other hand, considers only the social
component.
v_{t+1}[][] = v_t[][] + C_1 × rand() × (pBest[][gBest] − present_x[][])    (10)
In this model, the particles are attracted towards the global best in the feasible neighborhood and
converge faster, with predominantly exploitative behavior. This was reported by Kennedy [22] and
confirmed by Carlisle and Dozier [23].
Choice of Values
In general, the values of C_1 and C_2 are kept constant. An empirically determined optimum pair is
2.05 for each of C_1 and C_2; significant departures or incorrect initializations lead to divergent
behavior. Ratnaweera et al. suggested that C_1 should be decreased linearly over time, whereas C_2
should be increased linearly [27]. Clerc's fuzzy acceleration scheme reports improvements by adaptively
refining the coefficient values using the swarm diversity and the current iteration [16].
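A sketch of such a time-varying acceleration coefficient schedule follows; the boundary values 2.5 → 0.5 for C_1 and 0.5 → 2.5 for C_2 are the ones commonly reported in connection with [27] and should be read as assumptions:

```python
def tvac_coefficients(t, t_max, c1_init=2.5, c1_final=0.5, c2_init=0.5, c2_final=2.5):
    """C1 decreases and C2 increases linearly over the run, after Ratnaweera et al. [27]."""
    frac = t / t_max
    c1 = c1_init + (c1_final - c1_init) * frac
    c2 = c2_init + (c2_final - c2_init) * frac
    return c1, c2

# Early iterations favor the cognitive term; late iterations favor the social term.
print(tvac_coefficients(0, 100), tvac_coefficients(100, 100))  # (2.5, 0.5) (0.5, 2.5)
```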
4.5. Topologies
The topology of the swarm of particles establishes a measure of the degree of connectivity of its
members to the others. It essentially describes a subset of particles with whom a particle can initiate
information exchange [28]. The original PSO outlined two topologies that led to two variants of the
algorithm: lBest PSO and gBest PSO. The lBest variant associates with each particle a neighborhood
comprising a fraction of the total number of particles. This structure leads to multiple best particles,
one in each neighborhood, and consequently the velocity update equation of the PSO has multiple social
attractors. Under such circumstances, the swarm is not attracted towards any single global best but
rather towards a combination of subswarm bests. This brings down the convergence speed but significantly
increases the chance of finding the global optimum. In the gBest variant, all particles simultaneously
influence the social component of the velocity update in the swarm, thereby leading to an increased
convergence speed and a potential stagnation at local optima if the true global optimum is not where
the best particle of the neighborhood is.
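A minimal sketch of an lBest ring sociometry follows; with k large enough to span the swarm, the same code degenerates to the gBest variant:

```python
def ring_neighborhood(i, n, k=1):
    """Indices of particle i's neighborhood: i plus k particles on each side of the ring."""
    return [(i + d) % n for d in range(-k, k + 1)]

def neighborhood_best(i, pbest_costs, k=1):
    """Index of the lowest-cost personal best in particle i's ring neighborhood."""
    n = len(pbest_costs)
    return min(ring_neighborhood(i, n, k), key=lambda j: pbest_costs[j])

# Particle 0 in a swarm of 5 sees particles 4, 0 and 1 when k = 1.
print(ring_neighborhood(0, 5))  # [4, 0, 1]
```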
There have been some fundamental contributions to the development of PSO topologies over the
last two decades [29–31]. A host of PSO topologies have risen out of these efforts, most notably the
Random Topology PSO, The Von-Neumann Topology PSO, The Star Topology PSO and the Toroidal
Topology PSO. In [31], Mendes et al. studied several different sociometries with a population size of 20,
quantifying the effect of including an individual's own past experience by implementing each sociometry
with and without the particle of interest. Interested readers can also refer to the recent work by Liu et
al. to gain an understanding of topology selection in PSO-driven optimization environments [32].
1 > ω > (Ω_1 + Ω_2)/2 − 1 ≥ 0    (11)
The above relation can be simplified by replacing the stochastic factors with the acceleration
coefficients C1 and C2 such that when C1 and C2 are chosen to satisfy the condition in Equation (12),
the swarm converges.
1 > ω > (C_1 + C_2)/2 − 1 ≥ 0    (12)
Studies in [19,34,35] also led to the implication that a particle may converge to a single point X′,
a stochastic attractor lying on the segment joining pBest and gBest. This point may not be an optimum
and particles may prematurely converge to it.
X′ = (Ω_1 × pBest + Ω_2 × gBest) / (Ω_1 + Ω_2)    (13)
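The condition of Equation (12) and the attractor of Equation (13) translate directly into code; a small sketch, with Clerc's commonly quoted parameter values as the test case:

```python
def satisfies_convergence_condition(omega, c1, c2):
    """Parameter check of Equation (12): 1 > omega > (C1 + C2)/2 - 1 >= 0."""
    lower = (c1 + c2) / 2.0 - 1.0
    return 1.0 > omega > lower >= 0.0

def stochastic_attractor(pbest, gbest, omega1, omega2):
    """Weighted attractor X' of Equation (13) for a single dimension."""
    return (omega1 * pbest + omega2 * gbest) / (omega1 + omega2)

print(satisfies_convergence_condition(0.7298, 1.49618, 1.49618))  # True
```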
v_ij(t + 1) = ω × v_ij(t) + r_1(t) × C_1 × (pbest_ij(t) − x_ij(t)) + r_2(t) × C_2 × (gbest(t) − x_ij(t))    (14)
r_1 and r_2 are independent and identically distributed random numbers whereas C_1 and C_2 are the
cognition and social acceleration coefficients. x_ij and v_ij are the position coordinate and velocity of
the i-th agent in the j-th dimension. pbest_ij(t) and gbest(t) represent the personal and global best locations
in the t-th iteration. The first term on the right-hand side of Equation (14) makes use of ω, the
inertia weight, and the next two terms are excitations towards promising regions in the search
space as reported by the personal and global best locations. The new position then follows from the
updated velocity through Equation (15):

x_ij(t + 1) = x_ij(t) + v_ij(t + 1)    (15)

The personal best replacement procedure, assuming a function minimization objective, is given in
Equation (16). The global best gBest(t) is the minimum cost bearing element of the temporal set of
personal bests pBest_i(t) of all particles over all iterations.
pBest_i(t + 1) = x_i(t + 1), if cost(x_i(t + 1)) < cost(pBest_i(t))
pBest_i(t + 1) = pBest_i(t), otherwise    (16)
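Equations (14)-(16) together define one full gBest PSO iteration. The following self-contained Python sketch assembles them; the parameter values, velocity clamping and the sphere objective are illustrative assumptions rather than prescriptions from the text:

```python
import random

def pso(cost, dim, n_particles=30, t_max=200, w=0.7, c1=1.5, c2=1.5,
        lo=-5.0, hi=5.0, v_max=1.0):
    """Minimal gBest PSO implementing Equations (14)-(16); defaults are illustrative."""
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]
    pbest_cost = [cost(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]

    for _ in range(t_max):
        for i in range(n_particles):
            for j in range(dim):
                r1, r2 = random.random(), random.random()
                # Equation (14): inertia + cognitive + social excitation
                v[i][j] = (w * v[i][j]
                           + r1 * c1 * (pbest[i][j] - x[i][j])
                           + r2 * c2 * (gbest[j] - x[i][j]))
                v[i][j] = max(-v_max, min(v_max, v[i][j]))   # velocity clamping
                x[i][j] += v[i][j]                           # Equation (15): position update
            c = cost(x[i])
            if c < pbest_cost[i]:                            # Equation (16): pbest replacement
                pbest[i], pbest_cost[i] = x[i][:], c
                if c < gbest_cost:                           # gbest = minimum-cost pbest
                    gbest, gbest_cost = x[i][:], c
    return gbest, gbest_cost

# Example: minimize the sphere function in 5 dimensions.
best, best_cost = pso(lambda p: sum(u * u for u in p), dim=5)
```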
Moussa and Azar applied a PSO-GA approach to classify software modules as fault-prone
or not using object-oriented metrics in [44]. Nik et al. used GA-PSO, PSO-GA and a collection of
other hybridization approaches to optimize surveyed asphalt pavement inspection units in massive
networks [45]. Premalatha and Natarajan [46] proposed a discrete version of PSO with embedded GA
operators for clustering purposes. The GA operator initiates reproduction when particles stagnate.
This version of the hybrid algorithm was named DPSO with mutation-crossover.
In [47] Abdel-Kader proposed a GAI-PSO hybrid algorithm for k-means clustering.
The exploration ability of the algorithm was used first to find an initial kernel of solutions containing
cluster centroids which was subsequently used by the k-means in a local search. For treating
constrained optimization problems, Garg used a PSO to operate in the direction of improving the
vector while using GA to update decision vectors [48]. In [49], Zhang et al. carried out experimental
investigations to optimize the performance of a four-cylinder, turbocharged, direct-injection diesel
engine. A hybrid PSO and GA method with a small population was tested to optimize five operating
parameters, including EGR rate, pilot timing, pilot ratio, main injection timing, and injection pressure.
Results demonstrated significant speed-up and superior optimization as compared to GA. Li et al.
developed a mathematical model of the heliostat field and optimized it using PSO-GA to determine
the highest potential daily energy collection (DEC) in [50]. Results indicated that DEC during the
spring equinox, summer solstice, autumnal equinox and winter solstice increased approximately by
1.1 × 105 MJ, 1.8 × 105 MJ, 1.2 × 105 MJ and 0.9 × 105 MJ, respectively.
A brief listing of some of the important hybrid algorithms combining GA and PSO is given below in
Table 1.
Das et al. scrapped the cognitive component of the velocity update equation in PSO and replaced
it with a weighted difference vector of positions of any two different particles chosen randomly from
the population [68]. The modified algorithm was used to optimize well-known benchmarks as well
as constrained optimization problems. The authors demonstrated the superiority of the proposed
method, achieved through a synergism between the underlying popular multi-agent search processes:
the PSO and DE. Luitel and Venayagamoorthy [69] used a DEPSO optimizer to design linear phase
Finite Impulse Response (FIR) filters. Two different fitness functions were considered: one based on
passband and stopband ripple, the other on the MSE between the desired and actual responses. While
promising results were obtained with respect to performance and convergence time, it was noted
that the DEPSO algorithm could also be applied to the personal best position, instead of the global
best. Vaisakh et al. [70] came up with a DEPSO algorithm to achieve optimal reactive power dispatch
with reduced network loss and enhanced voltage stability. The IEEE 30-bus test system was used to
illustrate its effectiveness and the results confirmed the superiority of the proposed algorithm. Huang et al. [71]
studied the back analysis of mechanics parameters using DEPSO-ParallelFEM—a hybrid method
using the advantages of DE fused with PSO and Finite Element Method (FEM). The DEPSO algorithm
enhances the ability to escape local minima and the FEM increases computational efficiency and
precision through involvement of Cluster of Workstation (COW), MPI (Message Passing Interface),
Domain Decomposition Method (DDM) [72,73] and object-oriented Programming (OOP) [74,75].
A computational example supports the claim that it is an efficient method to estimate and back analyze
the mechanics parameters of systems.
Xu et al. [76] applied their proposed variant of DEPSO on data clustering problems. Empirical
results obtained on synthetic and real datasets showed that DEPSO converged faster than either PSO
or DE used alone. Xiao and Zuo [77] used a multipopulation strategy to diversify the population,
assigning each subpopulation to a different peak and subsequently using a hybrid
DEPSO operator to find the optima in each. Tests on the Moving Peaks Benchmark (MPB) problem
resulted in significantly better average offline error than competitor techniques. Junfei et al. [78]
used DEPSO for mobile robot localization purposes whereas Sahu et al. [79] proposed a new fuzzy
Proportional–Integral Derivative (PID) controller for automatic generation control of interconnected
power systems. Seyedmahmoudian et al. [80] used DEPSO to detect maximum power point under
partial shading conditions. The proposed technique worked well in achieving the Global Maximum
Power Point (GMPP): simulation and experimental results verified this under different partial shading
conditions and as such its reliability in tracking the global optima was established. Gomes and
Saraiva [81] described a hybrid evolutionary tool to solve the Transmission Expansion Planning
problem. The procedure is phased out in two parts: first equipment candidates are selected using a
Constructive Heuristic Algorithm and second, a DEPSO optimizer is used for final planning. A case
study based on the IEEE 24-Bus Reliability Test System using the DEPSO approach yielded solutions
of acceptable quality with low computational effort.
Boonserm and Sitjongsataporn [82] put together DE, PSO and Artificial Bee Colony (ABC) [83]
coupled with self-adjustment weights determined using a sigmoidal membership function. DE
helped eliminate the chance of premature convergence and PSO sped up the optimization process.
The inherent ABC operators helped avoid suboptimal solutions by looking for new regions when
fitness did not improve. A comparative analysis of DE, PSO, ABC and the proposed DEPSO-Scout over
benchmark functions such as Rosenbrock, Rastrigin, and Ackley was performed to support the claim
that the new metaheuristic performed better than the component paradigms viz. PSO, DE, and ABC.
A brief listing of some of the important hybrid algorithms combining DE and PSO is given below in
Table 2.
Niknam et al. [98] made use of a proposed PSO-SA to solve the Dynamic Optimal Power
Flow (DOPF) problem with prohibited zones, valve-point effects and ramp-rate constraints taken
into consideration. The IEEE 30-bus test system was used to show the effectiveness of the
PSO-SA in searching the possible solutions to the highly nonlinear and nonconvex DOPF problem.
Sudibyo et al. [99] used SA-PSO for controlling temperatures of the trays in Methyl tert-Butyl Ether
(MTBE) reactive distillation in a Nonlinear Model Predictive Control (NMPC) problem and noted the
efficiency of the algorithm in finding the optima as a result of hybridization. Wang and Sun [100]
applied a hybrid SA-PSO to the K-Means clustering problem.
Javidrad and Nazari [101] recently contributed a hybrid PSO-SA wherein SA contributes to
updating the global best particle only when PSO shows no improvement in the global best, which
may occur several times during the iteration cycles. The algorithm
uses PSO in its initial phase to determine the global best and when there is no change in the global
best in any particular cycle, passes the information on to the SA phase which iterates until a rejection
takes place using the Metropolis criterion [102]. The new information about the best solution is then
passed back to the PSO phase which again initiates search with the obtained information as the new
global best. This process of sharing is sustained until convergence criteria are satisfied. Li et al. [103]
introduced an efficient energy management scheme in order to increase the fuel efficiency of a Plug-In
Hybrid Electric Vehicle (PHEV).
A brief listing of some of the important hybrid algorithms combining SA and PSO is given below in
Table 3.
to ACO to come up with the hybrid FAPSO-ACO-K algorithm. They compared the results against
PSO-ACO, PSO, SA, TS, GA, ACO, HBMO, PSO-SA, ACO-SA and K-Means, and obtained better
convergence of FAPSO-ACO-K in most cases, provided the number of clusters is known beforehand.
Chen et al. [109] proposed a Genetic Simulated Annealing Ant Colony system infused with
Particle Swarm Optimization. The initial population of the Genetic Algorithm was supplied by ACO,
and the exchange of pheromone information among different groups was controlled by PSO.
Next, a GA with SA-controlled mutation techniques was used to produce superior results. Xiong and
Wang [110] used a two-stage hybrid algorithm (TAPC) combining adaptive ACO and enhanced PSO
to overcome the local optima convergence problem in a K-means clustering application environment.
Kıran et al. [111] came up with a novel hybrid approach (HAP) combining ACO and PSO. While the
individual behaviors of the randomly allocated swarm of particles and colony of ants predominate
initially, the two populations increasingly influence each other through the global best solution, which
is determined by comparing the best solutions of PSO and ACO at each iteration. Huang et al. [112] introduced
continuous Ant Colony Optimization (ACOR) in PSO to develop hybridization strategies based on
four approaches out of which a sequence-based approach using an enlarged pheromone-particle
table proved to be the most effective. ACOR gains greater opportunity to explore the search space
because the solutions generated by PSO are associated with a pheromone-particle table. Mahi et al. [113] came up with
a hybrid approach combining PSO, ACO and 3-opt algorithm where the parameters concerning the
optimization of ACO are determined by PSO and the 3-opt algorithm helps ACO to avoid stagnation
in local optima.
Kefi et al. [114] proposed Ant-Supervised PSO (ASPSO) and applied it to the Travelling Salesman
Problem (TSP), where the optimum values of the ACO parameters α and β, which determine the
influence of the pheromone information relative to the heuristic information, are updated by PSO
instead of being held constant as in traditional ACO. The pheromone amount and the rate of evaporation
are likewise tuned by PSO: thus, with the set of supervised and adjusted parameters given by PSO,
ACO acts as the key optimization methodology. Lazzus et al. [115] modeled vapor–liquid phase
equilibrium by combining similar
attributes of PSO and ACO (PSO+ACO), where the positions discovered by the particles of PSO were
fine-tuned by the ants in the second stage through pheromone-guided techniques.
Mandloi et al. [116] presented a hybrid algorithm with a novel probabilistic search method by
integrating the distance oriented search approach practiced by ants in ACO and velocity oriented
search mechanism adopted by particles in PSO, thereby substituting the pheromone update of ACO
with velocity update of PSO. The probability metric used in this algorithm consists of weighted
heuristic values obtained from transformed distance and velocity through a sigmoid function, ensuring
fast convergence, less complexity and avoidance of stagnation in local optima. Indadul et al. [117]
solved the Travelling Salesman Problem (TSP) by coordinating PSO, ACO and the K-Opt algorithm,
where the preliminary particle swarm is produced by ACO. In the later iterations of PSO, if the position
of a particle does not change for a given interval, the K-Opt algorithm is applied to it to upgrade
the position. Liu et al. [118] relied on the local search capacity of ACO and the global search potential of
PSO and conglomerated them for application in optimizing container truck routes. Junliang et al. [119]
proposed a Hybrid Optimization Algorithm (HOA) that exploits the merit of global search and fast
convergence in PSO and in the event of premature convergence lets ACO take over. With its initial
parameters set by PSO, the algorithm then converges to the optimal solution.
A brief listing of some of the important hybrid algorithms combining PSO and ACO is given below
in Table 4.
cuckoos in the search process. Dash et al. [129] introduced improved cuckoo search particle swarm
optimization (ICSPSO), in which the optimization strategy of the Differential Evolution (DE) algorithm
is incorporated to search effectively around the group best solution vectors at each iteration,
ensuring the global search capability of the hybrid CSPSO; the method was applied to the design of
linear phase multiband stop filters.
A brief listing of some of the important hybrid algorithms combining PSO and CS is given below in
Table 5.
Li et al. [137] introduced PS-ABC, comprising a local search phase of PSO and two global search
phases of ABC. Depending on the extent of aging of its personal best in PSO, each entity at each
iteration adopts one of the PSO, onlooker bee or scout bee phases. This algorithm was shown to be
efficient for high dimensional datasets with faster convergence. Sedighizadeh and Mazaheripour [138]
came up with a PSO-ABC algorithm in which each entity is initially assigned a personal best that is
further refined through a PSO phase and an ABC phase, the best of all personal bests being returned
as the global best at the end. This algorithm was shown to find the optimal route faster in vehicle
routing problems compared to some competing algorithms.
A brief listing of some of the important hybrid algorithms combining PSO and ABC is given below in
Table 6.
between function evaluations. Six benchmark functions were tested, producing improvements in
convergence speed and accuracy over either PSO or BA alone.
An application of a hybrid PSO-BA in medical image registration was demonstrated by
Manoj et al. [149] where the authors noted that the hybrid algorithm was more successful in finding
optimal parameters for the problem as compared to relevant methods already in use.
Table 7. A collection of PSO algorithms hybridized with other approaches such as AIS, BA, FA and GSO.
information exchange among them. On the other hand, master-slave configurations designate a
master processor which assigns slave processors to work on the fitness evaluation of many particles
simultaneously. Early work by Gies and Rahmat-Samii reported a performance gain of eight times
for a parallel implementation on a 10-node system over a serial one [154].
Schutte et al. [155] evaluated a parallel implementation of the algorithm on two types of test problems:
(a) large scale analytical problems with inexpensive function evaluations and (b) medium scale
problems on biomechanical system identification with computationally heavy function evaluations.
The results of experimental analysis under load-balanced and load-imbalanced conditions highlighted
several promising aspects of parallelization. The authors used a synchronous scheme based on a
master-slave approach. The use of data pools in [156], independent evaluation of fitness leading to
establishing the dependency of efficiency on the social information exchange strategy in [157] and
exploration of enhanced topologies for information exchange in multiprocessor architectures in [158]
may be of relevance to an interested reader.
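A minimal sketch of such a synchronous master-slave scheme using Python's multiprocessing module is shown below; the objective function is a stand-in for an expensive evaluation:

```python
from multiprocessing import Pool

def expensive_cost(position):
    """Stand-in objective; in settings like [155] this is a heavy simulation."""
    return sum(u * u for u in position)

def evaluate_swarm(positions, workers=4):
    """The master farms one fitness evaluation per particle out to slave processes
    and waits for all of them to finish (synchronous scheme)."""
    with Pool(processes=workers) as pool:
        return pool.map(expensive_cost, positions)

if __name__ == "__main__":
    swarm = [[float(i), float(-i)] for i in range(8)]
    print(evaluate_swarm(swarm))
```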
Rymut's work on parallel PSO-based Particle Filtering showed how CUDA-capable Graphics
Processing Units (GPUs) can accelerate the performance of object tracking algorithms using adaptive
appearance models [159]. A speedup factor of 40 was achieved by using GPUs over CPUs. Zhang et al.
fused GA and PSO to address the sample impoverishment problem and sample size dependency
in particle filters [160,161]. Chen et al. [162] proposed an efficient parallel PSO algorithm to find
optimal design criteria for Central Composite Discrepancy (CCD) criterion whereas Awwad used a
CUDA-based approach to solve the topology control issue in hybrid radio frequency and wireless
networks in optics [163]. Qu et al. used a serial and parallel implementation of PSO in the Graph
Drawing problem [164] and reported that both methods are as effective as the force-directed method in
the work, with the parallel method being superior to the serial one when large graphs were considered.
Zhou et al. found, using a CUDA implementation of the Standard PSO (SPSO) with a local
topology [165] on four benchmark problems, that the runtime of GPU-SPSO is clearly superior
to that of CPU-SPSO. They also noted that runtime and swarm size assumed a linear relationship in
the case of GPU-SPSO. Mussi et al. reported in [166] an in-depth performance evaluation of two variants of
parallel algorithms with the sequential implementation of PSO over standard benchmark functions.
The study included assessing the computational efficiency of the parallel methods by considering
speedup and scaleup against the sequential version.
was improved by the introduction of the Deflection and Repulsion techniques in [170] by Parsopoulos
and Vrahatis. The nbest PSO by Brits et al. [171] used local neighborhoods based on spatial proximity
and achieved a parallel niching effect in a swarm whereas the NichePSO by the same authors in [172]
achieved multiple solutions to multimodal problems using subswarms generated from the main swarm
when a possible niche was detected. A speciation-based PSO [173] was developed keeping in mind the
classification of particles within a threshold radius from the neighborhood best (also known as the
seed), as those belonging to a particular species. In an extension which sought to eliminate the need
for a user specified radius of niching, the Adaptive Niching PSO (ANPSO) was proposed in [174,175].
It adaptively determines the radius by computing the average distance between each particle and its
closest neighbor. A niche is said to have formed if particles are found to be within the niching radius
for an extended number of iterations in which case particles are classified into two groups: niched and
un-niched. A global PSO is used for information exchange within the niches whereas an lbest PSO
with a Von-Neumann topology is used for the same in the case of un-niched particles. Although ANPSO
eliminates the requirement of specifying a niche radius beforehand, the solution quality may be
sensitive to the newly introduced parameters. The Fitness Euclidean Distance Ratio PSO (FER-PSO)
proposed by Li [176] uses a memory swarm alongside an explorer swarm to guide particles towards the
promising regions in the search space. The memory swarm is constructed out of personal bests found
so far whereas the explorer swarm is constructed out of the current positions of the particles. Each
particle is attracted towards the fittest and closest point in its neighborhood, obtained by computing
its fitness-Euclidean distance ratio (FER). FER-PSO introduces a scaling parameter in the computation
of FER; however, it can reliably locate all global optima when population sizes are large. Clustering techniques
such as k-means have been incorporated into the PSO framework by Kennedy [177] as well as by
Passaro and Starita [178] who used the Bayesian Information Criterion (BIC) [179] to estimate the
parameter k and found the approach comparable to SPSO and ANPSO.
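A sketch of the FER-based neighbor selection follows; the maximization orientation, the form of the scaling parameter α and the use of the search-space diagonal reflect common descriptions of [176] and should be read as assumptions:

```python
import math

def fer_neighbor(i, pbests, f_vals, diagonal=1.0):
    """Index of the 'fittest-and-closest' personal best for particle i:
    the neighbor j maximizing FER(j, i) = alpha * (f_j - f_i) / ||p_j - p_i||."""
    alpha = diagonal / (max(f_vals) - min(f_vals) + 1e-12)  # scaling parameter (assumed form)
    best_j, best_fer = i, -math.inf
    for j in range(len(pbests)):
        if j == i:
            continue
        fer = alpha * (f_vals[j] - f_vals[i]) / (math.dist(pbests[i], pbests[j]) + 1e-12)
        if fer > best_fer:
            best_j, best_fer = j, fer
    return best_j
```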
6.2. Binarization
A widely used binarization approach maps the updated velocity at the end of an iteration into
the closed interval [0,1] using a sigmoid function. The sigmoid output is interpreted as the probability
that the updated position takes the value 1, since a high enough velocity drives the sigmoid output
towards 1. The value of the maximum velocity is often clamped to a low value to make sure there
is a chance of a reversal of the sigmoid output value.
equations based on XOR and OR operations in Boolean algebra. Negative selection mechanisms in
immune systems inspire a velocity bounding constraint on such approaches. Deligkaris et al. [185] used
a mutation operator on particle velocities to render better exploration capabilities to the Binary PSO.
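A sketch of this rule for a single bit follows; the clamp V_max = 4 is a commonly quoted choice, used here as an assumption:

```python
import math
import random

def binary_update(velocity, v_max=4.0):
    """Binary PSO position rule: clamp the velocity, squash it with a sigmoid,
    and sample the new bit with the resulting probability."""
    v = max(-v_max, min(v_max, velocity))   # a low V_max keeps P(bit = 1) away from 0 and 1
    p_one = 1.0 / (1.0 + math.exp(-v))      # sigmoid maps velocity into [0, 1]
    return 1 if random.random() < p_one else 0

# With v = 0 the new bit is equally likely to be 0 or 1.
print(binary_update(0.0))
```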
variants possessed better global exploration capabilities. Further observation also supported the
claim that the variant of PSO with only constriction factor was significantly faster than the one with
only inertia weight. These results affirmed that the performance of the variants was not affected
by truncation of the real parameter values of the particles. Yare and Venayagamoorthy [194] used a
discrete PSO for optimal scheduling of generator maintenance, Eajal and El-Hawary [195] approached
the problem of optimal placement and sizing of capacitors in unbalanced distribution systems with the
consideration of including harmonics. More recently, Phung et al. [196] used a discretized version of
PSO for path planning in UAV vision-based surface inspection, and Gong et al. [197] attempted influence
maximization in social networks. Aminbakhsh and Sonmez [198] presented a discrete particle swarm
optimization (DPSO) for an effective solution to large-scale discrete time-cost trade-off problem
(DTCTP). The authors noted that the experiments provided high-quality solutions for the time-cost
optimization of large projects within seconds and enabled optimal planning of real-life-size
projects. Li et al. [199] modeled complex network clustering as a multiobjective optimization problem
and applied a quantum inspired discrete particle swarm optimization algorithm with non-dominated
sorting for individual replacement to solve it. Experimental results illustrated its competitiveness
against some state-of-the-art approaches on the extensions of Girvan and Newman benchmarks [200]
as well as many real-world networks. Ates et al. [201] presented a discrete Infinite Impulse Response
(IIR) filter design method for approximate realization of fractional order continuous filters using a
Fractional Order Darwinian Particle Swarm Optimization (FODPSO).
[Figure: result panels for benchmark functions f5–f8.]
Function  Performance  PSO [208]     PSO [209]     PSO [210]     PSOGSA [210]   DEPSO [150]
f1        Mean         1.36 × 10−4   1.8 × 10−3    2.83 × 10−4   6.66 × 10−19   1.60 × 10−26
          St. Dev      2.02 × 10−4   NR            NR            NR             6.56 × 10−26
f2        Mean         4.21 × 10−2   2.0 × 10+0    5.50 × 10−3   3.79 × 10−19   2.89 × 10−13
          St. Dev      4.54 × 10−2   NR            NR            NR             1.54 × 10−12
f3        Mean         7.01 × 10+1   4.1 × 10+3    5.19 × 10+3   4.09 × 10+2    3.71 × 10−1
          St. Dev      2.21 × 10+1   NR            NR            NR             2.39 × 10−1
f4        Mean         9.67 × 10+1   3.6 × 10+4    2.01 × 10+2   5.62 × 10+1    4.20 × 10+1
          St. Dev      6.01 × 10+1   NR            NR            NR             3.28 × 10+1
f5        Mean         −4.84 × 10+3  −9.8 × 10+3   −5.92 × 10+3  −1.22 × 10+4   4.68 × 10+3
          St. Dev      1.15 × 10+3   NR            NR            NR             9.42 × 10+2
f6        Mean         4.67 × 10+1   5.51 × 10+1   7.23 × 10+1   2.27 × 10+1    4.07 × 10+1
          St. Dev      1.16 × 10+1   NR            NR            NR             1.19 × 10+1
f7        Mean         2.76 × 10−1   9.0 × 10−3    4.85 × 10−10  6.68 × 10−12   2.98 × 10−13
          St. Dev      5.09 × 10−1   NR            NR            NR             1.51 × 10−12
f8        Mean         9.21 × 10−3   1.0 × 10−2    5.43 × 10−3   1.48 × 10−3    1.69 × 10−2
          St. Dev      7.72 × 10−3   NR            NR            NR             1.82 × 10−2
Note: NR: Not Reported.
9. Future Directions
Two decades of development in the Particle Swarm paradigm have seen many upheavals and
successes alike. The task of detecting a global optimum among many local optima, the arbitrary
nature of the search space and the intractability of conventional mathematical abstractions for a wide
range of objective functions, coupled with little or no a priori guarantee of any optimum being found,
make the search process challenging. However, Particle Swarm Optimizers have had their fair share
of success stories: they can be used on any objective function, continuous or discontinuous, tractable
or intractable, even those for which solution quality is sensitive to initialization, as evidenced in the
case of their deterministic counterparts. Nonetheless, some pressing issues, listed below, merit further
work by the PSO community.
1. Parameter sensitivity: The solution quality of metaheuristics like PSO is sensitive to their parametric
evolution. This means that the same strategy of parameter selection does not work for
every problem.
2. Convergence to local optima: Unless the basic PSO is substantially modified to take into account
the modalities of the objective function, more often than not it falls prey to local optima in the
search space for sufficiently complex objective functions.
3. Subpar performance in multi-objective optimization for high dimensional problems: Although
niching techniques render acceptable solutions for multimodal functions in both static and
dynamic environments, the solution quality falls sharply when the dimensionality of the
problem increases.
Ensemble optimizers, although promising, do not address the underlying shortcomings of the
basic PSO. Theoretical issues, such as the particle explosion problem, loss of particle diversity as
well as stagnation to local optima deserve the attention of researchers so that a unified algorithmic
framework with more intelligent self-adaptation and less user-specified customizations can be realized
for future applications.
Author Contributions: S.S. created the structure and organization of the work, reviewed and instituted the
content in all sections and commented on the quantitative aspects of the PSO algorithm. S.B. co-reviewed and
instituted the content in Section 4.8 and commented on the hybridization perspectives in applied problems. Both
S.S. and S.B. contributed to the final version of the manuscript. R.A.P.II advised on the mathematical nature of
the meta-heuristics and provided critical analyses of related work. All authors approve of the final version of
the manuscript.
Funding: This research received no external funding.
Acknowledgments: This work was made possible by the financial and computing support by the Vanderbilt
University Department of EECS. The authors would like to thank the anonymous reviewers for their valuable
comments for further improving the content of this article.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the IEEE International Conference
on Neural Networks, Perth, Australia, 27 November–1 December 1995.
2. Holland, J.H. Adaptation in Natural and Artificial Systems; University of Michigan Press: Ann Arbor, MI, USA,
1975.
3. Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over
continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [CrossRef]
4. Sun, J.; Feng, B.; Xu, W.B. Particle swarm optimization with particles having quantum behavior.
In Proceedings of the IEEE Congress on Evolutionary Computation, Portland, OR, USA, 19–23 June 2004;
pp. 325–331.
5. Sun, J.; Xu, W.B.; Feng, B. A global search strategy of quantum-behaved particle swarm optimization.
In Proceedings of the 2004 IEEE Conference on Cybernetics and Intelligent Systems, Singapore, 1–3 December
2004; pp. 111–116.
6. Reeves, W.T. Particle systems—A technique for modelling a class of fuzzy objects. ACM Trans. Graph. 1983,
2, 91–108. [CrossRef]
7. Reynolds, C.W. Flocks, herds, and schools: A distributed behavioral model. ACM Comput. Graph. 1987, 21,
25–34. [CrossRef]
8. Shi, Y.; Eberhart, R.C. Parameter selection in particle swarm optimization. In Proceedings of the 7th
International Conference on Evolutionary Programming VII, London, UK, 25–27 March 1998.
9. Shi, Y.; Eberhart, R. A modified particle swarm optimizer. In Proceedings of the 1998 IEEE International
Conference on Evolutionary Computation Proceedings, IEEE World Congress on Computational Intelligence,
Anchorage, AK, USA, 4–9 May 1998; pp. 69–73.
10. Eberhart, R.C.; Shi, Y. Particle Swarm Optimization: Developments, Applications and Resources.
In Proceedings of the IEEE Congress on Evolutionary Computation, Seoul, Korea, 27–30 May 2001; Volume 1,
pp. 27–30.
11. Suganthan, P.N. Particle Swarm Optimiser with Neighborhood Operator. In Proceedings of the IEEE
Congress on Evolutionary Computation, Washington, DC, USA, 6–9 July 1999; pp. 1958–1962.
12. Ratnaweera, A.; Halgamuge, S.; Watson, H. Particle Swarm Optimization with Self-Adaptive Acceleration
Coefficients. In Proceedings of the First International Conference on Fuzzy Systems and Knowledge
Discovery, Guilin, China, 14–17 October 2003; pp. 264–268.
13. Zheng, Y.; Ma, L.; Zhang, L.; Qian, J. On the Convergence Analysis and Parameter Selection in Particle
Swarm Optimization. In Proceedings of the International Conference on Machine Learning and Cybernetics,
Xi’an, China, 5 November 2003; Volume 3, pp. 1802–1807.
14. Zheng, Y.; Ma, L.; Zhang, L.; Qian, J. Empirical Study of Particle Swarm Optimizer with Increasing Inertia
Weight. In Proceedings of the IEEE Congress on Evolutionary Computation, Canberra, ACT, Australia, 8–12
December 2003; pp. 221–226.
15. Naka, S.; Genji, T.; Yura, T.; Fukuyama, Y. Practical Distribution State Estimation using Hybrid Particle
Swarm Optimization. In Proceedings of the IEEE Power Engineering Society Winter Meeting, Columbus,
OH, USA, 28 January–1 February 2001; Volume 2, pp. 815–820.
16. Clerc, M. Think Locally, Act Locally: The Way of Life of Cheap-PSO, an Adaptive PSO. Technical Report.
2001. Available online: http://clerc.maurice.free.fr/pso/ (accessed on 8 October 2018).
17. Shi, Y.; Eberhart, R.C. Fuzzy Adaptive Particle Swarm Optimization. In Proceedings of the IEEE Congress on
Evolutionary Computation, Seoul, Korea, 27–30 May 2001; Volume 1, pp. 101–106.
18. Eberhart, R.C.; Simpson, P.K.; Dobbins, R.W. Computational Intelligence PC Tools, 1st ed.; Academic Press
Professional: Cambridge, MA, USA, 1996.
19. Clerc, M.; Kennedy, J. The Particle Swarm-Explosion, Stability and Convergence in a Multidimensional
Complex Space. IEEE Trans. Evol. Comput. 2002, 6, 58–73. [CrossRef]
20. Clerc, M. The Swarm and the Queen: Towards a Deterministic and Adaptive Particle Swarm Optimization.
In Proceedings of the IEEE Congress on Evolutionary Computation, Washington, DC, USA, 6–9 July 1999;
Volume 3, pp. 1951–1957.
21. Eberhart, R.C.; Shi, Y. Comparing Inertia Weights and Constriction Factors in Particle Swarm Optimization.
In Proceedings of the IEEE Congress on Evolutionary Computation, La Jolla, CA, USA, 16–19 July 2000;
Volume 1, pp. 84–88.
22. Kennedy, J. The Particle Swarm: Social Adaptation of Knowledge. In Proceedings of the IEEE International
Conference on Evolutionary Computation, Indianapolis, IN, USA, 13–16 April 1997; pp. 303–308.
23. Carlisle, A.; Dozier, G. Adapting Particle Swarm Optimization to Dynamic Environments. In Proceedings
of the International Conference on Artificial Intelligence, Langkawi, Malaysia, 20–22 September 2000;
pp. 429–434.
24. Stacey, A.; Jancic, M.; Grundy, I. Particle Swarm Optimization with Mutation. In Proceedings of the 2003
Congress on Evolutionary Computation, Canberra, ACT, Australia, 8–12 December 2003; pp. 1425–1430.
25. Jie, X.; Deyun, X. New Metropolis Coefficients of Particle Swarm Optimization. In Proceedings of the 2008
Chinese Control and Decision Conference, Yantai, Shandong, China, 2–4 July 2008; pp. 3518–3521.
26. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by Simulated Annealing. Science 1983, 220, 671–680.
[CrossRef] [PubMed]
27. Ratnaweera, A.; Halgamuge, S.; Watson, H. Particle Swarm Optimization with Time Varying Acceleration
Coefficients. In Proceedings of the International Conference on Soft Computing and Intelligent Systems,
Coimbatore, India, 26–28 July 2002; pp. 240–255.
28. Kennedy, J.; Mendes, R. Population structure and particle swarm performance. In Proceedings of the 2002
Congress on Evolutionary Computation, CEC’02, Honolulu, HI, USA, 12–17 May 2002.
29. Kennedy, J. Small Worlds and Mega-Minds: Effects of Neighbourhood Topology on Particle Swarm
Performance. In Proceedings of the IEEE Congress on Evolutionary Computation, Washington, DC, USA,
6–9 July 1999; Volume 3, pp. 1931–1938.
30. Kennedy, J.; Mendes, R. Population Structure and Particle Swarm. In Proceedings of the IEEE Congress on
Evolutionary Computation, Honolulu, HI, USA, 12–17 May 2002; pp. 1671–1676.
31. Mendes, R.; Kennedy, J.; Neves, J. Watch thy Neighbour or How the Swarm can Learn from its Environment.
In Proceedings of the IEEE Swarm Intelligence Symposium, Indianapolis, IN, USA, 26 April 2003; pp. 88–94.
32. Liu, Q.; Wei, W.; Yuan, H.; Zhan, Z.H.; Li, Y. Topology selection for particle swarm optimization. Inf. Sci.
2016, 363, 154–173. [CrossRef]
33. van den Bergh, F. An Analysis of Particle Swarm Optimizers. Ph.D. Thesis, Department of Computer Science,
University of Pretoria, Pretoria, South Africa, 2002.
34. van den Bergh, F.; Engelbrecht, A.P. A Study of Particle Swarm Optimization Particle Trajectories. Inf. Sci.
2006, 176, 937–971. [CrossRef]
35. Trelea, L.C. The Particle Swarm Optimization Algorithm: Convergence Analysis and Parameter Selection.
Inf. Process. Lett. 2003, 85, 317–325. [CrossRef]
36. Robinson, J.; Sinton, S.; Rahmat-Samii, Y. Particle Swarm, Genetic Algorithm, and Their Hybrids:
Optimization of a Profiled Corrugated Horn Antenna. In Proceedings of the IEEE Antennas and Propagation
Society International Symposium and URSI National Radio Science Meeting, San Antonio, TX, USA, 16–21
June 2002; Volume 1, pp. 314–317.
37. Shi, X.; Lu, Y.; Zhou, C.; Lee, H.; Lin, W.; Liang, Y. Hybrid Evolutionary Algorithms Based on PSO and GA.
In Proceedings of the IEEE Congress on Evolutionary Computation, Rio de Janeiro, Brazil, 13–15 December
2003; Volume 4, pp. 2393–2399.
38. Yang, B.; Chen, Y.; Zhao, Z. A hybrid evolutionary algorithm by combination of PSO and GA for
unconstrained and constrained optimization problems. In Proceedings of the IEEE International Conference
on Control and Automation, Guangzhou, China, 30 May–1 June 2007; pp. 166–170.
39. Li, T.; Xu, L.; Shi, X.W. A hybrid of genetic algorithm and particle swarm optimization for antenna design.
PIERS Online 2008, 4, 56–60.
40. Valdez, F.; Melin, P.; Castillo, O. Evolutionary method combining particle swarm optimization and genetic
algorithms using fuzzy logic for decision making. In Proceedings of the IEEE International Conference on
Fuzzy Systems, Jeju Island, Korea, 20–24 August 2009; pp. 2114–2119.
41. Ghamisi, P.; Benediktsson, J.A. Feature selection based on hybridization of genetic algorithm and particle
swarm optimization. IEEE Geosci. Remote Sens. Lett. 2015, 12, 309–313. [CrossRef]
42. Benvidi, A.; Abbasi, S.; Gharaghani, S.; Tezerjani, M.D.; Masoum, S. Spectrophotometric determination of
synthetic colorants using PSO-GA-ANN. Food Chem. 2017, 220, 377–384. [CrossRef] [PubMed]
43. Yu, S.; Wei, Y.-M.; Wang, K.A. PSO–GA optimal model to estimate primary energy demand of China. Energy
Policy 2012, 42, 329–340. [CrossRef]
44. Moussa, R.; Azar, D. A PSO-GA approach targeting fault-prone software modules. J. Syst. Softw. 2017, 132,
41–49. [CrossRef]
45. Nik, A.A.; Nejad, F.M.; Zakeri, H. Hybrid PSO and GA approach for optimizing surveyed asphalt pavement
inspection units in massive network. Autom. Constr. 2016, 71, 325–345. [CrossRef]
46. Premalatha, K.; Natarajan, A.M. Discrete PSO with GA operators for document clustering. Int. J. Recent
Trends Eng. 2009, 1, 20–24.
47. Abdel-Kader, R.F. Genetically improved PSO algorithm for efficient data clustering. In Proceedings of the
International Conference on Machine Learning and Computing, Bangalore, India, 9–11 September 2010;
pp. 71–75.
48. Garg, H. A hybrid PSO-GA algorithm for constrained optimization problems. Appl. Math. Comput. 2016, 274,
292–305. [CrossRef]
49. Zhang, Q.; Ogren, R.M.; Kong, S.C. A comparative study of biodiesel engine performance optimization
using enhanced hybrid PSO–GA and basic GA. Appl. Energy 2016, 165, 676–684. [CrossRef]
50. Li, C.; Zhai, R.; Liu, H.; Yang, Y.; Wu, H. Optimization of a heliostat field layout using hybrid PSO-GA
algorithm. Appl. Therm. Eng. 2018, 128, 33–41. [CrossRef]
51. Krink, T.; Løvbjerg, M. The lifecycle model: Combining particle swarm optimization, genetic algorithms and
hill climbers. In Proceedings of Parallel Problem Solving from Nature (PPSN VII), 2002; pp. 621–630. [CrossRef]
52. Conradie, E.; Miikkulainen, R.; Aldrich, C. Intelligent process control utilising symbiotic memetic
neuro-evolution. In Proceedings of the IEEE Congress on Evolutionary Computation, Honolulu, HI, USA,
12–17 May 2002; Volume 1, pp. 623–628.
53. Grimaldi, E.A.; Grimacia, F.; Mussetta, M.; Pirinoli, P.; Zich, R.E. A new hybrid genetical—Swarm algorithm
for electromagnetic optimization. In Proceedings of the International Conference on Computational
Electromagnetics and its Applications, Beijing, China, 1–4 November 2004; pp. 157–160.
54. Juang, C.F. A hybrid of genetic algorithm and particle swarm optimization for recurrent network design.
IEEE Trans. Syst. Man Cybern. Part B Cybern. 2004, 34, 997–1006. [CrossRef]
55. Settles, M.; Soule, T. Breeding swarms: A GA/PSO hybrid. In Proceedings of the Genetic and Evolutionary
Computation Conference 2005, Washington, DC, USA, 25–29 June 2005; pp. 161–168.
56. Jian, M.; Chen, Y. Introducing recombination with dynamic linkage discovery to particle swarm optimization.
In Proceedings of the Genetic and Evolutionary Computation Conference 2006, Seattle, WA, USA, 8–12 July
2006; pp. 85–86.
57. Esmin, A.A.; Lambert-Torres, G.; Alvarenga, G.B. Hybrid evolutionary algorithm based on PSO and GA
mutation. In Proceedings of the 6th International Conference on Hybrid Intelligent Systems, Rio de Janeiro,
Brazil, 13–15 December 2006; pp. 57–62.
58. Kim, H. Improvement of genetic algorithm using PSO and Euclidean data distance. Int. J. Inf. Technol. 2006,
12, 142–148.
59. Mohammadi, A.; Jazaeri, M. A hybrid particle swarm optimization-genetic algorithm for optimal location
of SVC devices in power system planning. In Proceedings of the 42nd International Universities Power
Engineering Conference, Brighton, UK, 4–6 September 2007; pp. 1175–1181.
60. Gandelli, A.; Grimaccia, F.; Mussetta, M.; Pirinoli, P.; Zich, R.E. Development and Validation of Different
Hybridization Strategies between GA and PSO. In Proceedings of the 2007 IEEE Congress on Evolutionary
Computation, Singapore, 25–28 September 2007; pp. 2782–2787.
61. Kao, Y.T.; Zahara, E. A hybrid genetic algorithm and particle swarm optimization for multimodal functions.
Appl. Soft Comput. 2008, 8, 849–857. [CrossRef]
62. Kuo, R.J.; Hong, C.W. Integration of genetic algorithm and particle swarm optimization for investment
portfolio optimization. Appl. Math. Inf. Sci. 2013, 7, 2397–2408. [CrossRef]
63. Price, K.; Storn, R. Differential Evolution—A Simple and Efficient Adaptive Scheme for Global Optimization Over
Continuous Spaces; Technical Report; International Computer Science Institute: Berkeley, CA, USA, 1995.
64. Hendtlass, T. A Combined Swarm differential evolution algorithm for optimization problems. In Lecture
Notes in Computer Science, Proceedings of 14th International Conference on Industrial and Engineering Applications
of Artificial Intelligence and Expert Systems; Springer Verlag: Berlin/Heidelberg, Germany, 2001; Volume 2070,
pp. 11–18.
65. Zhang, W.J.; Xie, X.F. DEPSO: Hybrid particle swarm with differential evolution operator. In Proceedings
of the IEEE International Conference on Systems, Man and Cybernetics (SMCC), Washington, DC, USA, 8
October 2003; pp. 3816–3821.
66. Talbi, H.; Batouche, M. Hybrid particle swarm with differential evolution for multimodal image registration.
In Proceedings of the IEEE International Conference on Industrial Technology, Hammamet, Tunisia, 8–10
December 2004; Volume 3, pp. 1567–1573.
67. Hao, Z.-F.; Gua, G.-H.; Huang, H. A particle swarm optimization algorithm with differential evolution.
In Proceedings of the Sixth International Conference on Machine Learning and Cybernetics, Hong Kong,
China, 19–22 August 2007; pp. 1031–1035.
68. Das, S.; Abraham, A.; Konar, A. Particle swarm optimization and differential evolution algorithms: Technical
analysis, applications and hybridization perspectives. In Advances of Computational Intelligence in Industrial
Systems, Studies in Computational Intelligence; Liu, Y., Sun, A., Loh, H.T., Lu, W.F., Lim, E.P., Eds.; Springer
Verlag: Berlin/Heidelberg, Germany, 2008; pp. 1–38.
69. Luitel, B.; Venayagamoorthy, G.K. Differential evolution particle swarm optimization for digital filter design.
In Proceedings of the Congress on Evolutionary Computation (IEEE World Congress on Computational
Intelligence), Hong Kong, China, 1–6 June 2008; pp. 3954–3961.
70. Vaisakh, K.; Sridhar, M.; Linga Murthy, K.S. Differential evolution particle swarm optimization algorithm for
reduction of network loss and voltage instability. In Proceedings of the IEEE World Congress on Nature and
Biologically Inspired Computing, Coimbatore, India, 9–11 December 2009; pp. 391–396.
71. Huang, H.; Wei, Z.H.; Li, Z.Q.; Rao, W.B. The back analysis of mechanics parameters based on DEPSO
algorithm and parallel FEM. In Proceedings of the International Conference on Computational Intelligence
and Natural Computing, Wuhan, China, 6–7 June 2009; pp. 81–84.
72. Malone, J.G. Automated Mesh Decomposition and Concurrent Finite Element Analysis for Hypercube
Multiprocessor Computers. Comput. Methods Appl. Mech. Eng. 1988, 70, 27–58.
73. Farhat, C. Implementation Aspects of Concurrent Finite Element Computations. In Parallel Computations and Their
Impact on Computational Mechanics; ASME: New York, NY, USA, 1987.
74. Rehak, D.R.; Baugh, J.W. Alternative Programming Techniques for Finite Element Program Development.
In Proceedings of the IABSE Colloquium on Expert Systems in Civil Engineering, Bergamo, Italy, 16–20
October 1989.
75. Logozzo, F. Modular Static Analysis of Object-Oriented Languages. Ph.D. Thesis, Ecole Polytechnique, Paris,
France, June 2004.
76. Xu, R.; Xu, J.; Wunsch, D.C., II. Clustering with differential evolution particle swarm optimization.
In Proceedings of the IEEE Congress on Evolutionary Computation, Barcelona, Spain, 18–23 July 2010;
pp. 1–8.
77. Xiao, L.; Zuo, X. Multi-DEPSO: A DE and PSO Based Hybrid Algorithm in Dynamic Environments.
In Proceedings of the WCCI 2012 IEEE World Congress on Computational Intelligence, Brisbane, Australia,
10–15 June 2012.
78. Junfei, H.; Liling, M.A.; Yuandong, Y.U. Hybrid Algorithm Based Mobile Robot Localization Using DE and
PSO. In Proceedings of the 32nd International Conference on Control and Automation, Xi’an, China, 26–28
July 2013; pp. 5955–5959.
79. Sahu, B.K.; Pati, S.; Panda, S. Hybrid differential evolution particle swarm optimisation optimised fuzzy
proportional–integral derivative controller for automatic generation control of interconnected power system.
IET Gen. Transm. Distrib. 2014, 8, 1789–1800. [CrossRef]
80. Seyedmahmoudian, M.; Rahmani, R.; Mekhilef, S.; Oo, A.M.T.; Stojcevski, A.; Soon, T.K.; Ghandhari, A.S.
Simulation and hardware implementation of new maximum power point tracking technique for partially
shaded PV system using hybrid DEPSO method. IEEE Trans. Sustain. Energy 2015, 6, 850–862. [CrossRef]
81. Gomes, P.V.; Saraiva, J.T. Hybrid Discrete Evolutionary PSO for AC Dynamic Transmission Expansion
Planning. In Proceedings of the 2016 IEEE International Energy Conference (ENERGYCON), Leuven,
Belgium, 4–8 April 2016.
82. Boonserm, P.; Sitjongsataporn, S. A robust and efficient algorithm for numerical optimization problem:
DEPSO-Scout: A new hybrid algorithm based on DEPSO and ABC. In Proceedings of the 2017 International
Electrical Engineering Congress, Pattaya, Thailand, 8–10 March 2017; pp. 1–4.
83. Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial
bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471. [CrossRef]
84. Zhao, F.; Zhang, Q.; Yu, D.; Chen, X.; Yang, Y. A hybrid algorithm based on PSO and simulated annealing
and its applications for partner selection in virtual enterprises. Adv. Intell. Comput. 2005, 3644, 380–385.
85. Yang, G.; Chen, D.; Zhou, G. A new hybrid algorithm of particle swarm optimization. Lect. Notes Comput. Sci.
2006, 4115, 50–60.
86. Gao, H.; Feng, B.; Hou, Y.; Zhu, L. Training RBF neural network with hybrid particle swarm optimization.
In ISNN 2006; Wang, J., Yi, Z., Zurada, J.M., Lu, B.-L., Yin, H., Eds.; Springer: Heidelberg,
Germany, 2006; Volume 3971, pp. 577–583.
87. Lichman, M. UCI Machine Learning Repository; University of California, School of Information and Computer
Science: Irvine, CA, USA, 2013.
88. Chu, S.C.; Tsai, P.; Pan, J.S. Parallel Particle Swarm Optimization Algorithms with Adaptive Simulated Annealing;
Studies in Computational Intelligence Book Series; Springer: Berlin/Heidelberg, Germany, 2006; Volume 31,
pp. 261–279.
89. Sadati, N.; Amraee, T.; Ranjbar, A. A global particle swarm-based-simulated annealing optimization
technique for under-voltage load shedding problem. Appl. Soft Comput. 2009, 9, 652–657. [CrossRef]
90. Ma, P.C.; Tao, F.; Liu, Y.L.; Zhang, L.; Lu, H.X.; Ding, Z. A hybrid particle swarm optimization and simulated
annealing algorithm for job-shop scheduling. In Proceedings of the 2014 IEEE International Conference on
Automation Science and Engineering (CASE), Taipei, Taiwan, 18–22 August 2014; pp. 125–130.
91. Ge, H.; Du, W.; Qian, F. A Hybrid Algorithm Based on Particle Swarm Optimization and Simulated Annealing
for Job Shop Scheduling. In Proceedings of the Third International Conference on Natural Computation
(ICNC 2007), Haikou, China, 24–27 August 2007; pp. 715–719.
92. Zhang, X.-F.; Koshimura, M.; Fujita, H.; Hasegawa, R. An efficient hybrid particle swarm optimization for the
job shop scheduling problem. In Proceedings of the 2011 IEEE International Conference on Fuzzy Systems,
Taipei, Taiwan, 27–30 June 2011; pp. 622–626.
93. Song, X.; Cao, Y.; Chang, C. A Hybrid Algorithm of PSO and SA for Solving JSP. In Proceedings of the
2008 Fifth International Conference on Fuzzy Systems and Knowledge Discovery, Shandong, China, 18–20
October 2008; pp. 111–115.
94. Dong, X.; Ouyang, D.; Cai, D.; Zhang, Y.; Ye, Y. A hybrid discrete PSO-SA algorithm to find optimal
elimination orderings for Bayesian networks. In Proceedings of the 2010 2nd International Conference on
Industrial and Information Systems, Dalian, China, 10–11 July 2010; pp. 510–513.
95. Shieh, H.-L.; Kuo, C.-C.; Chiang, C.-M. Modified particle swarm optimization algorithm with simulated
annealing behavior and its numerical verification. Appl. Math. Comput. 2011, 218, 4365–4383. [CrossRef]
96. Idoumghar, L.; Melkemi, M.; Schott, R.; Aouad, M.I. Hybrid PSO-SA Type Algorithms for Multimodal
Function Optimization and Reducing Energy Consumption in Embedded Systems. Appl. Comput. Intell.
Soft Comput. 2011, 2011, 138078. [CrossRef]
97. Tajbakhsh, A.; Eshghi, K.; Shamsi, A. A hybrid PSO-SA algorithm for the travelling tournament problem.
Eur. J. Ind. Eng. 2012, 6, 2–25. [CrossRef]
98. Niknam, T.; Narimani, M.R.; Jabbari, M. Dynamic optimal power flow using hybrid particle swarm
optimization and simulated annealing. Int. Trans. Electr. Energy Syst. 2013, 23, 975–1001. [CrossRef]
99. Sudibyo, S.; Murat, M.N.; Aziz, N. Simulated Annealing Particle Swarm Optimization (SA-PSO): Particle
distribution study and application in Neural Wiener-based NMPC. In Proceedings of the 10th Asian Control
Conference, Kota Kinabalu, Malaysia, 31 May–3 June 2015.
100. Wang, X.; Sun, Q. The Study of K-Means Based on Hybrid SA-PSO Algorithm. In Proceedings of the 2016
9th International Symposium on Computational Intelligence and Design (ISCID), Hangzhou, China, 10–11
December 2016; pp. 211–214.
101. Javidrad, F.; Nazari, M. A new hybrid particle swarm and simulated annealing stochastic optimization
method. Appl. Soft Comput. 2017, 60, 634–654. [CrossRef]
102. Metropolis, N.; Rosenbluth, A.W.; Rosenbluth, M.N.; Teller, A.H.; Teller, E. Equations of state calculations by
fast computing machines. J. Chem. Phys. 1953, 21, 1087–1092. [CrossRef]
103. Li, P.; Cui, N.; Kong, Z.; Zhang, C. Energy management of a parallel plug-in hybrid electric vehicle based on
SA-PSO algorithm. In Proceedings of the 2017 36th Chinese Control Conference (CCC), Dalian, China, 26–28
June 2017; pp. 9220–9225.
104. Colorni, A.; Dorigo, M.; Maniezzo, V. Distributed Optimization by Ant Colonies. In Proceedings of the First European Conference on Artificial Life, Paris, France; Elsevier Publishing: Amsterdam, The Netherlands, 1991; pp. 134–142.
105. Shelokar, P.S.; Siarry, P.; Jayaraman, V.K.; Kulkarni, B.D. Particle swarm and ant colony algorithms hybridized
for improved continuous optimization. Appl. Math. Comput. 2007, 188, 129–142. [CrossRef]
106. Kaveh, A.; Talatahari, S. A particle swarm ant colony optimization for truss structures with discrete variables.
J. Constr. Steel Res. 2009, 65, 1558–1568. [CrossRef]
107. Kaveh, A.; Talatahari, S. Particle swarm optimizer, ant colony strategy and harmony search scheme
hybridized for optimization of truss structures. Comput. Struct. 2009, 87, 267–283. [CrossRef]
108. Niknam, T.; Amiri, B. An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis.
Appl. Soft Comput. 2010, 10, 183–197. [CrossRef]
109. Chen, S.M.; Chien, C. Solving the traveling salesman problem based on the genetic simulated annealing
ant colony system with particle swarm optimization techniques. Expert Syst. Appl. 2011, 38, 14439–14450.
[CrossRef]
110. Xiong, W.; Wang, C. A novel hybrid clustering based on adaptive ACO and PSO. In Proceedings of the 2011
International Conference on Computer Science and Service System (CSSS), Nanjing, China, 27–29 June 2011;
pp. 1960–1963.
111. Kıran, M.S.; Özceylan, E.; Gündüz, M.; Paksoy, T. A novel hybrid approach based on Particle Swarm
Optimization and Ant Colony Algorithm to forecast energy demand of Turkey. Energy Convers. Manag. 2012,
53, 75–83. [CrossRef]
112. Huang, C.L.; Huang, W.C.; Chang, H.Y.; Yeh, Y.C.; Tsai, C.Y. Hybridization strategies for continuous ant
colony optimization and particle swarm optimization applied to data clustering. Appl. Soft Comput. 2013, 13,
3864–3872. [CrossRef]
113. Mahi, M.; Baykan, Ö.K.; Kodaz, H. A new hybrid method based on Particle Swarm Optimization, Ant
Colony Optimization and 3-Opt algorithms for Traveling Salesman Problem. Appl. Soft Comput. 2015, 30,
484–490. [CrossRef]
114. Kefi, S.; Rokbani, N.; Krömer, P.; Alimi, A.M. A New Ant Supervised PSO Variant Applied to Traveling Salesman Problem. In Proceedings of the 15th International Conference on Hybrid Intelligent Systems (HIS), Seoul, Korea, 16–18 November 2015; pp. 87–101.
115. Lazzus, J.A.; Rivera, M.; Salfate, I.; Pulgar-Villarroel, G.; Rojas, P. Application of particle swarm+ant colony
optimization to calculate the interaction parameters on phase equilibria. J. Eng. Thermophys. 2016, 25,
216–226. [CrossRef]
116. Mandloi, M.; Bhatia, V. A low-complexity hybrid algorithm based on particle swarm and ant colony
optimization for large-MIMO detection. Expert Syst. Appl. 2016, 50, 66–74. [CrossRef]
117. Khan, I.; Maiti, M.K.; Maiti, M. Coordinating Particle Swarm Optimization, Ant Colony Optimization and K-Opt Algorithm for Traveling Salesman Problem. In Proceedings of the Mathematics and Computing: Third International Conference, ICMC 2017, Haldia, India, 17–21 January 2017; Springer: Singapore, 2017; pp. 103–119.
118. Liu, Y.; Feng, M.; Shahbazzade, S. The Container Truck Route Optimization Problem by the Hybrid PSO-ACO
Algorithm, Intelligent Computing Theories and Application. In Proceedings of the 13th International
Conference, ICIC 2017, Liverpool, UK, 7–10 August 2017; pp. 640–648.
119. Lu, J.; Hu, W.; Wang, Y.; Li, L.; Ke, P.; Zhang, K. A Hybrid Algorithm Based on Particle Swarm Optimization
and Ant Colony Optimization Algorithm, Smart Computing and Communication. In Proceedings of the
First International Conference (SmartCom 2016), Shenzhen, China, 17–19 December 2016; pp. 22–31.
120. Yang, X.S.; Deb, S. Cuckoo Search via Lévy flights. In Proceedings of the 2009 World Congress on Nature &
Biologically Inspired Computing (NaBIC), Coimbatore, India, 9–11 December 2009; pp. 210–214.
121. Ghodrati, A.; Lotfi, S. A hybrid CS/GA algorithm for global optimization. In Proceedings of the International Conference on Soft Computing for Problem Solving (SocProS 2011), Kaohsiung, Taiwan, 20–22 December 2011; pp. 397–404.
122. Nawi, N.M.; Rehman, M.Z.; Aziz, M.A.; Herawan, T.; Abawajy, J.H. Neural network training by hybrid
accelerated cuckoo particle swarm optimization algorithm. In Proceedings of the International Conference on
Neural Information Processing; Springer International Publishing: Berlin/Heidelberg, Germany, November
2014; pp. 237–244.
123. Enireddy, V.; Kumar, R.K. Improved cuckoo search with particle swarm optimization for classification of compressed images. Sadhana 2015, 40, 2271–2285. [CrossRef]
124. Ye, Z.; Wang, M.; Wang, C.; Xu, H. P2P traffic identification using support vector machine and cuckoo
search algorithm combined with particle swarm optimization algorithm. In Frontiers in Internet Technologies;
Springer: Berlin/Heidelberg, Germany, 2014; pp. 118–132.
125. Li, X.T.; Yin, M.H. A particle swarm inspired cuckoo search algorithm for real parameter optimization.
Soft Comput. 2016, 20, 1389–1413. [CrossRef]
126. Chen, J.F.; Do, Q.H.; Hsieh, H.N. Training Artificial Neural Networks by a Hybrid PSO-CS Algorithm.
Algorithms 2015, 8, 292–308. [CrossRef]
127. Guo, J.; Sun, Z.; Tang, H.; Jia, X.; Wang, S.; Yan, X.; Ye, G.; Wu, G. Hybrid Optimization Algorithm of Particle
Swarm Optimization and Cuckoo Search for Preventive Maintenance Period Optimization. Discr. Dyn.
Nat. Soc. 2016, 2016, 1516271. [CrossRef]
128. Chi, R.; Su, Y.; Zhang, D.; Chi, X.X.; Zhang, H.J. A hybridization of cuckoo search and particle swarm
optimization for solving optimization problems. Neural Comput Appl. 2017. [CrossRef]
129. Dash, J.; Dam, B.; Swain, R. Optimal design of linear phase multi-band stop filters using improved cuckoo
search particle swarm optimization. Appl. Soft Comput. 2017, 52, 435–445. [CrossRef]
130. Shi, X.; Li, Y.; Li, H.; Guan, R.; Wang, L.; Liang, Y. An integrated algorithm based on artificial bee colony
and particle swarm optimization. In Proceedings of the 2010 Sixth International Conference on Natural
Computation (ICNC), Yantai, China, 10–12 August 2010; Volume 5, pp. 2586–2590.
131. El-Abd, M. A hybrid ABC-SPSO algorithm for continuous function optimization. In Proceedings of the 2011
IEEE Symposium on Swarm Intelligence, Paris, France, 11–15 April 2011; pp. 1–6.
132. Kıran, M.S.; Gündüz, M. A recombination-based hybridization of particle swarm optimization and artificial
bee colony algorithm for continuous optimization problems. Appl. Soft Comput. 2013, 13, 2188–2203.
[CrossRef]
133. Xiang, Y.; Peng, Y.; Zhong, Y.; Chen, Z.; Lu, X.; Zhong, X. A particle swarm inspired multi-elitist artificial bee
colony algorithm for real-parameter optimization. Comput. Optim. Appl. 2014, 57, 493–516. [CrossRef]
134. Vitorino, L.N.; Ribeiro, S.F.; Bastos-Filho, C.J. A mechanism based on Artificial Bee Colony to generate
diversity in Particle Swarm Optimization. Neurocomputing 2015, 148, 39–45. [CrossRef]
135. Lin, K.; Hsieh, Y. Classification of medical datasets using SVMs with hybrid evolutionary algorithms based
on endocrine-based particle swarm optimization and artificial bee colony algorithms. J. Med. Syst. 2015, 39,
119. [CrossRef] [PubMed]
136. Zhou, F.; Yang, Y. An Improved Artificial Bee Colony Algorithm Based on Particle Swarm Optimization and
Differential Evolution. In Intelligent Computing Theories and Methodologies: 11th International Conference, ICIC
2015; Springer International Publishing: Berlin/Heidelberg, Germany, 2015; pp. 24–35.
137. Li, Z.; Wang, W.; Yan, Y.; Li, Z. PS–ABC: A hybrid algorithm based on particle swarm and artificial bee
colony for high-dimensional optimization problems. Expert Syst. Appl. 2015, 42, 8881–8895. [CrossRef]
138. Sedighizadeh, D.; Mazaheripour, H. Optimization of multi objective vehicle routing problem using a new hybrid algorithm based on particle swarm optimization and artificial bee colony algorithm considering precedence constraints. Alexandria Eng. J. 2017. [CrossRef]
139. Farmer, J.D.; Packard, N.H.; Perelson, A.S. The Immune System, Adaptation, and Machine Learning. Physica D 1986, 22, 187–204. [CrossRef]
140. Bersini, H.; Varela, F.J. Hints for adaptive problem solving gleaned from immune networks. In Parallel
Problem Solving from Nature, PPSN 1990; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg,
Germany, 1991; Volume 496.
141. Forrest, S.; Perelson, A.S.; Allen, L.; Cherukuri, R. Self-Nonself Discrimination in a Computer. In Proceedings of the 1994 IEEE Symposium on Research in Security and Privacy; IEEE Computer Society Press: Los Alamitos, CA, USA, 1994.
142. Kephart, J.O. A biologically inspired immune system for computers. In Proceedings of the Artificial Life IV:
The Fourth International Workshop on the Synthesis and Simulation of Living Systems, Cambridge, MA,
USA, 6–8 July 1994; pp. 130–139.
143. Yang, X.S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin, Germany, 2010; pp. 65–74.
144. Yang, X.S. Firefly Algorithms for Multimodal Optimization. In Stochastic Algorithms: Foundations and
Applications. SAGA 2009; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2009;
Volume 5792.
145. Krishnanand, K.N.; Ghose, D. Multimodal Function Optimization using a Glowworm Metaphor with Applications to Collective Robotics. In Proceedings of the 2nd Indian International Conference on Artificial Intelligence, Pune, India, 20–22 December 2005; pp. 328–346.
146. Zhao, F.; Li, G.; Yang, C.; Abraham, A.; Liu, H. A human–computer cooperative particle swarm optimization
based immune algorithm for layout design. Neurocomputing 2014, 132, 68–78. [CrossRef]
147. El-Sherbiny, M.M.; Alhamali, R.M. A hybrid particle swarm algorithm with artificial immune learning for
solving the fixed charge transportation problem. Comput. Ind. Eng. 2013, 64, 610–620. [CrossRef]
148. Pan, T.S.; Dao, T.K.; Nguyen, T.T.; Chu, S.C. Hybrid Particle Swarm Optimization with Bat Algorithm.
In Genetic and Evolutionary Computing; Advances in Intelligent Systems and Computing; Springer: Cham,
Switzerland, 2015; Volume 329.
149. Manoj, S.; Ranjitha, S.; Suresh, H.N. Hybrid BAT-PSO optimization techniques for image registration.
In Proceedings of the 2016 International Conference on Electrical, Electronics, and Optimization Techniques
(ICEEOT), Chennai, India, 3–5 March 2016; pp. 3590–3596.
150. Xia, X.; Gui, L.; He, G.; Xie, C.; Wei, B.; Xing, Y.; Wu, R.; Tang, Y. A hybrid optimizer based on firefly algorithm
and particle swarm optimization algorithm. J. Comput. Sci. 2018, 26, 488–500. [CrossRef]
151. Arunachalam, S.; AgnesBhomila, T.; Ramesh Babu, M. Hybrid Particle Swarm Optimization Algorithm and
Firefly Algorithm Based Combined Economic and Emission Dispatch Including Valve Point Effect. In Swarm,
Evolutionary, and Memetic Computing. SEMCCO 2014; Lecture Notes in Computer Science; Springer: Cham,
Switzerland, 2015; Volume 8947.
152. Shi, Y.; Wang, Q.; Zhang, H. Hybrid ensemble PSO-GSO algorithm. In Proceedings of the 2012 IEEE 2nd
International Conference on Cloud Computing and Intelligence Systems, Hangzhou, China, 30 October–1
November 2012; pp. 114–117.
153. Liu, H.; Zhou, F. PSO algorithm based on GSO and application in the constrained optimization. In Proceedings
of the 2nd International Conference on Computer Science and Electronics Engineering (ICCSEE 2013), Advances in
Intelligent Systems Research, AISR; Atlantis Press: Paris, France, 2013; Volume 34, ISSN 1951-6851.
154. Gies, D.; Rahmat-Samii, Y. Reconfigurable array design using parallel particle swarm optimization.
In Proceedings of the Antennas and Propagation Society International Symposium, Columbus, OH, USA,
22–27 June 2003.
155. Schutte, J.F.; Reinbolt, J.A.; Fregly, B.J.; Haftka, R.T.; George, A.D. Parallel Global Optimization with the
Particle Swarm Algorithm. Int. J. Numer. Meth. Eng. 2004, 61, 2296–2315. [CrossRef] [PubMed]
156. Venter, G.; Sobieszczanski-Sobieski, J. A parallel particle swarm optimization algorithm accelerated by
asynchronous evaluations. In Proceedings of the 6th World Congresses of Structural and Multidisciplinary
Optimization, Rio de Janeiro, Brazil, 30 May–3 June 2005.
157. Chang, J.-F.; Chu, S.-C.; Roddick, J.F.; Pan, J.S. A parallel particle swarm optimization algorithm with
communication strategies. J. Inf. Sci. Eng. 2005, 21, 809–818.
158. Waintraub, M.; Schirru, R.; Pereira, C.M.N.A. Multiprocessor modeling of parallel Particle Swarm
Optimization applied to nuclear engineering problems. Prog. Nucl. Energy 2009, 51, 680–688. [CrossRef]
159. Rymut, B.; Kwolek, B. GPU-supported object tracking using adaptive appearance models and Particle Swarm
Optimization. In Proceedings of the 2010 International Conference on Computer Vision and Graphics: Part II,
ICCVG’10; Springer-Verlag: Berlin/Heidelberg, Germany, 2010; pp. 227–234.
160. Gordon, N.J.; Salmond, D.J.; Smith, A.F.M. Novel Approach to Nonlinear/Non-Gaussian Bayesian State
Estimation. IEE Proc. F Radar Signal Process. 1993, 140, 107–113. [CrossRef]
161. Zhang, J.; Pan, T.-S.; Pan, J.-S. A parallel hybrid evolutionary particle filter for nonlinear state estimation.
In Proceedings of the 2011 First International Conference on Robot, Vision and Signal Processing, Kaohsiung,
Taiwan, 21–23 November 2011; pp. 308–312.
162. Chen, R.-B.; Hsu, Y.-W.; Hung, Y.; Wang, W. Discrete particle swarm optimization for constructing uniform
design on irregular regions. Comput. Stat. Data Anal. 2014, 72, 282–297. [CrossRef]
163. Awwad, O.; Al-Fuqaha, A.; Ben Brahim, G.; Khan, B.; Rayes, A. Distributed topology control in large-scale
hybrid RF/FSO networks: SIMT GPU-based particle swarm optimization approach. Int. J. Commun. Syst.
2013, 26, 888–911. [CrossRef]
164. Qu, J.; Liu, X.; Sun, M.; Qi, F. GPU-Based Parallel Particle Swarm Optimization Methods for Graph Drawing.
Discr. Dyn. Nat. Soc. 2017, 2017, 2013673. [CrossRef]
165. Zhou, Y.; Tan, Y. GPU-based parallel particle swarm optimization. In Proceedings of the IEEE Congress on
Evolutionary Computation (CEC 2009), Trondheim, Norway, 18–21 May 2009; pp. 1493–1500.
166. Mussi, L.; Daolio, F.; Cagnoni, S. Evaluation of parallel particle swarm optimization algorithms within the
CUDA™ architecture. Inf. Sci. 2011, 181, 4642–4657. [CrossRef]
167. Parsopoulos, K.E.; Plagianakos, V.P.; Magoulas, G.D.; Vrahatis, M.N. Improving particle swarm optimizer by
function “stretching”. Nonconvex Optim. Appl. 2001, 54, 445–457.
168. Parsopoulos, K.E.; Plagianakos, V.P.; Magoulas, G.D.; Vrahatis, M.N. Stretching technique for obtaining
global minimizers through particle swarm optimization. In Proceedings of the Workshop on Particle Swarm
Optimization, Indianapolis, IN, USA, 6–7 April 2001; pp. 22–29.
169. Parsopoulos, K.E.; Vrahatis, M.N. Modification of the particle swarm optimizer for locating all the global
minima. In Artificial Neural Networks and Genetic Algorithms; Computer Science Series; Springer: Wien,
Germany, 2001; pp. 324–327.
170. Parsopoulos, K.E.; Vrahatis, M.N. On the computation of all global minimizers through particle swarm
optimization. IEEE Trans. Evol. Comput. 2004, 8, 211–224. [CrossRef]
171. Brits, R.; Engelbrecht, A.P.; van den Bergh, F. Solving systems of unconstrained equations using particle
swarm optimization. In Proceedings of the IEEE 2002 Conference on Systems, Man, and Cybernetics,
Yasmine Hammamet, Tunisia, 6–9 October 2002.
172. Brits, R.; Engelbrecht, A.P.; van den Bergh, F. A niching particle swarm optimizer. In Proceedings of the 4th
Asia-Pacific Conference on Simulated Evolution and Learning (SEAL’02), Singapore, 18–22 November 2002;
Volume 2, pp. 692–696.
173. Li, X. Adaptively choosing neighbourhood bests using species in a particle swarm optimizer for multimodal
function optimization. In GECCO 2004. LNCS; Springer: Heidelberg, Germany, 2004; Volume 3102,
pp. 105–116.
174. Bird, S. Adaptive Techniques for Enhancing the Robustness and Performance of Speciated PSOs in Multimodal Environments. Ph.D. Thesis, RMIT University, Melbourne, Australia, 2008.
175. Bird, S.; Li, X. Adaptively choosing niching parameters in a PSO. In Proceedings of the Genetic and
Evolutionary Computation Conference, GECCO 2006, Seattle, WA, USA, 8–12 July 2006; Cattolico, M., Ed.;
ACM: New York, NY, USA, 2006; pp. 3–10.
176. Li, X. Multimodal function optimization based on fitness-euclidean distance ratio. In Proceedings of the
Genetic and Evolutionary Computation Conference (GECCO 2007), London, UK, 7–11 July 2007; pp. 78–85.
177. Kennedy, J. Stereotyping: Improving particle swarm performance with cluster analysis. In Proceedings of
the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512), La Jolla, CA, USA, 16–19 July
2000; pp. 303–308.
178. Passaro, A.; Starita, A. Particle swarm optimization for multimodal functions: A clustering approach. J. Artif.
Evol. Appl. 2008, 1–15. [CrossRef]
179. Schwarz, G. Estimating the dimension of a model. Ann. Stat. 1978, 6, 461–464. [CrossRef]
180. Blackwell, T.M.; Branke, J. Multi-swarm optimization in dynamic environments. In EvoWorkshops 2004.
LNCS; Raidl, G.R., Cagnoni, S., Branke, J., Corne, D.W., Drechsler, R., Jin, Y., Johnson, C.G., Machado, P.,
Marchiori, E., Rothlauf, F., et al., Eds.; Springer: Heidelberg, Germany, 2004; Volume 3005, pp. 489–500.
181. Bird, S.; Li, X. Using regression to improve local convergence. In Proceedings of the 2007 IEEE Congress on
Evolutionary Computation, Singapore, 25–28 September 2007; pp. 1555–1562.
182. Parrott, D.; Li, X. Locating and tracking multiple dynamic optima by a particle swarm model using speciation.
IEEE Trans. Evol. Comput. 2006, 10, 440–458. [CrossRef]
183. Li, X. Niching without niching parameters: Particle swarm optimization using a ring topology. IEEE Trans.
Evol. Comput. 2010, 14, 150–169. [CrossRef]
184. Afshinmanesh, F.; Marandi, A.; Rahimi-Kian, A. A novel binary particle swarm optimization method
using artificial immune system. In Proceedings of the EUROCON 2005—The International Conference on
“Computer as a Tool”, Belgrade, Serbia, 21–24 November 2005; pp. 217–220.
185. Deligkaris, K.V.; Zaharis, Z.D.; Kampitaki, D.G.; Goudos, S.K.; Rekanos, I.T.; Spasos, M.N. Thinned planar
array design using Boolean PSO with velocity mutation. IEEE Trans. Magn. 2009, 45, 1490–1493. [CrossRef]
186. Chen, W.; Zhang, J.; Chung, H.; Zhong, W.; Wu, W.; Shi, Y. A novel set-based particle swarm optimization
method for discrete optimization problems. IEEE Trans. Evol. Comput. 2010, 14, 278–300. [CrossRef]
187. Gong, Y.; Zhang, J.; Liu, O.; Huang, R.; Chung, H.; Shi, Y. Optimizing vehicle routing problem with time windows: A discrete particle swarm optimization approach. IEEE Trans. Syst. Man Cybern. Part C 2012, 42, 254–267. [CrossRef]
188. Solomon, M. Algorithms for the vehicle routing and scheduling problems with time window constraints.
Oper. Res. 1987, 35, 254–265. [CrossRef]
189. Kitayama, S.; Arakawa, M.; Yamazaki, K. Penalty function approach for the mixed discrete nonlinear
problems by particle swarm optimization. Struct. Multidiscip. Optim. 2006, 32, 191–202. [CrossRef]
190. Nema, S.; Goulermas, J.; Sparrow, G.; Cook, P. A hybrid particle swarm branch-and-bound (HPB) optimizer
for mixed discrete nonlinear programming. IEEE Trans. Syst. Man Cybern. Part A 2008, 38, 1411–1424.
[CrossRef]
191. Sun, C.; Zeng, J.; Pan, J.; Zhang, Y. PSO with Constraint-Preserving Mechanism for Mixed-Variable
Optimization Problems. In Proceedings of the 2011 First International Conference on Robot, Vision and
Signal Processing, Kaohsiung, Taiwan, 21–23 November 2011; pp. 149–153.
192. Chowdhury, S.; Zhang, J.; Messac, A. Avoiding premature convergence in a mixed-discrete particle swarm
optimization (MDPSO) algorithm. In Proceedings of the 53rd AIAA/ASME/ASCE/AHS/ASC Structures,
Structural Dynamics, and Materials Conference, Honolulu, HI, USA, 23–26 April 2012. No. AIAA 2012-1678.
193. Laskari, E.; Parsopoulos, K.; Vrahatis, M. Particle swarm optimization for integer programming.
In Proceedings of the IEEE Congress on Evolutionary Computation. CEC’02 (Cat. No.02TH8600), Honolulu,
HI, USA, 12–17 May 2002; Volume 2, pp. 1582–1587.
194. Yare, Y.; Venayagamoorthy, G.K. Optimal Scheduling of Generator Maintenance Using Modified Discrete
Particle Swarm Optimization. In Proceedings of the Symposium on Bulk Power System Dynamics and
Control—VII. Revitalizing Operational Reliability, 2007 iREP, Institute of Electrical and Electronics Engineers
(IEEE), Charleston, SC, USA, 19–24 August 2007.
195. Eajal, A.A.; El-Hawary, M.E. Optimal capacitor placement and sizing in unbalanced distribution systems
with harmonics consideration using particle swarm optimization. IEEE Trans. Power Del. 2010, 25, 1734–1741.
[CrossRef]
196. Phung, M.D.; Quach, C.H.; Dinh, T.H.; Ha, Q. Enhanced discrete particle swarm optimization path planning
for UAV vision-based surface inspection. Autom. Constr. 2017, 81, 25–33. [CrossRef]
197. Gong, M.G.; Yan, J.N.; Shen, B.; Ma, L.J.; Cai, Q. Influence maximization in social networks based on discrete particle swarm optimization. Inf. Sci. 2016, 367–368, 600–614. [CrossRef]
198. Aminbakhsh, S.; Sonmez, R. Discrete particle swarm optimization method for the large-scale discrete
time–cost trade-off problem. Expert Syst. Appl. 2016, 51, 177–185. [CrossRef]
199. Li, L.; Jiao, L.; Zhao, J.; Shang, R.; Gong, M. Quantum-behaved discrete multi-objective particle swarm optimization for complex network clustering. Pattern Recognit. 2017, 63, 1–14. [CrossRef]
200. Girvan, M.; Newman, M.E.J. Community structure in social and biological networks. Proc. Natl. Acad. Sci.
USA 2002, 99, 7821–7826. [CrossRef] [PubMed]
201. Ates, A.; Alagoz, B.B.; Kavuran, G.; Yeroglu, C. Implementation of fractional order filters discretized
by modified Fractional Order Darwinian Particle Swarm Optimization. Measurement 2017, 107, 153–164.
[CrossRef]
202. Du, W.; Li, B. Multi-strategy ensemble particle swarm optimization for dynamic optimization. Inf. Sci. 2008,
178, 3096–3109. [CrossRef]
203. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1,
67–82. [CrossRef]
204. Engelbrecht, A.P. Heterogeneous particle swarm optimization. In Swarm Intelligence; Springer:
Berlin/Heidelberg, Germany, 2010; pp. 191–202.
205. Lynn, N.; Suganthan, P.N. Ensemble particle swarm optimizer. Appl. Soft Comput. 2017, 55, 533–548.
[CrossRef]
206. Shirazi, M.Z.; Pamulapati, T.; Mallipeddi, R.; Veluvolu, K.C. Particle Swarm Optimization with Ensemble of
Inertia Weight Strategies. In Advances in Swarm Intelligence. ICSI 2017; Lecture Notes in Computer Science;
Springer: Cham, Switzerland, 2017; Volume 10385.
207. Lynn, N.; Suganthan, P.N. Heterogeneous comprehensive learning particle swarm optimization with
enhanced exploration and exploitation. Swarm Evol. Comput. 2015, 24, 11–24. [CrossRef]
208. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [CrossRef]
209. Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179,
2232–2248. [CrossRef]
210. Mirjalili, S.; Hashim, S.Z.M. A new hybrid PSOGSA algorithm for function optimization. In Proceedings of
the 2010 International Conference on Computer and Information Application, Tianjin, China, 3–5 December
2010; pp. 374–377.
211. Sergeyev, Y.D.; Kvasov, D.E.; Mukhametzhanov, M.S. On the efficiency of nature-inspired metaheuristics in
expensive global optimization with limited budget. Sci. Rep. 2018, 8. [CrossRef] [PubMed]
212. Kvasov, D.E.; Mukhametzhanov, M.S. Metaheuristic vs. deterministic global optimization algorithms: The univariate case. Appl. Math. Comput. 2018, 318, 245–259. [CrossRef]
213. Kvasov, D.E.; Mukhametzhanov, M.S. One-dimensional global search: Nature-inspired vs. Lipschitz methods. AIP Conf. Proc. 2016, 1738, 400012.
214. Gaviano, M.; Kvasov, D.E.; Lera, D.; Sergeyev, Y.D. Algorithm 829: Software for generation of classes of test
functions with known local and global minima for global optimization. ACM Trans. Math. Softw. 2003, 29,
469–480. [CrossRef]
215. Sergeyev, Y.D.; Kvasov, D.E. Global search based on efficient diagonal partitions and a set of Lipschitz
constants. SIAM J. Optim. 2006, 16, 910–937. [CrossRef]
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).