Abstract
Grammatical Swarm (GS) is an evolutionary computation technique designed to find a function, program, or program segment that satisfies a design objective. Candidate solutions are encoded as bit-strings, and translation rules map each bit-string into a function or program. The swarm of particles is evolved by Particle Swarm Optimization (PSO) in order to find better solutions. The aim of this study is to improve the convergence properties of GS by replacing the traditional PSO in GS with other PSO variants: Particle Swarm Optimization with constriction factor, Union of Global and Local Particle Swarm Optimizations, Comprehensive Learning Particle Swarm Optimization, Particle Swarm Optimization with Second Global best Particle, and Particle Swarm Optimization with Second Personal best Particle. The improved GS algorithms are therefore named Grammatical Swarm with constriction factor (GS-cf), Union of Global and Local Grammatical Swarm (UGS), Comprehensive Learning Grammatical Swarm (CLGS), Grammatical Swarm with Second Global best Particle (SG-GS), and Grammatical Swarm with Second Personal best Particle (SP-GS), respectively. A symbolic regression problem is considered as the numerical example, and the original GS is compared with the other algorithms. The effect of the model parameters on the convergence properties of the algorithms is examined in preliminary experiments. Except for CLGS and UGS, the improved algorithms converge faster than the original GS; in particular, GS-cf and SP-GS show the fastest convergence.
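To make the mechanism concrete, the sketch below illustrates the two ideas the abstract combines: the standard GE/GS genotype-to-phenotype mapping (each integer codon selects a grammar production via `codon % number_of_productions`) and a PSO update with the Clerc-Kennedy constriction factor driving the swarm, applied to a toy symbolic regression target. The grammar, the target function `x*x + x`, and all parameter values here are illustrative assumptions, not the paper's exact experimental setup.

```python
import random

# Example BNF-style grammar for symbolic regression (an assumption for
# illustration, not the grammar used in the paper).
GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["x"], ["1.0"]],
    "<op>": [["+"], ["-"], ["*"]],
}

def decode(codons, start="<expr>", max_wraps=2):
    """Genotype-to-phenotype mapping: choice = codon % len(productions)."""
    symbols, out = [start], []
    i = wraps = 0
    while symbols:
        sym = symbols.pop(0)
        if sym not in GRAMMAR:          # terminal symbol
            out.append(sym)
            continue
        if i >= len(codons):            # codon wrapping, as in standard GE
            if wraps >= max_wraps:
                return None             # incomplete mapping
            i, wraps = 0, wraps + 1
        rules = GRAMMAR[sym]
        symbols = rules[codons[i] % len(rules)] + symbols
        i += 1
    return " ".join(out)

def fitness(expr, points=(-1.0, -0.5, 0.0, 0.5, 1.0)):
    """Sum of squared errors against the example target x*x + x."""
    if expr is None:
        return float("inf")
    err = 0.0
    for x in points:
        err += (eval(expr) - (x * x + x)) ** 2
    return err

def gs_cf(n_particles=20, dim=16, iters=60, seed=0):
    """GS driven by PSO with constriction factor (chi = 0.7298, c1 = c2 = 2.05)."""
    rng = random.Random(seed)
    chi, c1, c2 = 0.7298, 2.05, 2.05
    X = [[rng.uniform(0, 255) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                                   # personal bests
    pf = [fitness(decode([int(v) % 256 for v in x])) for x in X]
    g = min(range(n_particles), key=lambda i: pf[i])
    G, gf = P[g][:], pf[g]                                  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Constriction-factor velocity update (Clerc and Kennedy).
                V[i][d] = chi * (V[i][d]
                                 + c1 * r1 * (P[i][d] - X[i][d])
                                 + c2 * r2 * (G[d] - X[i][d]))
                X[i][d] = min(255.0, max(0.0, X[i][d] + V[i][d]))
            f = fitness(decode([int(v) % 256 for v in X[i]]))
            if f < pf[i]:
                P[i], pf[i] = X[i][:], f
                if f < gf:
                    G, gf = X[i][:], f
    return decode([int(v) % 256 for v in G]), gf
```

The real-valued particle positions are truncated to integer codons before decoding, so the PSO variants differ only in the velocity update; swapping in the other updates (UGS, CLGS, SG-GS, SP-GS) changes only the inner loop.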
© 2017 Springer International Publishing AG
Cite this paper
Kita, E., Yamamoto, R., Sugiura, H., Zuo, Y. (2017). Application of Grammatical Swarm to Symbolic Regression Problem. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, ES. (eds) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science(), vol 10637. Springer, Cham. https://doi.org/10.1007/978-3-319-70093-9_37
DOI: https://doi.org/10.1007/978-3-319-70093-9_37
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-70092-2
Online ISBN: 978-3-319-70093-9
eBook Packages: Computer Science (R0)