DOI: 10.1145/1569901.1569910

Particle swarm optimization based multi-prototype ensembles

Published: 08 July 2009

Abstract

This paper proposes and evaluates a Particle Swarm Optimization (PSO) based ensemble classifier. The members of the ensemble are Nearest Prototype Classifiers generated sequentially using PSO and combined by a majority voting mechanism. Two necessary requirements for good ensemble performance are accuracy and diversity of error. Accuracy is achieved by having PSO minimize a fitness function representing the error rate as each member is created. Diversity of error is promoted by using a different PSO initialization each time a new member is created, and by adopting decorrelated training, where a penalty term is added to the fitness function to penalize particles that make the same errors as previously generated classifiers. Simulation experiments on different classification problems show that the ensemble performs better than a single classifier and that the proposed techniques are effective in generating diverse ensemble members.
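The procedure described in the abstract (sequential PSO training of nearest-prototype members, an error-rate fitness with a decorrelation penalty against earlier members' mistakes, and majority voting) can be sketched as follows. This is a minimal reconstruction from the abstract, not the authors' implementation: the swarm parameters (`w`, `c1`, `c2`, swarm size), the penalty weight `lam`, and the exact form of the penalty (mean overlap with each previous member's error mask) are all assumptions.

```python
import numpy as np

def predict_nearest_prototype(prototypes, proto_labels, X):
    """Assign each sample the label of its nearest prototype."""
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return proto_labels[np.argmin(d, axis=1)]

def fitness(prototypes, proto_labels, X, y, prev_error_masks, lam):
    """Error rate plus a decorrelation penalty for repeating earlier members' errors."""
    errors = predict_nearest_prototype(prototypes, proto_labels, X) != y
    penalty = sum(np.mean(errors & mask) for mask in prev_error_masks)
    return errors.mean() + lam * penalty

def train_member_pso(X, y, n_protos_per_class, prev_error_masks, lam=0.5,
                     n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, rng=None):
    """Train one nearest-prototype member: each particle encodes all prototype coordinates."""
    rng = np.random.default_rng() if rng is None else rng
    classes = np.unique(y)
    proto_labels = np.repeat(classes, n_protos_per_class)
    n_protos, n_feat = len(proto_labels), X.shape[1]
    lo, hi = X.min(axis=0), X.max(axis=0)
    pos = rng.uniform(lo, hi, size=(n_particles, n_protos, n_feat))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([fitness(p, proto_labels, X, y, prev_error_masks, lam) for p in pos])
    g, gf = pbest[np.argmin(pbest_f)].copy(), pbest_f.min()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = pos + vel
        f = np.array([fitness(p, proto_labels, X, y, prev_error_masks, lam) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        if f.min() < gf:
            g, gf = pos[np.argmin(f)].copy(), f.min()
    return g, proto_labels

def build_ensemble(X, y, n_members=5, n_protos_per_class=2, lam=0.5, seed=0):
    """Generate members sequentially; each new PSO run sees earlier members' error masks."""
    rng = np.random.default_rng(seed)
    members, masks = [], []
    for _ in range(n_members):
        protos, labels = train_member_pso(X, y, n_protos_per_class, masks, lam, rng=rng)
        members.append((protos, labels))
        masks.append(predict_nearest_prototype(protos, labels, X) != y)
    return members

def ensemble_predict(members, X):
    """Combine member predictions by majority vote."""
    votes = np.stack([predict_nearest_prototype(p, l, X) for p, l in members])
    return np.array([np.bincount(v).argmax() for v in votes.T])
```

Each call to `train_member_pso` draws a fresh random swarm initialization from the shared generator, which is one of the two diversity mechanisms the abstract names; the other is the `lam`-weighted penalty, which only activates where a previous member actually erred.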


Cited By

  • (2022) Smart ensemble machine learner with hyperparameter-free for predicting bond capacity of FRP-to-concrete interface: Multi-national data. Construction and Building Materials, 345:128158. DOI: 10.1016/j.conbuildmat.2022.128158. Online publication date: Aug-2022.
  • (2019) A novel ensemble algorithm for biomedical classification based on Ant Colony Optimization. Applied Soft Computing, 11(8):5674-5683. DOI: 10.1016/j.asoc.2011.03.025. Online publication date: 21-Nov-2019.
  • (2011) A PSO algorithm for improving multi-view classification. 2011 IEEE Congress of Evolutionary Computation (CEC), pages 925-932. DOI: 10.1109/CEC.2011.5949717. Online publication date: Jun-2011.

Published In

GECCO '09: Proceedings of the 11th Annual Conference on Genetic and Evolutionary Computation
July 2009, 2036 pages
ISBN: 9781605583259
DOI: 10.1145/1569901
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. classification
  2. ensemble
  3. particle swarm optimization

Qualifiers

  • Research-article

Conference

GECCO '09: Genetic and Evolutionary Computation Conference
July 8-12, 2009
Montréal, Québec, Canada

Acceptance Rates

Overall acceptance rate: 1,669 of 4,410 submissions (38%)

