Abstract
Although ordering-based pruning algorithms are relatively efficient, there remains room for improvement. To this end, this paper combines a dynamic programming technique with the ensemble-pruning problem. We incorporate dynamic programming into the classical ordering-based ensemble-pruning algorithm with the complementariness measure (ComEP) and, with the help of two auxiliary tables, propose an efficient dynamic form, which we refer to as ComDPEP. To examine the performance of the proposed algorithm, we conduct a series of simulations on four benchmark classification datasets. The experimental results demonstrate the significantly higher efficiency of ComDPEP over the classic ComEP algorithm. The proposed ComDPEP algorithm also outperforms two other state-of-the-art ordering-based ensemble-pruning algorithms, which use uncertainty weighted accuracy and reduce-error pruning, respectively, as their measures. Notably, the effectiveness of ComDPEP is identical to that of the classical ComEP algorithm; only its efficiency differs.
Acknowledgments
This work is supported by the National Natural Science Foundation of China under Grant No. 61473150.
Ethics declarations
Conflict of Interest
The authors declare that they have no conflict of interest.
Informed consent
Informed consent was obtained from all individual participants included in the study.
Additional informed consent was obtained from all individual participants for whom identifying information is included in this article.
Additional information
Research involving Human Participants and/or Animals
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
For this type of study, formal consent is not required.
All applicable international, national, and/or institutional guidelines for the care and use of animals were followed.
Appendices
Appendix A: Formal procedure of the ComEP algorithm
The formal procedure of the ComEP algorithm is as follows.
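The formal procedure itself is not reproduced in this excerpt. As an illustration only, the following is a minimal Python sketch of ordering-based pruning with the complementariness measure as it is commonly defined in the ordering-based pruning literature: at each step, greedily append the classifier that is correct on the most validation examples on which the current sub-ensemble's majority vote is wrong. The function names (`com_ep`, `majority_vote`, `complementariness`) and the choice to seed the ordering with the individually most accurate classifier are our assumptions, not the paper's exact pseudocode.

```python
import numpy as np

def complementariness(candidate_preds, ensemble_preds, y):
    """Number of validation examples on which the candidate is correct
    while the current sub-ensemble is wrong."""
    return int(np.sum((candidate_preds == y) & (ensemble_preds != y)))

def majority_vote(pred_matrix):
    """Majority vote over the rows of a (n_classifiers, n_examples)
    integer label matrix; ties go to the smaller class label."""
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(),
                               0, pred_matrix)

def com_ep(preds, y, target_size):
    """Greedy ordering-based pruning with the complementariness measure.
    preds: (T, N) array, preds[t, i] = label classifier t assigns to
    validation example i. Returns selected indices in selection order."""
    T = preds.shape[0]
    # Assumption: seed with the individually most accurate classifier.
    selected = [int(np.argmax((preds == y).sum(axis=1)))]
    remaining = set(range(T)) - set(selected)
    while len(selected) < target_size and remaining:
        # Note: the vote is re-tallied from scratch at every step,
        # costing O(|selected| * N) -- the inefficiency ComDPEP removes.
        ens = majority_vote(preds[selected])
        best = max(remaining,
                   key=lambda t: complementariness(preds[t], ens, y))
        selected.append(best)
        remaining.remove(best)
    return selected
```

The full vote re-tally inside the loop is what makes the classical formulation costly; the dynamic version in Appendix C replaces it with an incremental update.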
Appendix B: Detailed computational rules for Table 2
The detailed computational rules for Table 2, viz. ClassVoteCounts, are as follows.
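The table's computational rules are not included in this excerpt. Based on the table's name, ClassVoteCounts presumably stores, for each validation example, the number of votes each class has received from the currently selected classifiers, so that adding one classifier costs a single O(N) pass rather than a full re-tally. The helper names below are hypothetical; this is a sketch of that bookkeeping, not the paper's exact rules.

```python
import numpy as np

def init_vote_counts(n_examples, n_classes):
    # ClassVoteCounts[i, c]: votes example i has received for class c
    # from the classifiers selected so far.
    return np.zeros((n_examples, n_classes), dtype=int)

def add_classifier(vote_counts, preds):
    """Incremental update when one classifier joins the sub-ensemble:
    exactly one vote per example, O(N) per step."""
    vote_counts[np.arange(len(preds)), preds] += 1
    return vote_counts

def current_prediction(vote_counts):
    """Majority-vote label per example; ties go to the smaller label."""
    return vote_counts.argmax(axis=1)
```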
Appendix C: Formal procedure of the ComDPEP algorithm
The formal procedure of the ComDPEP algorithm is as follows.
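The formal procedure is likewise absent from this excerpt. The sketch below shows how the dynamic-programming idea described in the abstract could work: two cached tables (here, a per-classifier correctness table and the ClassVoteCounts table; the exact pair of auxiliary tables is our assumption) let each selection step update the ensemble prediction in O(N) instead of re-tallying all selected votes, while producing exactly the same ordering as the classical formulation.

```python
import numpy as np

def com_dp_ep(preds, y, target_size):
    """Dynamic variant of complementariness-based ordering-based pruning.
    preds: (T, N) integer label matrix. Returns selected indices in
    selection order; the ordering matches the classical greedy procedure,
    only the bookkeeping differs."""
    T, N = preds.shape
    n_classes = int(preds.max()) + 1
    correct = (preds == y)                       # table 1: per-classifier hits
    vote_counts = np.zeros((N, n_classes), int)  # table 2: ClassVoteCounts
    # Assumption: seed with the individually most accurate classifier.
    first = int(np.argmax(correct.sum(axis=1)))
    selected, remaining = [first], set(range(T)) - {first}
    vote_counts[np.arange(N), preds[first]] += 1
    ens = vote_counts.argmax(axis=1)
    while len(selected) < target_size and remaining:
        ens_wrong = ens != y
        # Complementariness of every candidate from the cached tables.
        best = max(remaining,
                   key=lambda t: int((correct[t] & ens_wrong).sum()))
        selected.append(best)
        remaining.remove(best)
        vote_counts[np.arange(N), preds[best]] += 1  # O(N) incremental update
        ens = vote_counts.argmax(axis=1)
    return selected
```

On the same inputs this returns the same ordering as the classical greedy sketch, consistent with the abstract's claim that ComDPEP matches ComEP in effectiveness while improving efficiency.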
Cite this article
Dai, Q., Han, X. An efficient ordering-based ensemble pruning algorithm via dynamic programming. Appl Intell 44, 816–830 (2016). https://doi.org/10.1007/s10489-015-0729-z