Abstract
In this paper we present an approach for generating an ensemble of classifiers using non-uniform layered clustering. In the proposed approach the dataset is partitioned into a variable number of clusters at each layer, and a set of base classifiers is trained on the clusters of each layer. The decision on a pattern at a given layer is obtained from the classifier trained on the pattern's nearest cluster, and the decisions from the different layers are fused by majority voting to produce the final verdict. The proposed approach also provides a mechanism, based on a Genetic Algorithm, for finding the optimal number of layers and the optimal number of clusters per layer. Clustering isolates difficult-to-classify patterns, while the non-uniform layered clustering introduces diversity among the base classifiers at different layers. Experimental results show that the proposed method performs better than other state-of-the-art ensemble classifier generation methods.
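The layered ensemble described in the abstract — cluster each layer, train one base classifier per cluster, route each test pattern to its nearest cluster, and fuse the per-layer decisions by majority vote — can be sketched as below. This is a minimal illustration under assumptions, not the authors' implementation: the GA-based optimization of the layer and cluster counts is omitted, k-means and decision trees are assumed as the clustering method and base classifier, and `LayeredClusterEnsemble` with its `clusters_per_layer` parameter is a hypothetical name.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier


class LayeredClusterEnsemble:
    """Sketch of a layered clustering ensemble (GA optimization omitted)."""

    def __init__(self, clusters_per_layer=(2, 3, 4), random_state=0):
        # Non-uniform layering: a different number of clusters at each layer.
        self.clusters_per_layer = clusters_per_layer
        self.random_state = random_state

    def fit(self, X, y):
        self.layers_ = []
        for k in self.clusters_per_layer:
            # Partition the training data into k clusters for this layer.
            km = KMeans(n_clusters=k, n_init=10,
                        random_state=self.random_state).fit(X)
            classifiers = {}
            for c in range(k):
                mask = km.labels_ == c
                # One base classifier per cluster, trained only on
                # the patterns that fall in that cluster.
                clf = DecisionTreeClassifier(random_state=self.random_state)
                clf.fit(X[mask], y[mask])
                classifiers[c] = clf
            self.layers_.append((km, classifiers))
        return self

    def predict(self, X):
        layer_votes = []
        for km, classifiers in self.layers_:
            # Route each pattern to its nearest cluster centroid.
            nearest = km.predict(X)
            preds = np.empty(len(X), dtype=int)
            for c, clf in classifiers.items():
                m = nearest == c
                if m.any():
                    preds[m] = clf.predict(X[m])
            layer_votes.append(preds)
        votes = np.stack(layer_votes)
        # Majority voting across layers for the final verdict.
        return np.apply_along_axis(
            lambda col: np.bincount(col).argmax(), 0, votes)
```

Routing via `km.predict` reuses the layer's fitted centroids, so at prediction time each pattern is handled by the classifier that specialized on its region of the input space, mirroring the nearest-cluster decision rule described above.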
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
Cite this paper
Rahman, A., Verma, B., Yao, X. (2010). Non–uniform Layered Clustering for Ensemble Classifier Generation and Optimality. In: Wong, K.W., Mendis, B.S.U., Bouzerdoum, A. (eds) Neural Information Processing. Theory and Algorithms. ICONIP 2010. Lecture Notes in Computer Science, vol 6443. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-17537-4_67
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-17536-7
Online ISBN: 978-3-642-17537-4