Abstract
The paper introduces a novel 2-Stage model for multi-classifier systems. Instead of gathering the posterior probabilities produced by the base classifiers into a single dataset, called meta-data or Level1 data, as in the original 2-Stage model, we separate the data into K Level1 matrices corresponding to the K base classifiers. These matrices are then classified in sequence by a new classifier at the second stage, whose outputs form the Level2 data. Next, a Weight Matrix algorithm is proposed to combine the Level2 data and produce predictions for unlabeled observations. Experimental results on the CLEF2009 medical image database demonstrate the benefit of our model in comparison with several existing ensemble learning models.
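The pipeline can be made concrete with a short sketch. The code below, written against scikit-learn-style classifiers on a stand-in dataset, shows the shape of the computation: K base classifiers each yield their own Level1 matrix of posterior probabilities, a second-stage classifier is trained on each matrix to produce Level2 data, and the K Level2 outputs are combined. The paper's actual Weight Matrix algorithm is not specified in this abstract, so a hypothetical uniform weighting stands in for it here; the dataset, classifiers, and weights are all illustrative assumptions, not the authors' experimental setup.

```python
# Minimal sketch of the 2-Stage pipeline described in the abstract.
# Assumptions (not from the paper): the digits dataset, these three base
# classifiers, logistic regression as the second-stage classifier, and a
# uniform weighting as a placeholder for the Weight Matrix algorithm.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base_classifiers = [GaussianNB(),
                    KNeighborsClassifier(),
                    LogisticRegression(max_iter=2000)]

# Stage 1: one Level1 matrix of posterior probabilities per base classifier,
# kept separate rather than concatenated into a single meta-dataset.
# Out-of-fold predictions are used on the training set to avoid leakage.
level1_train = [cross_val_predict(clf, X_train, y_train, method="predict_proba")
                for clf in base_classifiers]
level1_test = [clf.fit(X_train, y_train).predict_proba(X_test)
               for clf in base_classifiers]

# Stage 2: a new classifier is trained on each Level1 matrix in sequence;
# its posterior outputs form the Level2 data.
level2_test = []
for L1_tr, L1_te in zip(level1_train, level1_test):
    meta = LogisticRegression(max_iter=2000).fit(L1_tr, y_train)
    level2_test.append(meta.predict_proba(L1_te))

# Combination: placeholder uniform weights over the K Level2 matrices,
# standing in for the paper's (unspecified here) Weight Matrix algorithm.
weights = np.ones(len(level2_test)) / len(level2_test)
combined = sum(w * P for w, P in zip(weights, level2_test))
y_pred = combined.argmax(axis=1)
print("accuracy of the sketched ensemble:", (y_pred == y_test).mean())
```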
Copyright information
© 2014 Springer-Verlag Berlin Heidelberg
Cite this paper
Nguyen, T.T., Liew, A.W.C., Tran, M.T., Nguyen, T.T.T., Nguyen, M.P. (2014). Fusion of Classifiers Based on a Novel 2-Stage Model. In: Wang, X., Pedrycz, W., Chan, P., He, Q. (eds) Machine Learning and Cybernetics. ICMLC 2014. Communications in Computer and Information Science, vol 481. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-45652-1_7
Print ISBN: 978-3-662-45651-4
Online ISBN: 978-3-662-45652-1