
Fusion of Classifiers Based on a Novel 2-Stage Model

  • Conference paper
Machine Learning and Cybernetics (ICMLC 2014)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 481)


Abstract

This paper introduces a novel 2-Stage model for multi-classifier systems. Instead of gathering the posterior probabilities produced by the base classifiers into a single dataset, called meta-data or Level1 data as in the original 2-Stage model, we separate the data into K Level1 matrices, one for each of the K base classifiers. These matrices are then classified in sequence by a new classifier at the second stage, and the outputs of that classifier are called Level2 data. Next, a Weight Matrix algorithm is proposed to combine the Level2 data and produce predictions for unlabeled observations. Experimental results on the CLEF2009 medical image database demonstrate the benefit of our model in comparison with several existing ensemble learning models.
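The combination step described above can be sketched in code. This is a minimal illustration, not the paper's algorithm: the function name `fuse_level2`, the toy numbers, and the fixed weights are all assumptions, and the paper's actual Weight Matrix training procedure is not reproduced here. It only shows the shape of the final step: K second-stage probability matrices are combined with per-classifier, per-class weights into a single score matrix, from which class predictions are taken.

```python
# Hedged sketch of the final fusion step: combining K Level2 probability
# matrices with a weight matrix. Pure Python, no external dependencies.

def fuse_level2(level2, weights):
    """Combine K Level2 matrices (each N observations x M classes) using
    per-classifier, per-class weights (K x M); returns an N x M score matrix."""
    K = len(level2)         # number of base classifiers
    N = len(level2[0])      # number of observations
    M = len(level2[0][0])   # number of classes
    scores = [[0.0] * M for _ in range(N)]
    for k in range(K):
        for n in range(N):
            for m in range(M):
                # Weighted sum of classifier k's support for class m on observation n.
                scores[n][m] += weights[k][m] * level2[k][n][m]
    return scores

# Toy example (assumed values): K=2 classifiers, N=3 observations, M=2 classes.
level2 = [
    [[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]],   # Level2 output of classifier 1
    [[0.7, 0.3], [0.4, 0.6], [0.5, 0.5]],   # Level2 output of classifier 2
]
weights = [[0.6, 0.6], [0.4, 0.4]]          # assumed weights, summing to 1 per class

scores = fuse_level2(level2, weights)
# Predict the class with the highest combined score for each observation.
pred = [max(range(len(row)), key=row.__getitem__) for row in scores]
print(pred)  # -> [0, 1, 0]
```

In practice the weights would be learned from validation data (that is what the paper's Weight Matrix algorithm addresses); fixed weights are used here only to keep the sketch self-contained.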



Author information

Correspondence to Tien Thanh Nguyen.


Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Nguyen, T.T., Liew, A.W.C., Tran, M.T., Nguyen, T.T.T., Nguyen, M.P. (2014). Fusion of Classifiers Based on a Novel 2-Stage Model. In: Wang, X., Pedrycz, W., Chan, P., He, Q. (eds) Machine Learning and Cybernetics. ICMLC 2014. Communications in Computer and Information Science, vol 481. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-45652-1_7


  • DOI: https://doi.org/10.1007/978-3-662-45652-1_7


  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-45651-4

  • Online ISBN: 978-3-662-45652-1

  • eBook Packages: Computer Science, Computer Science (R0)
