
Non-uniform Layered Clustering for Ensemble Classifier Generation and Optimality

  • Conference paper
Neural Information Processing. Theory and Algorithms (ICONIP 2010)

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 6443))


Abstract

In this paper we present an approach to generating an ensemble of classifiers using non-uniform layered clustering. In the proposed approach, the dataset is partitioned into a variable number of clusters at each layer, and a set of base classifiers is trained on the clusters of each layer. The decision on a pattern at each layer is obtained from the classifier trained on its nearest cluster, and the decisions from the different layers are fused by majority voting to obtain the final verdict. The proposed approach also provides a mechanism for obtaining the optimal numbers of layers and clusters using a Genetic Algorithm. Clustering identifies difficult-to-classify patterns, and the layered non-uniform clustering brings diversity among the base classifiers at different layers. The experimental results show that the proposed method performs better than other state-of-the-art ensemble classifier generation methods.
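The layered scheme described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the naive k-means routine, the stand-in base classifier (each cluster simply predicts its majority class), and the helper names `train_layered_ensemble` and `predict` are all hypothetical. The paper trains a full base classifier on each cluster and selects the layer and cluster counts with a Genetic Algorithm, both of which are omitted here.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    # Naive k-means: init centroids from random data points, then Lloyd iterations.
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):  # leave empty clusters' centroids unchanged
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

def train_layered_ensemble(X, y, layer_ks, seed=0):
    # One layer per entry in layer_ks (non-uniform: each layer may use a
    # different number of clusters). Each cluster gets its own "base
    # classifier" -- here just the cluster's majority class, as a stand-in
    # for a trained classifier.
    layers = []
    for i, k in enumerate(layer_ks):
        centroids, assign = kmeans(X, k, seed=seed + i)
        cluster_preds = []
        for j in range(k):
            members = y[assign == j]
            cluster_preds.append(np.bincount(members).argmax() if len(members) else 0)
        layers.append((centroids, np.array(cluster_preds)))
    return layers

def predict(layers, x):
    # Each layer votes with the classifier of the cluster nearest to x;
    # majority voting across layers gives the final verdict.
    votes = []
    for centroids, cluster_preds in layers:
        nearest = np.linalg.norm(centroids - x, axis=1).argmin()
        votes.append(cluster_preds[nearest])
    return np.bincount(votes).argmax()
```

For example, on two well-separated groups of points, `train_layered_ensemble(X, y, [2, 2, 3])` builds three layers with 2, 2, and 3 clusters respectively, and `predict` fuses their nearest-cluster decisions by majority vote.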



References

  1. Tang, E.K., Suganthan, P.N., Yao, X.: An analysis of diversity measures. Machine Learning 65, 247–271 (2006)

  2. Kuncheva, L.I., Whitaker, C.J.: Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy. Machine Learning 51(2), 181–207 (2003)

  3. Scherbart, A., Nattkemper, T.W.: The diversity of regression ensembles combining bagging and random subspace method. In: Köppen, M., Kasabov, N., Coghill, G. (eds.) ICONIP 2008. LNCS, vol. 5507, pp. 911–918. Springer, Heidelberg (2009)

  4. Breiman, L.: Bagging predictors. Machine Learning 24(2), 123–140 (1996)

  5. Chen, L., Kamel, M.S.: A generalized adaptive ensemble generation and aggregation approach for multiple classifier systems. Pattern Recognition 42, 629–644 (2009)

  6. Nanni, L., Lumini, A.: Fuzzy bagging: a novel ensemble of classifiers. Pattern Recognition 39, 488–490 (2006)

  7. Schapire, R.E.: The strength of weak learnability. Machine Learning 5(2), 197–227 (1990)

  8. Freund, Y., Schapire, R.E.: A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences 55(1), 119–139 (1997)

  9. Muhlbaier, M.D., Topalis, A., Polikar, R.: Learn++.NC: combining ensemble of classifiers with dynamically weighted consult-and-vote for efficient incremental learning of new classes. IEEE Trans. on Neural Networks 20(1), 152–168 (2009)

  10. Nascimento, D.S.C., Coelho, A.L.V.: Ensembling heterogeneous learning models with boosting. In: Leung, C.S., Lee, M., Chan, J.H. (eds.) ICONIP 2009. LNCS, vol. 5863, pp. 512–519. Springer, Heidelberg (2009)

  11. Rokach, L., Maimon, O., Lavi, I.: Space decomposition in data mining: a clustering approach. In: Int. Symp. on Methodologies for Intelligent Systems, Maebashi, Japan, pp. 24–31 (2003)

  12. Kuncheva, L.I.: Switching between selection and fusion in combining classifiers: an experiment. IEEE Trans. on Systems, Man and Cybernetics 32(2), 146–156 (2002)

  13. Nasierding, G., Tsoumakas, G., Kouzani, A.Z.: Clustering based multi-label classification for image annotation and retrieval. In: IEEE Int. Conf. on Systems, Man and Cybernetics, pp. 4514–4519 (2009)

  14. Mulder, W.D., Schliebs, S., Boel, R., Kuiper, M.: Initialization dependence of clustering algorithms. In: Köppen, M., Kasabov, N., Coghill, G. (eds.) ICONIP 2008. LNCS, vol. 5507, pp. 615–622. Springer, Heidelberg (2009)

  15. UCI Machine Learning Database, http://archive.ics.uci.edu/ml/

  16. Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., Witten, I.H.: The WEKA data mining software: an update. SIGKDD Explorations 11(1) (2009)

  17. LIBSVM, http://www.csie.ntu.edu.tw/~cjlin/libsvm/


Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Rahman, A., Verma, B., Yao, X. (2010). Non-uniform Layered Clustering for Ensemble Classifier Generation and Optimality. In: Wong, K.W., Mendis, B.S.U., Bouzerdoum, A. (eds) Neural Information Processing. Theory and Algorithms. ICONIP 2010. Lecture Notes in Computer Science, vol 6443. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-17537-4_67


  • DOI: https://doi.org/10.1007/978-3-642-17537-4_67

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-17536-7

  • Online ISBN: 978-3-642-17537-4

  • eBook Packages: Computer Science, Computer Science (R0)
