
Obtaining Pareto Front in Instance Selection with Ensembles and Populations

  • Conference paper

Artificial Intelligence and Soft Computing (ICAISC 2018)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10841)

Abstract

Collective computational intelligence can be used in several ways, for example to make decisions jointly by some form of a bagging ensemble, or to find solutions with multi-objective evolutionary algorithms. In this paper we examine and compare the application of the two approaches to instance selection for creating the Pareto front of the selected subsets, where the two objectives are classification accuracy and data size reduction. As the bagging ensemble members we use the DROP5 algorithm. The evolutionary algorithm is based on NSGA-II. The findings are that the evolutionary approach is faster (contrary to popular belief) and usually provides better-quality solutions, with some exceptions where the outcome of the DROP5 ensemble is better.
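To make the two-objective setting concrete, the sketch below shows how a candidate instance subset (a binary mask over the training set) can be scored on classification accuracy and data size reduction, and how the non-dominated subsets form a Pareto front. This is a minimal illustration of the objective evaluation only, not the paper's NSGA-II or DROP5 implementation; the 1-NN classifier, the synthetic data, and all function names are assumptions for the example.

```python
import random

def one_nn_accuracy(train, test):
    """Classify each test point with 1-NN over `train`; return accuracy."""
    correct = 0
    for x, y in test:
        nearest = min(train, key=lambda t: sum((a - b) ** 2 for a, b in zip(t[0], x)))
        correct += nearest[1] == y
    return correct / len(test)

def evaluate(mask, data, test):
    """The two objectives for a selection mask: accuracy and reduction rate."""
    subset = [d for d, keep in zip(data, mask) if keep]
    if not subset:
        return (0.0, 1.0)  # empty subset: no classifier, maximal reduction
    return (one_nn_accuracy(subset, test), 1 - len(subset) / len(data))

def pareto_front(points):
    """Keep the points not dominated in (accuracy, reduction), both maximized."""
    return [p for p in points
            if not any(q[0] >= p[0] and q[1] >= p[1] and q != p for q in points)]

# Tiny synthetic demonstration: random masks stand in for an evolved population.
random.seed(0)
data = [((random.random(), random.random()), i % 2) for i in range(30)]
test = [((random.random(), random.random()), i % 2) for i in range(10)]
masks = [[random.random() < 0.5 for _ in range(len(data))] for _ in range(40)]
scores = [evaluate(m, data, test) for m in masks]
front = pareto_front(scores)
```

In NSGA-II, this non-dominated filter is applied repeatedly (fast non-dominated sorting) to rank the population, and crowding distance breaks ties within a rank; the sketch keeps only the final front-extraction step.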



Acknowledgments

This work was supported by the NCN (Polish National Science Center) grant “Evolutionary Methods in Data Selection” No. 2017/01/X/ST6/00202.

Author information

Correspondence to Mirosław Kordos.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper


Cite this paper

Kordos, M., Wydrzyński, M., Łapa, K. (2018). Obtaining Pareto Front in Instance Selection with Ensembles and Populations. In: Rutkowski, L., Scherer, R., Korytkowski, M., Pedrycz, W., Tadeusiewicz, R., Zurada, J. (eds) Artificial Intelligence and Soft Computing. ICAISC 2018. Lecture Notes in Computer Science, vol. 10841. Springer, Cham. https://doi.org/10.1007/978-3-319-91253-0_41

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-91253-0_41

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-91252-3

  • Online ISBN: 978-3-319-91253-0

  • eBook Packages: Computer Science (R0)
