Constructive Meta-level Feature Selection Method Based on Method Repositories

  • Conference paper
Advances in Knowledge Discovery and Data Mining (PAKDD 2006)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3918)

Abstract

Feature selection is one of the key issues in data pre-processing for classification tasks in a data mining process. Although many efforts have been made to improve typical feature selection algorithms (FSAs), such as filter methods and wrapper methods, it is difficult for any single FSA to maintain its performance across diverse datasets. To address this problem, we propose another way to support the feature selection procedure: constructing a proper FSA for each given dataset. We discuss constructive meta-level feature selection, which decomposes representative FSAs into methods and re-constructs a proper FSA from a method repository for each given dataset. After implementing a constructive meta-level feature selection system, we show how well constructive meta-level feature selection works on 32 UCI common data sets, comparing its accuracy with that of typical FSAs. As a result, our system achieves the highest accuracy and demonstrates the ability to construct a proper FSA for each given data set automatically.
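The core idea can be illustrated with a toy example: keep a small repository of interchangeable feature-selection building blocks (here, filter-style scoring criteria and candidate subset sizes), enumerate their compositions, and retain the composition that yields the best cross-validated accuracy on the dataset at hand. The sketch below is not the authors' system; it assumes scikit-learn, and the repository contents, the downstream classifier, and the evaluation protocol are illustrative stand-ins for the decomposed "methods" described in the abstract.

```python
# Minimal sketch of constructive meta-level feature selection (illustrative only):
# decompose feature selection into interchangeable parts, keep them in a small
# "method repository", and pick the composition that works best per dataset.
from itertools import product

from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Method repository: scoring criteria and subset sizes stand in for the
# decomposed FSA methods (these particular choices are assumptions).
SCORERS = {"anova_f": f_classif, "mutual_info": mutual_info_classif}
SUBSET_SIZES = [5, 10, 15]

def construct_best_fsa(X, y, estimator):
    """Enumerate compositions from the repository and return the one with the
    highest cross-validated accuracy on this particular dataset."""
    best = None
    for (name, scorer), k in product(SCORERS.items(), SUBSET_SIZES):
        pipeline = make_pipeline(SelectKBest(scorer, k=k), estimator)
        acc = cross_val_score(pipeline, X, y, cv=5, scoring="accuracy").mean()
        if best is None or acc > best[0]:
            best = (acc, name, k)
    return best

if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)
    acc, scorer_name, k = construct_best_fsa(X, y, DecisionTreeClassifier(random_state=0))
    print(f"best composition: {scorer_name} with k={k}, CV accuracy={acc:.3f}")
```

In the abstract's terms, each (scorer, k) pair plays the role of a re-constructed FSA, and the per-dataset search over compositions is the meta-level step; the paper's actual repository covers far richer combinations of search strategies and evaluation measures.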





Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Abe, H., Yamaguchi, T. (2006). Constructive Meta-level Feature Selection Method Based on Method Repositories. In: Ng, W.K., Kitsuregawa, M., Li, J., Chang, K. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2006. Lecture Notes in Computer Science (LNAI), vol 3918. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11731139_11

  • DOI: https://doi.org/10.1007/11731139_11

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-33206-0

  • Online ISBN: 978-3-540-33207-7

  • eBook Packages: Computer Science, Computer Science (R0)
