
Evaluation of Decision Tree Pruning with Subadditive Penalties

  • Conference paper
Intelligent Data Engineering and Automated Learning – IDEAL 2006

Abstract

Recent work on decision tree pruning [1] has brought to the attention of the machine learning community the fact that, in classification problems, the use of subadditive penalties in cost-complexity pruning has a stronger theoretical basis than the usual additive penalty terms. We implement cost-complexity pruning algorithms with general size-dependent penalties and confirm the results of [1]: the family of pruned subtrees selected by pruning with a subadditive penalty of increasing strength is a subset of the family selected using additive penalties. Consequently, this family of pruned trees is unique, nested, and can be computed efficiently. However, despite the better theoretical grounding of cost-complexity pruning with subadditive penalties, we found no systematic improvement in the generalization performance of the final classification tree selected by cross-validation when subadditive penalties are used instead of the common additive ones.
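The procedure the paper evaluates can be summarized as follows. Cost-complexity pruning selects, among the pruned subtrees T of a fully grown tree, the one minimizing a penalized risk R(T) + α·φ(|T|), where R(T) is the training error, |T| the number of leaves, and α ≥ 0 the pruning strength. The classical CART penalty [2] is additive, φ(k) = k; a penalty is subadditive when φ(a + b) ≤ φ(a) + φ(b), the square root φ(k) = √k being the standard example considered in [1]. The sketch below is a minimal illustration in Python, not the authors' code; the Node class and all identifiers are hypothetical. It builds the nested weakest-link sequence of [2] and then selects from it under either penalty; by the subset result of [1] quoted in the abstract, restricting the search for the subadditive optimum to the additive (weakest-link) family loses nothing.

    import copy
    import math
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Node:
        error: float                       # training error if this node were a leaf
        left: Optional["Node"] = None
        right: Optional["Node"] = None

        @property
        def is_leaf(self) -> bool:
            return self.left is None and self.right is None

    def leaves(t: Node) -> int:
        """Number of leaves of the subtree rooted at t."""
        return 1 if t.is_leaf else leaves(t.left) + leaves(t.right)

    def subtree_error(t: Node) -> float:
        """Training error of the subtree rooted at t (sum over its leaves)."""
        return t.error if t.is_leaf else subtree_error(t.left) + subtree_error(t.right)

    def weakest_link_sequence(root: Node) -> list:
        """Nested pruning sequence of CART cost-complexity pruning [2]:
        repeatedly collapse an internal node of minimal link strength
        g(t) = (error(t) - error(T_t)) / (|T_t| - 1).
        (One node at a time, for simplicity; CART collapses ties together.)"""
        t = copy.deepcopy(root)
        seq = [copy.deepcopy(t)]
        while not t.is_leaf:
            weakest, g_min = None, math.inf
            stack = [t]
            while stack:
                n = stack.pop()
                if not n.is_leaf:
                    g = (n.error - subtree_error(n)) / (leaves(n) - 1)
                    if g < g_min:
                        weakest, g_min = n, g
                    stack += [n.left, n.right]
            weakest.left = weakest.right = None   # collapse the weakest link
            seq.append(copy.deepcopy(t))
        return seq

    def select(seq, alpha, phi):
        """Subtree of the nested sequence minimizing R(T) + alpha * phi(|T|)."""
        return min(seq, key=lambda t: subtree_error(t) + alpha * phi(leaves(t)))

    # Toy tree: each internal node carries the error it would incur as a leaf.
    tree = Node(20.0,
                Node(5.0, Node(1.0), Node(1.0)),
                Node(5.0, Node(1.0), Node(2.0)))
    seq = weakest_link_sequence(tree)
    additive = select(seq, alpha=3.0, phi=lambda k: k)    # CART penalty
    subadditive = select(seq, alpha=3.0, phi=math.sqrt)   # square-root penalty
    print(leaves(additive), leaves(subadditive))          # prints: 3 4

On this toy tree the additive penalty at α = 3 collapses one split, while the square-root penalty, which charges large trees relatively less, keeps the full tree; cross-validating over α and comparing the trees selected under the two penalty families is the kind of experiment the paper reports.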


References

  1. Scott, C.: Tree pruning with subadditive penalties. IEEE Transactions on Signal Processing 53(12), 4518–4525 (2005)

  2. Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Chapman & Hall, New York (1984)

  3. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Francisco (1993)

  4. Quinlan, J.R.: Simplifying decision trees. International Journal of Man-Machine Studies 27(3), 221–234 (1987)

  5. Esposito, F., Malerba, D., Semeraro, G., Kay, J.: A comparative analysis of methods for pruning decision trees. IEEE Transactions on Pattern Analysis and Machine Intelligence 19(5), 476–491 (1997)

  6. Mansour, Y., McAllester, D.A.: Generalization bounds for decision trees. In: Proceedings of the Thirteenth Annual Conference on Computational Learning Theory (COLT 2000), pp. 69–74. Morgan Kaufmann, San Francisco (2000)

  7. Nobel, A.: Analysis of a complexity-based pruning scheme for classification trees. IEEE Transactions on Information Theory 48(8), 2362–2368 (2002)

  8. Scott, C., Nowak, R.: Dyadic classification trees via structural risk minimization. In: Advances in Neural Information Processing Systems 15 (2003)

  9. Scott, C., Nowak, R.: Minimax-optimal classification with dyadic decision trees. IEEE Transactions on Information Theory 52(4), 1335–1353 (2006)

  10. Mingers, J.: An empirical comparison of pruning methods for decision tree induction. Machine Learning 4(2), 227–243 (1989)

  11. Blake, C.L., Merz, C.J.: UCI repository of machine learning databases (1998)



Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

García-Moratilla, S., Martínez-Muñoz, G., Suárez, A. (2006). Evaluation of Decision Tree Pruning with Subadditive Penalties. In: Corchado, E., Yin, H., Botti, V., Fyfe, C. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2006. IDEAL 2006. Lecture Notes in Computer Science, vol 4224. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11875581_119


  • DOI: https://doi.org/10.1007/11875581_119

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-45485-4

  • Online ISBN: 978-3-540-45487-8

  • eBook Packages: Computer Science, Computer Science (R0)
