
Iterative bias reduction: a comparative study

Statistics and Computing

Abstract

Multivariate nonparametric smoothers, such as kernel-based and thin-plate spline smoothers, are adversely affected by the sparseness of data in high dimensions, also known as the curse of dimensionality. Adaptive smoothers, which can exploit the underlying smoothness of the regression function, may partially mitigate this effect. This paper presents a comparative simulation study of a novel adaptive smoother, iterative bias reduction (IBR), against competing multivariate smoothers available as packages or functions within the R language and environment for statistical computing. Comparisons between the methods are made on simulated datasets of moderate size, from 50 to 200 observations, with two, five, or ten potential explanatory variables, and on a real dataset. The results show that the good asymptotic properties of IBR are complemented by very good behavior on moderate-sized datasets, similar to that obtained with Duchon low-rank splines.
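
The iterative bias reduction smoother studied here rests on a simple recursion (see also Cornillon et al. 2011a, 2011b in the references): a deliberately over-smoothing pilot estimator is corrected step by step by smoothing its own residuals, which estimates and removes part of its bias at each pass. The sketch below, in base R, illustrates only that recursion on a univariate toy example; it is not the authors' ibr package, and the Gaussian kernel, the bandwidth h and the fixed iteration count k are illustrative assumptions (in practice the number of iterations would be chosen by a data-driven criterion, which this sketch omits).

    ## Minimal sketch of the iterative bias reduction recursion in base R.
    ## Not the ibr package: the kernel, bandwidth h and iteration count k
    ## are illustrative choices only.
    set.seed(1)
    n <- 100
    x <- sort(runif(n))
    y <- sin(2 * pi * x) + rnorm(n, sd = 0.3)

    ## Base smoother: a strongly over-smoothing Nadaraya-Watson kernel matrix S
    h <- 0.2                                  # large bandwidth, hence large initial bias
    K <- exp(-outer(x, x, "-")^2 / (2 * h^2))
    S <- K / rowSums(K)                       # row-normalised kernel weights

    ## IBR recursion: start from the pilot fit S y, then repeatedly add the
    ## smoothed residuals, i.e. fit_{j+1} = fit_j + S (y - fit_j)
    k   <- 50                                 # illustrative number of iterations
    fit <- S %*% y
    for (j in seq_len(k - 1)) {
      fit <- fit + S %*% (y - fit)
    }

    ## Compare the biased pilot fit with the bias-reduced fit
    plot(x, y, col = "grey60")
    lines(x, S %*% y, lty = 2)                # pilot smoother (over-smoothed)
    lines(x, fit, lwd = 2)                    # after k bias-reduction steps

After k steps the fit equals [I - (I - S)^k] y, so increasing k reduces the bias of the pilot smoother at the cost of added variance, which is why a data-driven stopping rule is needed in practice.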

References

  • Akaike, H.: Information theory and an extension of the maximum likelihood principle. In: Petrov, B.N., Csaki, B.F. (eds.) Second International Symposium on Information Theory, pp. 267–281. Academiai Kiado, Budapest (1973)

  • Breiman, L.: Bagging predictors. Mach. Learn. 24, 123–140 (1996)

  • Breiman, L.: Using adaptive bagging to debias regressions. Tech. Rep. 547, Department of Statistics, UC Berkeley (1999)

  • Breiman, L., Friedman, J., Olshen, R., Stone, C.: Classification and Regression Trees, 4th edn. CRC Press, Boca Raton (1984)

  • Bühlmann, P., Yu, B.: Boosting with the L2 loss: regression and classification. J. Am. Stat. Assoc. 98, 324–339 (2003)

  • Bühlmann, P., Yu, B.: Sparse boosting. J. Mach. Learn. Res. 7, 1001–1024 (2006)

  • Buja, A., Hastie, T., Tibshirani, R.: Linear smoothers and additive models. Ann. Stat. 17, 453–510 (1989)

  • Cornillon, P.A., Hengartner, N., Matzner-Løber, E.: Iterative bias reduction multivariate smoothing in R: the IBR package (2011a). arXiv:1105.3605v1

  • Cornillon, P.A., Hengartner, N., Matzner-Løber, E.: Recursive bias estimation for multivariate regression (2011b). arXiv:1105.3430v2

  • Craven, P., Wahba, G.: Smoothing noisy data with spline functions: estimating the correct degree of smoothing by the method of generalized cross-validation. Numer. Math. 31, 377–403 (1979)

  • Di Marzio, M., Taylor, C.: On boosting kernel regression. J. Stat. Plan. Inference 138, 2483–2498 (2008)

  • Duchon, J.: Splines minimizing rotation-invariant semi-norms in Sobolev spaces. In: Schempp, W., Zeller, K. (eds.) Constructive Theory of Functions of Several Variables, pp. 85–100. Springer, Berlin (1977)

  • Eubank, R.: Spline Smoothing and Nonparametric Regression. Marcel Dekker, New York (1988)

  • Fan, J., Gijbels, I.: Local Polynomial Modeling and Its Application, Theory and Methodologies. Chapman & Hall, New York (1996)

  • Friedman, J.: Multivariate adaptive regression splines. Ann. Stat. 19, 337–407 (1991)

  • Friedman, J.: Greedy function approximation: a gradient boosting machine. Ann. Stat. 29, 1189–1232 (2001)

  • Friedman, J., Stuetzle, W.: Projection pursuit regression. J. Am. Stat. Assoc. 76, 817–823 (1981)

  • Friedman, J., Hastie, T., Tibshirani, R.: Additive logistic regression: a statistical view of boosting. Ann. Stat. 28, 337–407 (2000)

  • Gu, C.: Smoothing Spline ANOVA Models. Springer, Berlin (2002)

  • Gyorfi, L., Kohler, M., Krzyzak, A., Walk, H.: A Distribution-Free Theory of Nonparametric Regression. Springer, Berlin (2002)

  • Hastie, T.J., Tibshirani, R.J.: Generalized Additive Models. Chapman & Hall, New York (1995)

  • Hurvich, C.M., Simonoff, J.S., Tsai, C.L.: Smoothing parameter selection in nonparametric regression using an improved Akaike information criterion. J. R. Stat. Soc. B 60, 271–294 (1998)

  • Lepski, O.: Asymptotically minimax adaptive estimation. I: Upper bounds. Optimally adaptive estimates. Theory Probab. Appl. 37, 682–697 (1991)

  • Li, K.C.: Asymptotic optimality for C_p, C_L, cross-validation and generalized cross-validation: discrete index set. Ann. Stat. 15, 958–975 (1987)

  • Ridgeway, G.: Additive logistic regression: a statistical view of boosting: discussion. Ann. Stat. 28, 393–400 (2000)

  • Schwarz, G.: Estimating the dimension of a model. Ann. Stat. 6, 461–464 (1978)

  • Simonoff, J.S.: Smoothing Methods in Statistics. Springer, New York (1996)

  • Tsybakov, A.: Introduction to Nonparametric Estimation. Springer, Berlin (2009)

  • Tukey, J.W.: Exploratory Data Analysis. Addison-Wesley, Reading (1977)

  • Wood, S.N.: Thin plate regression splines. J. R. Stat. Soc. B 65, 95–114 (2003)

  • Wood, S.N.: Stable and efficient multiple smoothing parameter estimation for generalized additive models. J. Am. Stat. Assoc. 99, 673–686 (2004)

  • Yang, Y.: Combining different procedures for adaptive regression. J. Multivar. Anal. 74, 135–161 (2000)


Acknowledgements

We would like to thank the associate editor and the referees for very valuable remarks and for pointing out to us the work of Duchon (1977).

Author information

Corresponding author

Correspondence to E. Matzner-Løber.


About this article

Cite this article

Cornillon, PA., Hengartner, N., Jegou, N. et al. Iterative bias reduction: a comparative study. Stat Comput 23, 777–791 (2013). https://doi.org/10.1007/s11222-012-9346-4

  • Received:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s11222-012-9346-4

Keywords