Abstract
Bagging predictors is a method for generating multiple versions of a predictor and using these to get an aggregated predictor. The aggregation averages over the versions when predicting a numerical outcome and does a plurality vote when predicting a class. The multiple versions are formed by making bootstrap replicates of the learning set and using these as new learning sets. Tests on real and simulated data sets using classification and regression trees and subset selection in linear regression show that bagging can give substantial gains in accuracy. The vital element is the instability of the prediction method. If perturbing the learning set can cause significant changes in the predictor constructed, then bagging can improve accuracy.
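The procedure the abstract describes — bootstrap replicates of the learning set, one predictor per replicate, then averaging for regression or plurality voting for classification — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the `n_replicates` parameter, and the 1-nearest-neighbour base learner are assumptions chosen only to make the example self-contained (the paper's experiments use trees and subset regression).

```python
import random
from collections import Counter

def bootstrap(learning_set, rng):
    # Draw |L| examples with replacement from the learning set L.
    return [rng.choice(learning_set) for _ in learning_set]

def bag_classify(learning_set, x, fit, n_replicates=50, seed=0):
    # Classification: plurality vote over predictors fit on bootstrap replicates.
    rng = random.Random(seed)
    votes = Counter()
    for _ in range(n_replicates):
        predictor = fit(bootstrap(learning_set, rng))
        votes[predictor(x)] += 1
    return votes.most_common(1)[0][0]

def bag_regress(learning_set, x, fit, n_replicates=50, seed=0):
    # Regression: average the predictions of the bootstrap-replicate predictors.
    rng = random.Random(seed)
    preds = [fit(bootstrap(learning_set, rng))(x) for _ in range(n_replicates)]
    return sum(preds) / len(preds)

def fit_1nn(replicate):
    # Hypothetical unstable base learner: 1-nearest-neighbour on (x, y) pairs.
    def predict(x):
        _, yi = min(replicate, key=lambda pair: abs(pair[0] - x))
        return yi
    return predict
```

Because each replicate omits roughly a third of the original examples, the fitted predictors differ from one another; the vote or average then smooths out the variability that an unstable method exhibits across perturbed learning sets.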
Cite this article
Breiman, L. Bagging predictors. Mach Learn 24, 123–140 (1996). https://doi.org/10.1007/BF00058655