Abstract
The ability of ensemble models to retain the bias of their base learners while decreasing their individual variance has long made them attractive for a wide range of classification and regression problems. Moreover, when trees are used as learners, the relative simplicity of the resulting models has led to renewed interest in them for Big Data problems. In this work we study the application of Random Forest Regression (RFR) and Gradient Boosted Regression (GBR) to global and local wind energy prediction problems, working with their high-quality implementations in the Scikit-learn Python library. Besides a thorough exploration of the application of RFR and GBR to wind energy prediction, we show experimentally that both ensemble methods can improve on Support Vector Regression (SVR) for individual wind farm energy prediction and that at least GBR remains competitive when the goal is to predict wind energy on a much larger geographical scale.
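The following is a minimal sketch, not the authors' exact pipeline, of how the three regressors compared in the paper can be fit with Scikit-learn. The synthetic data, feature dimensionality, and hyperparameter values are illustrative assumptions only; the paper uses numerical weather prediction variables and wind farm production data.

```python
# Illustrative sketch: comparing RFR, GBR and SVR with Scikit-learn.
# Data and hyperparameters are placeholders, not the paper's configuration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.RandomState(0)
X = rng.rand(1000, 8)                        # stand-in for NWP input variables
y = X[:, 0] ** 3 + 0.1 * rng.randn(1000)     # stand-in for wind farm production

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "RFR": RandomForestRegressor(n_estimators=200, random_state=0),
    "GBR": GradientBoostingRegressor(n_estimators=200, learning_rate=0.05,
                                     max_depth=3, random_state=0),
    "SVR": SVR(C=10.0, epsilon=0.1, gamma="scale"),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    print(f"{name}: test MAE = {mae:.4f}")
```

In practice the ensemble hyperparameters (number of trees, learning rate, tree depth) and the SVR parameters would be selected by cross-validation on the training period, as is standard for this kind of comparison.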
Acknowledgments
This work was partially supported by Spain's grants TIN2013-42351-P (MINECO) and S2013/ICE-2845 CASI-CAM-CM (Comunidad de Madrid), and by the UAM–ADIC Chair for Data Science and Machine Learning. The first author is kindly supported by the UAM–ADIC Chair for Data Science and Machine Learning and the second author by the FPU-MEC grant AP-2012-5163. We gratefully acknowledge the use of the facilities of Centro de Computación Científica (CCC) at UAM and thank Red Eléctrica de España for kindly supplying wind energy production data.
Copyright information
© 2015 Springer International Publishing Switzerland
About this paper
Cite this paper
Alonso, Á., Torres, A., Dorronsoro, J.R. (2015). Random Forests and Gradient Boosting for Wind Energy Prediction. In: Onieva, E., Santos, I., Osaba, E., Quintián, H., Corchado, E. (eds) Hybrid Artificial Intelligent Systems. HAIS 2015. Lecture Notes in Computer Science, vol 9121. Springer, Cham. https://doi.org/10.1007/978-3-319-19644-2_3
DOI: https://doi.org/10.1007/978-3-319-19644-2_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-19643-5
Online ISBN: 978-3-319-19644-2
eBook Packages: Computer Science, Computer Science (R0)