Machine Learning Approach to Predict Building Thermal Load Considering Feature Variable Dimensions: An Office Building Case Study
Abstract
1. Introduction
1.1. Literature Study
1.2. Motivations and Contributions
2. Methodology
2.1. Seed Model and Energy Data Source Description
2.1.1. Seed Model Description
2.1.2. Energy Data Source Description
2.1.3. Input Feature Selection
2.2. Selected Data-Driven Algorithm Description
2.2.1. LightGBM
2.2.2. Random Forest (RF)
2.2.3. Long Short-Term Memory (LSTM)
2.3. Data-Driven Model Development Process
2.4. Prediction Performance Indices
3. Results and Discussion
3.1. Scenario 1: Six Input Features
3.2. Scenario 2: Nine Input Features
3.3. Scenario 3: Fifteen Input Features
3.4. Discussion
4. Conclusions
- (1) LightGBM is the most accurate and fastest prediction model. In the best scenario, the CVRMSE and R2 of LightGBM are 5.25% and 0.99, respectively. Compared with the other two algorithms and with the results reported in the existing literature, LightGBM is the most promising algorithm for building thermal load prediction.
- (2) By training on the large amount of energy data generated by physics-based tools or collected on site, a data-driven model can stand in for a physics-based tool with comparable accuracy.
- (3) The dimension of the input features influences the prediction performance. Compared with a scenario that uses only weather information, the CVRMSE improves further when physical and operational information are also considered. However, the better accuracy obtained with higher-dimensional input features comes at the cost of computational speed, so there is always a tradeoff between the required prediction accuracy and the tolerable prediction time (an illustrative sketch follows this list).
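To make the reported metrics and the accuracy/speed tradeoff in point (3) concrete, the following is a minimal sketch (not the authors' code) of how the three feature-dimension scenarios could be evaluated with LightGBM. The feature column names and random placeholder data are hypothetical stand-ins for the hourly simulation dataset described in Section 2; CVRMSE is computed as RMSE divided by the mean of the measured values, in percent.

```python
# Minimal sketch (not the authors' code): evaluating LightGBM on the three
# feature-dimension scenarios and reporting CVRMSE, R2, and training time.
# Column names and the random data below are hypothetical placeholders.
import time

import numpy as np
import pandas as pd
from lightgbm import LGBMRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

S1 = ["hour", "load_lag1", "dry_bulb", "rel_humidity", "ghi", "wind_speed"]
S2 = S1 + ["cool_setpoint", "heat_setpoint", "fresh_air_volume"]
S3 = S2 + ["wall_r_value", "internal_mass", "wwr", "floor_height",
           "shape_coefficient", "aspect_ratio"]
SCENARIOS = {"Scenario 1 (6)": S1, "Scenario 2 (9)": S2, "Scenario 3 (15)": S3}

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.random((8760, len(S3))), columns=S3)
df["thermal_load"] = rng.random(8760) * 10.0       # placeholder target (kW)

for name, cols in SCENARIOS.items():
    X_train, X_test, y_train, y_test = train_test_split(
        df[cols], df["thermal_load"], test_size=0.2, shuffle=False)

    model = LGBMRegressor(learning_rate=0.1, n_estimators=350,
                          max_depth=9, num_leaves=65, max_bin=95)
    t0 = time.time()
    model.fit(X_train, y_train)
    elapsed = time.time() - t0
    y_pred = model.predict(X_test)

    rmse = float(np.sqrt(np.mean((y_test.to_numpy() - y_pred) ** 2)))
    cvrmse = 100.0 * rmse / float(y_test.mean())   # CV(RMSE), in percent
    print(f"{name}: CVRMSE={cvrmse:.2f}%  R2={r2_score(y_test, y_pred):.3f}  "
          f"train time={elapsed:.1f}s")
```

Under such a setup, the larger feature sets would be expected to lower the CVRMSE while modestly increasing training time, mirroring the tradeoff noted in point (3).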
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Nomenclature
ANN | Artificial neural networks |
AR | Aspect ratio |
CTS | Cooling temperature set point |
CV | Coefficient of variation |
CVRMSE | Coefficient of variation of root mean square error |
DB | Dry bulb temperature |
EL | Electrical load |
ELM | Extreme learning machine |
FAV | Fresh air volume |
FH | Floor height |
GHR | Global horizontal radiation |
HOD | Hour of the day |
HTS | Heating temperature set point |
IM | Internal mass |
MAPE | Mean absolute percentage error |
MARS | Multivariate adaptive regression splines |
MLP | Multilayer perceptron |
NARM | Nonlinear autoregressive model |
RF | Random forest |
RH | Relative humidity |
RNN | Recurrent neural networks |
RW | R-value of wall |
SC | Shape coefficient |
SVM | Support vector machine |
SVR | Support vector regression |
WS | Wind speed |
WWR | Window to wall ratio |
XGBoost | Extreme gradient boosting |
Appendix A
Hyperparameter | Description | Grid Searching Range | Selected |
---|---|---|---|
activation | Activation function | Sigmoid; Tanh; ReLU | ReLU
optimizer | Optimization algorithm | Adam; RMSprop; Adagrad; SGD | RMSprop
loss | Loss function | Mean Squared Error; Mean Absolute Error; Mean Squared Logarithmic Error | Mean Squared Error
units | Number of memory cells | range (20, 200, 20) | 60
epochs | Number of epochs | range (20, 200, 20) | 100
batch_size | Batch size | [20, 32, 60, 100, 500] | 60
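For illustration, a minimal Keras sketch (an assumed configuration, not the authors' implementation) of an LSTM built with the selected hyperparameters from the table above; the 24-step look-back window, 6 input features, and random training tensors are hypothetical placeholders.

```python
# Minimal sketch (assumed setup, not the authors' code): an LSTM regressor
# using the selected hyperparameters above. Input shape and data are placeholders.
import numpy as np
from tensorflow.keras.layers import LSTM, Dense
from tensorflow.keras.models import Sequential

timesteps, n_features = 24, 6
X_train = np.random.rand(1000, timesteps, n_features).astype("float32")
y_train = np.random.rand(1000, 1).astype("float32")

model = Sequential([
    LSTM(units=60, activation="relu", input_shape=(timesteps, n_features)),
    Dense(1),                                 # predicted hourly thermal load
])
model.compile(optimizer="rmsprop", loss="mean_squared_error")
model.fit(X_train, y_train, epochs=100, batch_size=60, verbose=0)
```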
Hyperparameter | Description | Grid Searching Range | Selected |
---|---|---|---|
learning_rate | Step size shrinkage used in the update to prevent overfitting | range (0.02, 0.12, 0.02) | 0.1 |
n_estimators | Number of estimators | range (50, 400, 50) | 350 |
max_depth | The depth of tree model | range (3, 10, 1) | 9 |
num_leaves | The main parameter controlling the complexity of the tree model | range (5, 500, 5) | 65
max_bin | The maximum number of bins stored per feature | range (5, 256, 10) | 95
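For illustration, a minimal sketch (assuming scikit-learn's GridSearchCV over the lightgbm sklearn wrapper; not necessarily the authors' tuning procedure) of a grid search over the ranges listed above. The training data are placeholders, so the fit call is left commented out; note that the exhaustive grid over these ranges is very large, and in practice parameters may be tuned sequentially.

```python
# Minimal sketch (assumed tooling, not the authors' code): grid search over
# the LightGBM ranges in the table above.
import numpy as np
from lightgbm import LGBMRegressor
from sklearn.model_selection import GridSearchCV

param_grid = {
    "learning_rate": np.arange(0.02, 0.12, 0.02).round(2).tolist(),
    "n_estimators": list(range(50, 400, 50)),
    "max_depth": list(range(3, 10, 1)),
    "num_leaves": list(range(5, 500, 5)),
    "max_bin": list(range(5, 256, 10)),
}
search = GridSearchCV(LGBMRegressor(), param_grid,
                      scoring="neg_root_mean_squared_error", cv=3, n_jobs=-1)
# search.fit(X_train, y_train)   # placeholder training data (see Section 2.3)
# print(search.best_params_)     # e.g., the values in the "Selected" column
```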
Hyperparameter | Description | Grid Searching Range | Selected |
---|---|---|---|
n_estimators | The number of trees in the forest | range (10, 110, 10) | 20 |
max_depth | The maximum depth of the tree | range (2, 20, 2) | 8 |
min_samples_split | The minimum number of samples required to split an internal node | range (1, 11, 1) | 6
max_features | The number of features to consider when looking for the best split | [‘auto’, ‘sqrt’, ‘log2’] | ‘auto’ |
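For illustration, a minimal scikit-learn sketch (not the authors' code) of a random forest configured with the selected values above. Note that scikit-learn spells the split parameter min_samples_split, and that max_features="auto" (all features for regression) has been removed in recent scikit-learn releases, so 1.0 is used here as its equivalent.

```python
# Minimal sketch (not the authors' code): random forest regressor set to the
# selected hyperparameters from the table above.
from sklearn.ensemble import RandomForestRegressor

rf = RandomForestRegressor(
    n_estimators=20,       # number of trees in the forest
    max_depth=8,           # maximum depth of each tree
    min_samples_split=6,   # minimum samples required to split an internal node
    max_features=1.0,      # equivalent to the table's "auto" for regression
    random_state=0,
)
# rf.fit(X_train, y_train)  # placeholder training data (see Section 2.3)
```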
References
- Fan, C.; Liao, Y.; Zhou, G.; Zhou, X.; Ding, Y. Improving cooling load prediction reliability for HVAC system using Monte-Carlo simulation to deal with uncertainties in input variables. Energy Build. 2020, 226, 110372. [Google Scholar] [CrossRef]
- Li, W.; Gong, G.; Fan, H.; Peng, P.; Chun, L.; Fang, X. A clustering-based approach for “cross-scale” load prediction on building level in HVAC systems. Appl. Energy 2021, 282, 116223. [Google Scholar] [CrossRef]
- Jang, Y.; Byon, E.; Jahani, E.; Cetin, K. On the long-term density prediction of peak electricity load with demand side management in buildings. Energy Build. 2020, 228, 110450. [Google Scholar] [CrossRef]
- Chen, Y.; Xu, P.; Chu, Y.; Li, W.; Wu, Y.; Ni, L.; Bao, Y.; Wang, K. Short-term electrical load forecasting using the Support Vector Regression (SVR) model to calculate the demand response baseline for office buildings. Appl. Energy 2017, 195, 659–670. [Google Scholar] [CrossRef]
- Chen, Y.; Chen, Z.; Xu, P.; Li, W.; Sha, H.; Yang, Z.; Li, G.; Hu, C. Quantification of electricity flexibility in demand response: Office building case study. Energy 2019, 188, 116054. [Google Scholar] [CrossRef]
- Foucquier, A.; Robert, S.; Suard, F.; Stephan, L.; Jay, A. State of the art in building modelling and energy performances prediction: A review. Renew. Sustain. Energy Rev. 2013, 23, 272–288. [Google Scholar] [CrossRef] [Green Version]
- Luo, X.J.; Lukumon, O.O.; Anuoluwapo, O.A.; Olugbenga, O.A.; Hakeem, A.O.; Ashraf, A. Feature extraction and genetic algorithm enhanced adaptive deep neural network for energy consumption prediction in buildings. Renew. Sustain. Energy Rev. 2020, 131, 109980. [Google Scholar] [CrossRef]
- Nageler, P.; Schweiger, G.; Pichler, M.; Brandl, D.; Mach, T.; Heimrath, R.; Schranzhofer, H.; Hochenauer, C. Validation of dynamic building energy simulation tools based on a real test-box with thermally activated building systems (TABS). Energy Build. 2018, 168, 42–55. [Google Scholar] [CrossRef]
- Wang, Z.; Hong, T.; Piette, M.A. Data fusion in predicting internal heat gains for office buildings through a deep learning approach. Appl. Energy 2019, 240, 386–398. [Google Scholar] [CrossRef] [Green Version]
- Wu, J.; Wang, Y.-G.; Tian, Y.-C.; Burrage, K.; Cao, T. Support vector regression with asymmetric loss for optimal electric load forecasting. Energy 2021, 223, 119969. [Google Scholar] [CrossRef]
- Ahmad, T.; Chen, H. Nonlinear autoregressive and random forest approaches to forecasting electricity load for utility energy management systems. Sustain. Cities Soc. 2019, 45, 460–473. [Google Scholar] [CrossRef]
- Lahouar, A.; Ben Hadj Slama, J. Day-ahead load forecast using random forest and expert input selection. Energy Convers. Manag. 2015, 103, 1040–1051. [Google Scholar] [CrossRef]
- Wang, Z.; Hong, T.; Piette, M.A. Building thermal load prediction through shallow machine learning and deep learning. Appl. Energy 2020, 263, 114683. [Google Scholar] [CrossRef] [Green Version]
- Cao, L.; Li, Y.; Zhang, J.; Jiang, Y.; Han, Y.; Wei, J. Electrical load prediction of healthcare buildings through single and ensemble learning. Energy Rep. 2020, 6, 2751–2767. [Google Scholar] [CrossRef]
- Moon, J.; Park, S.; Rho, S.; Hwang, E. Robust building energy consumption forecasting using an online learning approach with R ranger. J. Build. Eng. 2022, 47, 103851. [Google Scholar] [CrossRef]
- Wang, Y.; Gan, D.; Sun, M.; Zhang, N.; Lu, Z.; Kang, C. Probabilistic individual load forecasting using pinball loss guided LSTM. Appl. Energy 2019, 235, 10–20. [Google Scholar] [CrossRef] [Green Version]
- Somu, N.; Raman, G.M.R.; Ramamritham, K. A hybrid model for building energy consumption forecasting using long short term memory networks. Appl. Energy 2020, 261, 114131. [Google Scholar] [CrossRef]
- Xu, L.; Hu, M.; Fan, C. Probabilistic electrical load forecasting for buildings using Bayesian deep neural networks. J. Build. Eng. 2022, 46, 103853. [Google Scholar] [CrossRef]
- Khwaja, A.S.; Anpalagan, A.; Naeem, M.; Venkatesh, B. Joint bagged-boosted artificial neural networks: Using ensemble machine learning to improve short-term electricity load forecasting. Electr. Power Syst. Res. 2020, 179, 106080. [Google Scholar] [CrossRef]
- Zhou, Y.; Liang, Y.; Pan, Y.; Yuan, X.; Xie, Y.; Jia, W. A Deep-Learning-Based Meta-Modeling Workflow for Thermal Load Forecasting in Buildings: Method and a Case Study. Buildings 2022, 12, 177. [Google Scholar] [CrossRef]
- Zhang, Y.; Teoh, B.K.; Wu, M.; Chen, J.; Zhang, L. Data-driven estimation of building energy consumption and GHG emissions using explainable artificial intelligence. Energy 2023, 262, 125468. [Google Scholar] [CrossRef]
- Shi, J.; Li, C.; Yan, X. Artificial intelligence for load forecasting: A stacking learning approach based on ensemble diversity regularization. Energy 2023, 262, 125295. [Google Scholar] [CrossRef]
- Lu, Y.; Meng, L. A simplified prediction model for energy use of air conditioner in residential buildings based on monitoring data from the cloud platform. Sustain. Cities Soc. 2020, 60, 102194. [Google Scholar]
- Wang, R.; Lu, S.; Li, Q. Multi-criteria comprehensive study on predictive algorithm of hourly heating energy consumption for residential buildings. Sustain. Cities Soc. 2019, 49, 101623. [Google Scholar] [CrossRef]
- Seyedzadeh, S.; Pour Rahimian, F.; Rastogi, P.; Glesk, I. Tuning machine learning models for prediction of building energy loads. Sustain. Cities Soc. 2019, 47, 101484. [Google Scholar] [CrossRef]
- Kumar, S.; Pal, S.K.; Singh, R.P. A novel method based on extreme learning machine to predict heating and cooling load through design and structural attributes. Energy Build. 2018, 176, 275–286. [Google Scholar] [CrossRef]
- Kaggle Competitions. Available online: https://www.kaggle.com/competitions (accessed on 12 January 2023).
- Quinto, B. Next-Generation Machine Learning with Spark: Covers XGBoost, LightGBM, Spark NLP, Distributed Deep Learning with Keras, and More; Apress/Springer Science+Business Media: New York, NY, USA, 2020. [Google Scholar]
- Get Started with XGBoost. Available online: https://xgboost.readthedocs.io/en/latest/get_started.html (accessed on 12 January 2023).
- Chen, Y.; Guo, M.; Chen, Z.; Chen, Z.; Ji, Y. Physical energy and data-driven models in building energy prediction: A review. Energy Rep. 2022, 16, 2656–2671. [Google Scholar] [CrossRef]
- Dudek, G. Short-Term Load Forecasting Using Random Forests. Adv. Intell. Syst. Comput. 2015, 323, 821–828. [Google Scholar]
- Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
- Zang, H.; Xu, R.; Cheng, L.; Ding, T.; Liu, L.; Wei, Z.; Sun, G. Residential load forecasting based on LSTM fusing self-attention mechanism with pooling. Energy 2021, 229, 120682. [Google Scholar] [CrossRef]
- Zhang, L.; Wen, J. Active learning strategy for high fidelity short-term data-driven building energy forecasting. Energy Build. 2021, 244, 111026. [Google Scholar] [CrossRef]
- Hu, Y.; Cheng, X.; Wang, S.; Chen, J.; Zhao, T.; Dai, E. Times series forecasting for urban building energy consumption based on graph convolutional network. Appl. Energy 2022, 307, 118231. [Google Scholar] [CrossRef]
- Olu-Ajayi, R.; Alaka, H.; Sulaimon, I.; Sunmola, F.; Ajayi, S. Building energy consumption prediction for residential buildings using deep learning and other machine learning techniques. J. Build. Eng. 2022, 45, 103406. [Google Scholar] [CrossRef]
- Li, Y.; Tong, Z.; Tong, S.; Westerdahl, D. A data-driven interval forecasting model for building energy prediction using attention-based LSTM and fuzzy information granulation. Sustain. Cities Soc. 2022, 76, 103481. [Google Scholar] [CrossRef]
- Do, H.; Cetin, K.S. Evaluation of the causes and impact of outliers on residential building energy use prediction using inverse modeling. Build. Environ. 2018, 138, 194–206. [Google Scholar] [CrossRef]
- Guo, Y.; Wang, J.; Chen, H.; Li, G.; Liu, J.; Xu, C.; Huang, R.; Huang, Y. Machine learning-based thermal response time ahead energy demand prediction for building heating systems. Appl. Energy 2018, 221, 16–27. [Google Scholar] [CrossRef]
- Wang, Z.; Srinivasan, R.S. A review of artificial intelligence based building energy use prediction: Contrasting the capabilities of single and ensemble prediction models. Renew. Sustain. Energy Rev. 2017, 75, 796–808. [Google Scholar] [CrossRef]
- Commercial Prototype Building Models. Available online: https://www.energycodes.gov/development/commercial/prototype_models (accessed on 12 January 2023).
- Eisenhower, B.; O’Neill, Z.; Fonoberov, V.A.; Mezić, I. Uncertainty and sensitivity decomposition of building energy models. J. Build. Perform. Simul. 2012, 5, 171–184. [Google Scholar]
- Key-Inputs-Setting-and-Energy-Data. Available online: https://github.com/Bob05757/Key-inputs-setting-and-Energy-Data (accessed on 12 January 2023).
- Li, Q.; Meng, Q.; Cai, J.; Yoshino, H.; Mochida, A. Applying support vector machine to predict hourly cooling load in the building. Appl. Energy 2009, 86, 2249–2256. [Google Scholar] [CrossRef]
- Leung, M.C.; Tse, N.C.F.; Lai, L.L.; Chow, T.T. The use of occupancy space electrical power demand in building cooling load prediction. Energy Build. 2012, 55, 151–163. [Google Scholar] [CrossRef]
- Luo, X.J.; Oyedele, L.O.; Ajayi, A.O.; Monyei, C.G.; Akinade, O.O.; Akanbi, L.A. Development of an IoT-based big data platform for day-ahead prediction of building heating and cooling demands. Adv. Eng. Inform. 2019, 41, 100926. [Google Scholar] [CrossRef]
- Ke, G.; Meng, Q.; Finley, T.; Wang, T.; Chen, W.; Ma, W.; Ye, Q.; Liu, T.-Y. LightGBM: A Highly Efficient Gradient Boosting Decision Tree. In Proceedings of the 31st Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017. [Google Scholar]
- Distributed Machine Learning Toolkit-Big Data, Big Model, Flexibility, Efficiency. Available online: http://www.dmtk.io/ (accessed on 12 January 2023).
- Jin, R.M.; Agrawal, G. Communication and Memory Efficient Parallel Decision Tree Construction. In Proceedings of the 2003 SIAM International Conference on Data Mining, San Francisco, CA, USA, 1–3 May 2003; pp. 119–129. [Google Scholar]
- Moon, J.; Kim, Y.; Son, M.; Hwang, E. Hybrid Short-Term Load Forecasting Scheme Using Random Forest and Multilayer Perceptron. Energies 2018, 11, 3283. [Google Scholar] [CrossRef] [Green Version]
- Ahmad, M.W.; Mourshed, M.; Rezgui, Y. Trees vs Neurons: Comparison between random forest and ANN for high-resolution prediction of building energy consumption. Energy Build. 2017, 147, 77–89. [Google Scholar] [CrossRef]
- Ahmad, M.W.; Reynolds, J.; Rezgui, Y. Predictive modelling for solar thermal energy systems: A comparison of support vector regression, random forest, extra trees and regression trees. J. Clean. Prod. 2018, 203, 810–821. [Google Scholar] [CrossRef]
- Shi, Y.; Song, X.; Song, G. Productivity prediction of a multilateral-well geothermal system based on a long short-term memory and multi-layer perceptron combinational neural network. Appl. Energy 2021, 282, 116046. [Google Scholar] [CrossRef]
- Ding, Y.; Zhang, Q.; Yuan, T.; Yang, K. Model input selection for building heating load prediction: A case study for an office building in Tianjin. Energy Build. 2018, 159, 254–270. [Google Scholar] [CrossRef]
- American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE). Measurement of Energy, Demand, and Water Savings; ASHRAE Guideline 14-2014; ASHRAE: Atlanta, GA, USA, 2014; pp. 1–150. [Google Scholar]
- Geron, A. Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems; O’Reilly Media: Sebastopol, CA, USA, 2017. [Google Scholar]
- Fan, C.; Xiao, F.; Zhao, Y. A short-term building cooling load prediction method using deep learning algorithms. Appl. Energy 2017, 195, 222–233. [Google Scholar] [CrossRef]
Reference | Input Features | Data-Driven Model | Prediction Results |
---|---|---|---|
[13] | Day of week; Hour of day; Holiday; Outdoor dry bulb temperature; Outdoor relative humidity | XGBoost; RF; SVR; LSTM | CVRMSE: XGBoost 21.1%; RF 23.7%; SVR 25.0%; LSTM 20.2% |
[21] | Total gross floor area; Year of build; Building height; Shape form factor; Vertical-to-horizontal ratio; Length of the building; Width of the building; Building morphology | LightGBM; XGBoost; RF; SVR | R2: LightGBM 0.8608; XGBoost 0.8137; RF 0.7959; SVR 0.7363 |
[22] | Historical load; Weather data; Calendar rules | LightGBM; XGBoost; SVM; RF | Stacking method; XGBoost and LightGBM obtained the highest accuracy |
[23] | Outdoor dry bulb temperature; Schedules | XGBoost; RF; SVR; ANN | CVRMSE: XGBoost 62%; RF 64%; SVR 64%; ANN 73% |
[24] | Outdoor dry bulb temperature; Outdoor relative humidity; Wind speed; Solar radiation; Hour of day | XGBoost; RF; SVR; ANN | CVRMSE: XGBoost 4.5%; RF 4.6%; SVR 5.5%; ANN 5.1% |
[25] | Relative compactness; Surface area; Wall area; Roof area; Number of floors; Orientation; Glazing area; Outdoor dry bulb temperature; Outdoor relative humidity; Solar radiation | ANN; SVM; RF; XGBoost | R2: XGBoost 0.998; RF 0.973; SVR 0.972; ANN 0.968 |
[26] | Aspect ratio; Relative compactness; Glazing area; Roof area; Surface area; Wall area; Orientation; Number of floors; Glazing area | ANN; SVR; RF | MAE (kW): ANN 1.15; SVR 0.90; RF 1.45 |
Item | Small Office | Medium Office | Large Office |
---|---|---|---|
Geometry | (rendering not reproduced) | (rendering not reproduced) | (rendering not reproduced) |
Total Floor Area | 511 m2 | 4980 m2 | 46,321 m2 |
Exterior Wall Type | Wood frame walls | Steel frame walls | Mass walls |
Roof Type | Attic roof with wood joint | Built-up roof | Built-up roof |
Heating Type | Air source heat pump with gas furnace as backup | Gas furnace | One gas-fired boiler |
Cooling Type | Air source heat pump | Packaged air conditioning | Water direct expansion cooling coil; two water-cooled centrifugal chillers |
HVAC Operation Schedule | Weekdays: 6:00 am–7:00 pm | Weekdays: 6:00 am–10:00 pm; Saturdays: 6:00 am–6:00 pm | Weekdays: 6:00 am–10:00 pm; Saturdays: 6:00 am–6:00 pm |
No. | Input Feature Variables | Unit | Range (Small Office) | Range (Medium Office) | Range (Large Office) |
---|---|---|---|---|---|
1 | Dry Bulb | °C | [−32.8, 37.0] | [−32.8, 37.0] | [−32.8, 37.0] |
2 | Relative Humidity | % | [4, 100] | [4, 100] | [4, 100] |
3 | Global Horizontal Radiation | Wh/m2 | [0, 964] | [0, 964] | [0, 964] |
4 | Wind Speed | m/s | [0, 14.9] | [0, 14.9] | [0, 14.9] |
5 | Total Floor Area | m2 | [409, 613] | [3986, 5979] | [37,056, 55,584] |
6 | Aspect Ratio | - | [1.2, 1.8] | [1.2, 1.8] | [1.2, 1.8] |
7 | Window-to-Wall Ratio | - | [0.16, 0.24] | [0.26, 0.40] | [0.32, 0.48] |
8 | Floor Height | m | [2.44, 3.66] | [2.19, 3.29] | [2.20, 3.29] |
9 | Exterior Wall Insulation R-value | (m2·K)/W | [2.46, 3.68] | [2.25, 3.38] | [1.31, 1.97] |
10 | Roof Insulation R-value | (m2·K)/W | [6.48, 9.72] | [4.25, 6.37] | [4.25, 6.37] |
11 | Specific Heat for Internal Thermal Mass | J/(kg·K) | [968, 1452] | [968, 1452] | [968, 1452] |
12 | Cooling Temperature Set Point | °C | [22.78, 25.00] | [22.89, 25.11] | [22.89, 25.11] |
13 | Heating Temperature Set Point | °C | [20.00, 22.22] | [19.89, 22.11] | [19.89, 22.11] |
14 | Fresh Air Volume | m3/(s·m2) | [0.000345, 0.000518] | [0.000345, 0.000518] | [0.000345, 0.000518] |
15 | People Density | m2/person | [13.27, 19.91] | [14.86, 22.29] | [14.86, 22.29] |
16 | Lighting Power Density | W/m2 | [6.80, 10.20] | [6.80, 10.20] | [6.80, 10.20] |
17 | Electric Equipment Power Density | W/m2 | [5.42, 8.14] | [6.46, 9.68] | [6.46, 9.68] |
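The ranges above define the space from which input parameter values are drawn when generating training cases from the seed models. The sketch below assumes simple uniform sampling over a subset of the medium-office ranges (the paper's actual sampling scheme is described in Section 2.1); the parameter names are illustrative placeholders for the values written into each simulation input file.

```python
# Minimal sketch (assumption: uniform sampling; not the authors' workflow):
# drawing parameter sets within a subset of the medium-office ranges above
# to drive batch simulations of the seed model.
import numpy as np

MEDIUM_OFFICE_RANGES = {
    "total_floor_area_m2": (3986.0, 5979.0),
    "aspect_ratio": (1.2, 1.8),
    "window_to_wall_ratio": (0.26, 0.40),
    "floor_height_m": (2.19, 3.29),
    "wall_r_value_m2K_per_W": (2.25, 3.38),
    "cooling_setpoint_C": (22.89, 25.11),
    "heating_setpoint_C": (19.89, 22.11),
}

def sample_case(ranges, rng):
    """Draw one parameter set to be written into a seed-model input file."""
    return {name: float(rng.uniform(low, high))
            for name, (low, high) in ranges.items()}

rng = np.random.default_rng(seed=0)
cases = [sample_case(MEDIUM_OFFICE_RANGES, rng) for _ in range(5)]
for case in cases:
    print(case)   # each dict corresponds to one simulation run
```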
Scenarios | Scenario 1: Six Input Features | Scenario 2: Nine Input Features | Scenario 3: Fifteen Input Features |
---|---|---|---|
Description | Weather condition | Weather condition and operational information | Weather condition, operational information, and physical parameters |
Input features | Hour of the day; Historical load data; Dry bulb temperature (°C); Relative humidity (%); Global horizontal radiation (W·h/m2); Wind speed (m/s) | All Scenario 1 features, plus: Cooling temperature set point (°C); Heating temperature set point (°C); Fresh air volume [m3/(s·m2)] | All Scenario 2 features, plus: R-value of wall [(m2·K)/W]; Internal mass (average specific heat of the walls) [J/(kg·K)]; Window-to-wall ratio; Floor height (m); Shape coefficient (1/m); Aspect ratio |
Scenario 1 | Long Short-Term Memory (LSTM) (with) | LSTM (without) | LightGBM (with) | LightGBM (without) | Random Forest (RF) (with) | RF (without) |
---|---|---|---|---|---|---|
Root mean square error (RMSE) | 1.07 | 2.26 | 0.29 | 0.71 | 0.76 | 1.02 |
Coefficient of variation of RMSE (CVRMSE) | 26.15% | 55.32% | 7.14% | 17.40% | 18.59% | 24.85% |
R2 | 0.896968 | 0.538813 | 0.992327 | 0.954380 | 0.947891 | 0.906964 |
Computation Time (s) | 716.5 | 719.4 | 5.4 | 6.1 | 19.1 | 10.7 |
Scenario 2 | LSTM (with) | LSTM (without) | LightGBM (with) | LightGBM (without) | RF (with) | RF (without) |
---|---|---|---|---|---|---|
RMSE | 1.04 | 1.83 | 0.25 | 0.57 | 0.76 | 1.01 |
CVRMSE | 25.42% | 44.79% | 6.04% | 13.94% | 18.58% | 24.61% |
R2 | 0.902582 | 0.697689 | 0.994506 | 0.970706 | 0.947979 | 0.908734 |
Computation Time (s) | 765.0 | 780.0 | 6.5 | 5.7 | 29.2 | 22.6 |
Scenario 3 | LSTM (with) | LSTM (without) | LightGBM (with) | LightGBM (without) | RF (with) | RF (without) |
---|---|---|---|---|---|---|
RMSE | 0.90 | 1.63 | 0.21 | 0.3 | 0.76 | 0.98 |
CVRMSE | 22.06% | 39.75% | 5.25% | 7.31% | 18.54% | 23.88% |
R2 | 0.926666 | 0.761820 | 0.995854 | 0.991937 | 0.948189 | 0.914036 |
Computation Time (s) | 756.8 | 751.4 | 7.0 | 7.4 | 44.6 | 36.5 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).