2025 (English) Conference paper, Published paper (Refereed)
Abstract [en]
Accurate short-term load forecasting is essential for energy-efficient building operations; however, many buildings lack sufficient historical data to train reliable predictive models. This paper investigates instance-based transfer learning to improve forecasting performance in data-scarce buildings by reusing models trained on similar, data-rich buildings. We cluster buildings of various types by their energy consumption profiles using dynamic time warping and hierarchical clustering. Within each cluster, we fine-tune XGBoost and LSTM models pretrained on source buildings (with sufficient data) to forecast consumption in target buildings (with limited data), giving the target models access to an increasing share of their own limited data: 7, 14, 21, 28, ..., 245 days. We compare this against models trained from scratch on the same limited data. Our results show that transfer learning significantly improves forecasting accuracy in highly data-constrained settings, reducing RMSE by up to 40%. As more training data becomes available, however, the benefit diminishes, and the from-scratch baselines eventually outperform the transferred models beyond an average threshold of 140–168 days.
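The clustering step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the exact DTW variant, linkage method ("average" here), and cluster count are assumptions, and the function names are hypothetical.

```python
from itertools import combinations
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage


def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]


def cluster_profiles(profiles, n_clusters):
    """Group consumption profiles by pairwise DTW distance.

    Returns one cluster label (1..n_clusters) per profile.
    """
    k = len(profiles)
    # condensed pairwise distance vector, as expected by scipy's linkage()
    dists = [dtw_distance(profiles[i], profiles[j])
             for i, j in combinations(range(k), 2)]
    Z = linkage(dists, method="average")  # agglomerative hierarchical clustering
    return fcluster(Z, t=n_clusters, criterion="maxclust")
```

With toy daily profiles, buildings with similar load shapes end up in the same cluster, after which source/target pairing happens within each cluster:

```python
profiles = [[0, 0, 0, 0], [0, 0, 1, 0], [5, 5, 5, 5], [5, 5, 6, 5]]
labels = cluster_profiles(profiles, n_clusters=2)
# the two low-load and the two high-load profiles form separate clusters
```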
Keywords
Transfer Learning, Load Forecasting, Energy Consumption Prediction, Data Scarcity, Cold Start
National Category
Computer and Information Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kau:diva-106761 (URN)
Conference
IEEE PES Innovative Smart Grid Technologies Europe (ISGT EUROPE) 2025, October 20th–23rd, 2025
Available from: 2025-09-03 Created: 2025-09-03 Last updated: 2025-10-16 Bibliographically approved