Enhanced household energy consumption forecasting using multivariate long short-term memory (LSTM) networks with weather data integration
| Main Authors: | , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Elsevier, 2025-09-01 |
| Series: | Results in Engineering |
| Subjects: | |
| Online Access: | http://www.sciencedirect.com/science/article/pii/S2590123025025812 |
| Summary: | Forecasting household energy consumption is crucial for sustainable energy management and grid stability, yet traditional models often struggle with its complex data characteristics. This study introduces a TinyML-optimized multivariate LSTM model designed for resource-constrained environments, leveraging self-supervised learning to enhance predictive accuracy with minimal computational overhead. By integrating environmental factors such as weather, the model offers a scalable solution for smart home applications, achieving significant improvements in key metrics while operating within the memory and power limits typical of edge computing devices. The research contributes a structured data preprocessing pipeline that includes normalization, day resampling, and advanced feature engineering to prepare the input for the LSTM network optimally, and it pioneers the use of Self-Supervised Learning (SSL) to preprocess and enrich weather-related features from remote sensing data. Empirical results demonstrate a considerable improvement over traditional forecasting models: the proposed model achieves a Mean Squared Error (MSE) of 0.02063, a Root Mean Squared Error (RMSE) of 0.14363, a Mean Absolute Error (MAE) of 0.107, a Mean Absolute Percentage Error (MAPE) of 0.155, and a Coefficient of Determination (R²) of 0.724. By uniquely combining SSL with a multivariate LSTM, the work improves feature extraction and alleviates limitations observed in current hybrid models, most notably in handling intricate temporal dependencies and reducing computational overhead in resource-limited settings. |
| ISSN: | 2590-1230 |
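The summary names two concrete preprocessing steps (normalization and day resampling) and five evaluation metrics. The sketch below illustrates those steps on synthetic data, since the record does not specify the paper's dataset or exact formulas; the hourly series, the min-max scaling choice, and the toy prediction are all illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
import pandas as pd

# Synthetic hourly consumption series standing in for the household data
# (the actual dataset is not identified in this record).
rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=24 * 60, freq="h")
consumption = pd.Series(rng.gamma(2.0, 0.5, len(idx)), index=idx)

# Day resampling: aggregate hourly readings to daily means.
daily = consumption.resample("D").mean()

# Normalization: min-max scaling to [0, 1], a common choice before
# feeding values to an LSTM (the paper's exact scheme is not stated).
normalized = (daily - daily.min()) / (daily.max() - daily.min())

# The metrics quoted in the abstract, computed here on a toy prediction
# (true values vs. a noisy copy) purely to show the formulas.
y_true = normalized.to_numpy()
y_pred = y_true + rng.normal(0, 0.05, len(y_true))

mse = np.mean((y_true - y_pred) ** 2)
rmse = np.sqrt(mse)
mae = np.mean(np.abs(y_true - y_pred))
mape = np.mean(np.abs((y_true - y_pred) / np.clip(y_true, 1e-8, None)))
r2 = 1 - np.sum((y_true - y_pred) ** 2) / np.sum((y_true - y_true.mean()) ** 2)
```

Daily resampling reduces 1440 hourly points to 60 daily ones, which is what lets a small LSTM of the kind targeted at TinyML devices see longer horizons without a longer input window.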