Heuristic based federated learning with adaptive hyperparameter tuning for households energy prediction
| Main Authors: | , , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-04-01 |
| Series: | Scientific Reports |
| Online Access: | https://doi.org/10.1038/s41598-025-96443-3 |
| Summary: | Abstract Federated Learning is transforming electrical load forecasting by enabling Artificial Intelligence (AI) models to be trained directly on household edge devices. However, the prediction accuracy of federated learning models tends to diminish when dealing with non-IID data, highlighting the need for adaptive hyperparameter optimization strategies to improve performance. In this paper, we propose a novel hierarchical federated learning solution for efficient model aggregation and hyperparameter tuning, specifically tailored to household energy prediction. Households with similar energy profiles are clustered at the edge, linked, and aggregated at the fog level, enabling effective and adaptive hyperparameter tuning. The federated model aggregation is optimized using hierarchical simulated annealing to prioritize updates from the better-performing models. A genetic algorithm-based hyperparameter optimization method reduces the computational load on edge nodes by efficiently exploring different configurations and using only the most promising ones for edge nodes' cross-validation. The evaluation results demonstrate a significant improvement in average prediction accuracy and better capture of energy patterns compared to the federated averaging approach. The impact on network traffic among nodes across the different layers is kept below 30 KB. Additionally, hyperparameter tuning reduces the size of model updates and the number of communication rounds by 30%, which is particularly beneficial when network resources are limited. |
|---|---|
| ISSN: | 2045-2322 |
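The summary describes aggregation that prioritizes updates from better-performing client models. The abstract does not give the actual simulated-annealing schedule, so the sketch below is a hypothetical stand-in: it weights each household's model update by a softmax over negative validation losses, with a `temperature` parameter loosely analogous to an annealing temperature.

```python
import math

def weighted_aggregate(updates, losses, temperature=1.0):
    """Aggregate client model updates, favoring lower-loss clients.

    updates: list of parameter vectors (plain lists of floats), one per household
    losses:  validation loss of each client's local model
    Softmax over -loss/temperature: lower loss -> larger aggregation weight.
    (Illustrative only; the paper's simulated-annealing prioritization
    is not specified in this record.)
    """
    scores = [-l / temperature for l in losses]
    m = max(scores)                      # shift for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    dim = len(updates[0])
    return [sum(w * u[i] for w, u in zip(weights, updates)) for i in range(dim)]
```

With a much lower loss for the first client, the aggregate stays close to that client's parameters; with equal losses it reduces to plain federated averaging.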
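The summary also mentions a genetic algorithm that explores hyperparameter configurations and forwards only the most promising ones to edge nodes for cross-validation. A minimal GA of this general shape is sketched below; the search space, population size, and operators are all assumptions for illustration, not the paper's settings.

```python
import random

# Hypothetical search space; the paper's hyperparameters are not listed here.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 5e-4, 1e-3, 5e-3],
    "batch_size": [16, 32, 64, 128],
    "hidden_units": [32, 64, 128],
}

def random_config(rng):
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def crossover(a, b, rng):
    # Uniform crossover: each gene comes from either parent.
    return {k: (a if rng.random() < 0.5 else b)[k] for k in SEARCH_SPACE}

def mutate(cfg, rng, rate=0.2):
    # Re-sample each gene with probability `rate`.
    return {k: rng.choice(SEARCH_SPACE[k]) if rng.random() < rate else v
            for k, v in cfg.items()}

def evolve(fitness, generations=10, pop_size=8, seed=0):
    """Return the best configuration found; `fitness` maps config -> score
    (higher is better), e.g. negative cross-validation error reported
    back by the edge nodes."""
    rng = random.Random(seed)
    pop = [random_config(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]     # keep the most promising half
        children = [
            mutate(crossover(rng.choice(elite), rng.choice(elite), rng), rng)
            for _ in range(pop_size - len(elite))
        ]
        pop = elite + children
    return max(pop, key=fitness)
```

In the paper's setting, only the elite configurations would be sent to edge nodes for cross-validation, which is how the method keeps the edge-side computational load down.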