A federated LSTM network for load forecasting using multi-source data with homomorphic encryption

Bibliographic Details
Main Authors: Mengdi Wang, Rui Xin, Mingrui Xia, Zhifeng Zuo, Yinyin Ge, Pengfei Zhang, Hongxing Ye
Format: Article
Language: English
Published: AIMS Press 2025-03-01
Series: AIMS Energy
Subjects:
Online Access: https://www.aimspress.com/article/doi/10.3934/energy.2025011
Description
Summary: Short-term load forecasting is of great significance to the operation of power systems. Various uncertain factors, such as meteorological and social data, have already been combined with historical power data to build more accurate load forecasting models. In traditional systems, data from various industries and regions are centralized for knowledge extraction. However, concerns about data security and privacy often prevent industries from sharing their data, limiting both the quantity and diversity of data available to forecasting models. These challenges motivate the adoption of federated learning (FL) to address data silos and privacy concerns. In this paper, a novel framework for short-term load forecasting was proposed using historical data from industries such as power, meteorology, and finance. Long short-term memory (LSTM) networks were used for forecasting, and FL was employed to protect data privacy: FL allows clients in multiple regions to collaboratively train a shared model without exposing their raw data. To further enhance security, homomorphic encryption (HE) based on the Paillier algorithm was introduced into the federated aggregation process. Experimental results demonstrate that the federated model, which extracts knowledge from different regions, outperforms locally trained models. Furthermore, longer HE keys have little effect on predictive performance but significantly slow down encryption and decryption, thereby increasing training time.
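The secure aggregation described in the abstract relies on the additive homomorphism of the Paillier cryptosystem: the product of two ciphertexts decrypts to the sum of the plaintexts, so a server can sum clients' encrypted model updates without ever seeing them. The sketch below illustrates that property only; it uses toy key sizes and hypothetical quantized weight updates, not the paper's actual implementation (real deployments use 1024–2048-bit keys and fixed-point encoding of LSTM weights).

```python
import random
from math import gcd


def keygen(p=1009, q=1013):
    # Toy primes for illustration only; real Paillier keys use
    # primes of 1024+ bits.
    n = p * q
    g = n + 1                              # standard simple generator choice
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    mu = pow(lam, -1, n)                   # valid because g = n + 1
    return (n, g), (lam, mu)


def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    # Enc(m) = g^m * r^n mod n^2
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)


def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    # L(c^lam mod n^2) * mu mod n, with L(x) = (x - 1) / n
    x = pow(c, lam, n * n)
    return ((x - 1) // n * mu) % n


def he_add(pk, c1, c2):
    # Additive homomorphism: Dec(c1 * c2 mod n^2) = m1 + m2
    n, _ = pk
    return (c1 * c2) % (n * n)


pk, sk = keygen()
# Two clients encrypt quantized weight updates (e.g., floats scaled by 1e4);
# the values 1234 and 5678 are hypothetical.
c1 = encrypt(pk, 1234)
c2 = encrypt(pk, 5678)
agg = he_add(pk, c1, c2)       # server aggregates without decryption keys
print(decrypt(pk, sk, agg))    # → 6912, the sum of the plaintext updates
```

In a federated round, each client would encrypt its LSTM weight update this way, the server would multiply the ciphertexts element-wise to aggregate them, and only the key holder would decrypt the summed update before averaging. The abstract's timing observation follows directly: the modular exponentiations in `encrypt`/`decrypt` scale with key length, while the decrypted sum, and hence predictive accuracy, is unchanged.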
ISSN: 2333-8334