Constructing Attention-LSTM-VAE Power Load Model Based on Multiple Features
The growing complexity of modern power systems and their susceptibility to external weather influences make it challenging to build an accurate load model. This paper proposes a variational autoencoder (VAE) long short-term memory (LSTM) load model based on the attention mechanism (Attention). First, th...
Main Authors: | Chaoyue Ma, Ying Wang, Feng Li, Huiyan Zhang, Yong Zhang, Haiyan Zhang |
---|---|
Format: | Article |
Language: | English |
Published: | Wiley, 2024-01-01 |
Series: | Advances in Mathematical Physics |
Online Access: | http://dx.doi.org/10.1155/2024/1041791 |
_version_ | 1832559509416968192 |
---|---|
author | Chaoyue Ma Ying Wang Feng Li Huiyan Zhang Yong Zhang Haiyan Zhang |
author_facet | Chaoyue Ma Ying Wang Feng Li Huiyan Zhang Yong Zhang Haiyan Zhang |
author_sort | Chaoyue Ma |
collection | DOAJ |
description | The growing complexity of modern power systems and their susceptibility to external weather influences make it challenging to build an accurate load model. This paper proposes a variational autoencoder (VAE) long short-term memory (LSTM) load model based on the attention mechanism (Attention). First, the Prophet data decomposition method is used to decompose long sequences of load data at multiple time scales. Second, the correlation-based feature selection with maximum information coefficient (CFS-MIC) method is employed to select weather features by relevance, and a subset of features with high correlation and low redundancy is chosen as model input. Finally, the Attention-LSTM-VAE model is constructed to capture the temporal variation patterns of the load. The dataset includes 2 years of load values and weather data collected in Caojiaping, Hunan Province, China. Compared with other deep learning methods, the Attention-LSTM-VAE model achieves the lowest mean absolute error of 0.0374 and the highest R-squared value of 0.9714, verifying its accuracy. The Attention-LSTM-VAE model therefore offers better robustness, stability, and accuracy than general deep learning load models, providing an important reference for research on power load models. |
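The two evaluation metrics cited in the abstract (mean absolute error and R-squared) follow their standard definitions; a minimal sketch in plain Python, where the sample arrays are illustrative values only and not the paper's Caojiaping dataset:

```python
# Standard definitions of the two metrics reported for the Attention-LSTM-VAE
# model: mean absolute error (MAE) and the coefficient of determination (R^2).
# The y_true / y_pred values below are made-up illustrative numbers.

def mae(y_true, y_pred):
    """Mean absolute error: average magnitude of the residuals."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r_squared(y_true, y_pred):
    """R-squared: 1 - SS_res / SS_tot, where SS_tot is variance about the mean."""
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

y_true = [0.52, 0.61, 0.58, 0.70, 0.66]  # normalized load, illustrative
y_pred = [0.50, 0.63, 0.57, 0.68, 0.67]  # model output, illustrative

print(round(mae(y_true, y_pred), 4))        # → 0.016
print(round(r_squared(y_true, y_pred), 4))  # → 0.9283
```

A lower MAE and an R-squared closer to 1 both indicate a closer fit, which is the sense in which the paper's 0.0374 / 0.9714 figures rank the model above the baselines.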
format | Article |
id | doaj-art-b49d8c9291ed43ecada521905f6aeae8 |
institution | Kabale University |
issn | 1687-9139 |
language | English |
publishDate | 2024-01-01 |
publisher | Wiley |
record_format | Article |
series | Advances in Mathematical Physics |
spelling | doaj-art-b49d8c9291ed43ecada521905f6aeae82025-02-03T01:29:51ZengWileyAdvances in Mathematical Physics1687-91392024-01-01202410.1155/2024/1041791Constructing Attention-LSTM-VAE Power Load Model Based on Multiple FeaturesChaoyue Ma0Ying Wang1Feng Li2Huiyan Zhang3Yong Zhang4Haiyan Zhang5School of Computer and Artificial IntelligenceCentral China Branch of State Grid Corporation of ChinaCentral China Branch of State Grid Corporation of ChinaSchool of Computer and Artificial IntelligenceSchool of Computer and Artificial IntelligenceSchool of Computer and Artificial Intelligence The growing complexity of modern power systems and their susceptibility to external weather influences make it challenging to build an accurate load model. This paper proposes a variational autoencoder (VAE) long short-term memory (LSTM) load model based on the attention mechanism (Attention). First, the Prophet data decomposition method is used to decompose long sequences of load data at multiple time scales. Second, the correlation-based feature selection with maximum information coefficient (CFS-MIC) method is employed to select weather features by relevance, and a subset of features with high correlation and low redundancy is chosen as model input. Finally, the Attention-LSTM-VAE model is constructed to capture the temporal variation patterns of the load. The dataset includes 2 years of load values and weather data collected in Caojiaping, Hunan Province, China. Compared with other deep learning methods, the Attention-LSTM-VAE model achieves the lowest mean absolute error of 0.0374 and the highest R-squared value of 0.9714, verifying its accuracy. The Attention-LSTM-VAE model therefore offers better robustness, stability, and accuracy than general deep learning load models, providing an important reference for research on power load models. http://dx.doi.org/10.1155/2024/1041791 |
spellingShingle | Chaoyue Ma Ying Wang Feng Li Huiyan Zhang Yong Zhang Haiyan Zhang Constructing Attention-LSTM-VAE Power Load Model Based on Multiple Features Advances in Mathematical Physics |
title | Constructing Attention-LSTM-VAE Power Load Model Based on Multiple Features |
title_full | Constructing Attention-LSTM-VAE Power Load Model Based on Multiple Features |
title_fullStr | Constructing Attention-LSTM-VAE Power Load Model Based on Multiple Features |
title_full_unstemmed | Constructing Attention-LSTM-VAE Power Load Model Based on Multiple Features |
title_short | Constructing Attention-LSTM-VAE Power Load Model Based on Multiple Features |
title_sort | constructing attention lstm vae power load model based on multiple features |
url | http://dx.doi.org/10.1155/2024/1041791 |
work_keys_str_mv | AT chaoyuema constructingattentionlstmvaepowerloadmodelbasedonmultiplefeatures AT yingwang constructingattentionlstmvaepowerloadmodelbasedonmultiplefeatures AT fengli constructingattentionlstmvaepowerloadmodelbasedonmultiplefeatures AT huiyanzhang constructingattentionlstmvaepowerloadmodelbasedonmultiplefeatures AT yongzhang constructingattentionlstmvaepowerloadmodelbasedonmultiplefeatures AT haiyanzhang constructingattentionlstmvaepowerloadmodelbasedonmultiplefeatures |