Time Series Anomaly Detection Using Transformer-Based GAN With Two-Step Masking
Time series anomaly detection is the task of determining whether an unseen signal is normal or abnormal, and it is a crucial function in various real-world applications. A typical approach is to learn a representation of normal data using a generative model, such as a Generative Adversarial Network (GAN), and to use it to discriminate between normal and abnormal signals. Recently, a few studies have adopted the Transformer to model time series data, but there has been no pure Transformer-based GAN framework for time series anomaly detection. As a pioneering work, we propose a new pure Transformer-based GAN framework, called AnoFormer, together with an effective training strategy for better representation learning. Specifically, we improve the detection ability of our model by introducing a two-step masking strategy. The first step is Random masking: we design a random mask pool to hide parts of the signal randomly, which allows our model to learn the representation of normal data. The second step is Exclusive and Entropy-based Re-masking: we propose a novel refinement step that provides feedback to accurately model the parts that were exclusive to, or uncertain after, the first step. We empirically demonstrate that the re-masking step robustly generates more normal-like signals. Extensive experiments on various datasets show that AnoFormer significantly outperforms state-of-the-art methods in time series anomaly detection.
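The core idea summarized in the abstract, train a generative model on normal data only and flag signals it reconstructs poorly, can be illustrated with a minimal reconstruction-error sketch. The function names, threshold, and toy data below are illustrative assumptions, not the paper's actual method:

```python
import numpy as np

def anomaly_score(signal, reconstruction):
    # Point-wise reconstruction error: a generator trained only on
    # normal data should reconstruct normal signals well, so large
    # errors indicate likely anomalies.
    return np.abs(signal - reconstruction)

def detect(signal, reconstruction, threshold):
    # Flag every time step whose error exceeds the threshold.
    return anomaly_score(signal, reconstruction) > threshold

# Toy example: a clean sine wave stands in for the model's "normal"
# reconstruction, and the observed signal carries one injected spike.
t = np.linspace(0, 2 * np.pi, 100)
normal = np.sin(t)
observed = normal.copy()
observed[40] += 5.0  # injected anomaly
flags = detect(observed, normal, threshold=1.0)
```

In AnoFormer the reconstruction comes from a Transformer-based generator rather than a clean reference signal, but the detection principle (thresholding a reconstruction error) is the same.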
| Main Authors: | Ah-Hyung Shin, Seong Tae Kim, Gyeong-Moon Park |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2023-01-01 |
| Series: | IEEE Access |
| Subjects: | Anomaly detection; masking; self-attention; signal reconstruction; transformer; time series analysis |
| Online Access: | https://ieeexplore.ieee.org/document/10164104/ |
| _version_ | 1849233919278317568 |
|---|---|
| author | Ah-Hyung Shin; Seong Tae Kim; Gyeong-Moon Park |
| author_facet | Ah-Hyung Shin; Seong Tae Kim; Gyeong-Moon Park |
| author_sort | Ah-Hyung Shin |
| collection | DOAJ |
| description | Time series anomaly detection is the task of determining whether an unseen signal is normal or abnormal, and it is a crucial function in various real-world applications. A typical approach is to learn a representation of normal data using a generative model, such as a Generative Adversarial Network (GAN), and to use it to discriminate between normal and abnormal signals. Recently, a few studies have adopted the Transformer to model time series data, but there has been no pure Transformer-based GAN framework for time series anomaly detection. As a pioneering work, we propose a new pure Transformer-based GAN framework, called AnoFormer, together with an effective training strategy for better representation learning. Specifically, we improve the detection ability of our model by introducing a two-step masking strategy. The first step is Random masking: we design a random mask pool to hide parts of the signal randomly, which allows our model to learn the representation of normal data. The second step is Exclusive and Entropy-based Re-masking: we propose a novel refinement step that provides feedback to accurately model the parts that were exclusive to, or uncertain after, the first step. We empirically demonstrate that the re-masking step robustly generates more normal-like signals. Extensive experiments on various datasets show that AnoFormer significantly outperforms state-of-the-art methods in time series anomaly detection. |
| format | Article |
| id | doaj-art-a4acb3878c2e49d9b998250956f09001 |
| institution | Kabale University |
| issn | 2169-3536 |
| language | English |
| publishDate | 2023-01-01 |
| publisher | IEEE |
| record_format | Article |
| series | IEEE Access |
| spelling | Record doaj-art-a4acb3878c2e49d9b998250956f09001 (updated 2025-08-20T04:03:21Z). English. IEEE, IEEE Access, ISSN 2169-3536, 2023-01-01, vol. 11, pp. 74035-74047. DOI: 10.1109/ACCESS.2023.3289921, IEEE article 10164104. Title: Time Series Anomaly Detection Using Transformer-Based GAN With Two-Step Masking. Authors: Ah-Hyung Shin (https://orcid.org/0000-0003-1015-6167), Seong Tae Kim (https://orcid.org/0000-0002-2132-6021), and Gyeong-Moon Park (https://orcid.org/0000-0003-4011-9981), all with the Department of Computer Science and Engineering, Kyung Hee University, Yongin-si, South Korea. Abstract and subject keywords as in the description and topic fields; online access: https://ieeexplore.ieee.org/document/10164104/ |
| spellingShingle | Ah-Hyung Shin; Seong Tae Kim; Gyeong-Moon Park; Time Series Anomaly Detection Using Transformer-Based GAN With Two-Step Masking; IEEE Access; Anomaly detection; masking; self-attention; signal reconstruction; transformer; time series analysis |
| title | Time Series Anomaly Detection Using Transformer-Based GAN With Two-Step Masking |
| title_full | Time Series Anomaly Detection Using Transformer-Based GAN With Two-Step Masking |
| title_fullStr | Time Series Anomaly Detection Using Transformer-Based GAN With Two-Step Masking |
| title_full_unstemmed | Time Series Anomaly Detection Using Transformer-Based GAN With Two-Step Masking |
| title_short | Time Series Anomaly Detection Using Transformer-Based GAN With Two-Step Masking |
| title_sort | time series anomaly detection using transformer based gan with two step masking |
| topic | Anomaly detection; masking; self-attention; signal reconstruction; transformer; time series analysis |
| url | https://ieeexplore.ieee.org/document/10164104/ |
| work_keys_str_mv | AT ahhyungshin timeseriesanomalydetectionusingtransformerbasedganwithtwostepmasking AT seongtaekim timeseriesanomalydetectionusingtransformerbasedganwithtwostepmasking AT gyeongmoonpark timeseriesanomalydetectionusingtransformerbasedganwithtwostepmasking |
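The two-step masking strategy described in the record's abstract (a random mask pool in step one, then entropy-based re-masking of the positions left visible in step one) can be sketched in miniature. This is a toy illustration under stated assumptions, not the authors' implementation: the mask ratio, pool size, and the use of per-position binary-prediction entropy are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def build_mask_pool(seq_len, pool_size=8, mask_ratio=0.25):
    # Step 1 (Random masking, sketched): a pool of random masks,
    # each hiding a fixed fraction of time steps (True = masked).
    n_masked = int(seq_len * mask_ratio)
    pool = np.zeros((pool_size, seq_len), dtype=bool)
    for i in range(pool_size):
        idx = rng.choice(seq_len, size=n_masked, replace=False)
        pool[i, idx] = True
    return pool

def entropy_remask(probs, first_mask, top_k=4):
    # Step 2 (Exclusive and Entropy-based Re-masking, sketched):
    # among positions NOT hidden in step one (exclusivity), re-mask
    # the top_k with the highest predictive entropy (uncertainty).
    # `probs` are hypothetical per-position prediction probabilities.
    eps = 1e-12
    entropy = -(probs * np.log(probs + eps)
                + (1 - probs) * np.log(1 - probs + eps))
    entropy = np.where(first_mask, -np.inf, entropy)  # exclude step-1 positions
    idx = np.argsort(entropy)[-top_k:]
    second_mask = np.zeros_like(first_mask)
    second_mask[idx] = True
    return second_mask
```

By construction the second mask is disjoint from the first, so the refinement pass revisits exactly the regions the model saw but was least certain about; in the paper this feedback is what drives the generator toward more normal-like reconstructions.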