Local-Global and Multi-Scale (LG-MS) Mixer Architecture for Long-Term Time Series Forecasting
Although deep learning models dominate time series forecasting, they still struggle with long-sequence processing due to the challenges of extracting dynamic fluctuations and pattern features as input length increases. To address this challenge, we propose a framework, LG-MSMixer...
Main Authors: | Zhennan Peng, Boyong Gao, Ziqi Xia, Jie Liu |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2025-01-01 |
Series: | IEEE Access |
Subjects: | Deep learning; long-term time series forecasting; information extraction; local-global; multi-scale decomposition |
Online Access: | https://ieeexplore.ieee.org/document/10818690/ |
_version_ | 1832592935916404736 |
---|---|
author | Zhennan Peng Boyong Gao Ziqi Xia Jie Liu |
author_facet | Zhennan Peng Boyong Gao Ziqi Xia Jie Liu |
author_sort | Zhennan Peng |
collection | DOAJ |
description | Although deep learning models dominate time series forecasting, they still struggle with long-sequence processing due to the challenges of extracting dynamic fluctuations and pattern features as input length increases. To address this challenge, we propose a framework, LG-MSMixer, to enhance long-term time series forecasting through three key steps: multi-scale dual decomposition, local-global information extraction, and fusion prediction. Specifically, we first conduct multi-scale dual decomposition of the long input sequence to derive a combination of seasonal and trend components. To capture more comprehensive information within the components, we then use a customized patch-based triple-attention local-global information extractor that models both temporal features and variable dependencies, alongside an MLP-based feature interaction iterator that facilitates interactions among multi-scale information to guide macro-level predictions. Finally, we integrate the predictions from the multi-scale sequences to leverage their complementary advantages. In our experiments, we demonstrate the effectiveness of LG-MSMixer across various real-world long-term forecasting tasks, significantly outperforming previous baselines. |
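The abstract describes the pipeline only in prose. As a rough illustration of the first step, multi-scale dual decomposition into seasonal and trend components can be sketched with a simple moving-average split applied at several window sizes. This is a minimal sketch under assumptions: the moving-average decomposition, the `multi_scale_dual_decompose` function name, and the kernel sizes are illustrative choices, not the paper's actual method, which may use a different decomposition.

```python
import numpy as np

def moving_avg(x, kernel):
    """Edge-padded moving average so the output length matches the input."""
    pad = kernel // 2
    xp = np.pad(x, (pad, kernel - 1 - pad), mode="edge")
    w = np.ones(kernel) / kernel
    return np.convolve(xp, w, mode="valid")

def multi_scale_dual_decompose(x, kernels=(5, 13, 25)):
    """Split a 1-D series into (seasonal, trend) pairs at several scales.

    Each scale uses a larger smoothing window: the moving average is the
    trend component, and the residual is the seasonal component, so the
    pair always sums back to the original series.
    """
    components = []
    for k in kernels:
        trend = moving_avg(x, k)
        seasonal = x - trend
        components.append((seasonal, trend))
    return components
```

Each scale's seasonal and trend pair could then be fed to the per-scale extractor and the predictions fused, mirroring the decompose/extract/fuse structure the abstract outlines.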
format | Article |
id | doaj-art-8174c13fc25c4659a4e24d66fe2b93f1 |
institution | Kabale University |
issn | 2169-3536 |
language | English |
publishDate | 2025-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj-art-8174c13fc25c4659a4e24d66fe2b93f1 (indexed 2025-01-21T00:01:53Z); eng; IEEE; IEEE Access; ISSN 2169-3536; published 2025-01-01; vol. 13, pp. 9199-9208; DOI 10.1109/ACCESS.2024.3524499; document 10818690; Local-Global and Multi-Scale (LG-MS) Mixer Architecture for Long-Term Time Series Forecasting; Zhennan Peng (https://orcid.org/0009-0004-7774-9092), Boyong Gao (https://orcid.org/0000-0002-3925-5997), Ziqi Xia (https://orcid.org/0009-0003-6421-9298), Jie Liu (https://orcid.org/0009-0009-7669-1796), all: College of Information Engineering, China Jiliang University, Hangzhou, China; https://ieeexplore.ieee.org/document/10818690/; Deep learning; long-term time series forecasting; information extraction; local-global; multi-scale decomposition |
spellingShingle | Zhennan Peng Boyong Gao Ziqi Xia Jie Liu Local-Global and Multi-Scale (LG-MS) Mixer Architecture for Long-Term Time Series Forecasting IEEE Access Deep learning long-term time series forecasting information extraction local-global multi-scale decomposition |
title | Local-Global and Multi-Scale (LG-MS) Mixer Architecture for Long-Term Time Series Forecasting |
title_full | Local-Global and Multi-Scale (LG-MS) Mixer Architecture for Long-Term Time Series Forecasting |
title_fullStr | Local-Global and Multi-Scale (LG-MS) Mixer Architecture for Long-Term Time Series Forecasting |
title_full_unstemmed | Local-Global and Multi-Scale (LG-MS) Mixer Architecture for Long-Term Time Series Forecasting |
title_short | Local-Global and Multi-Scale (LG-MS) Mixer Architecture for Long-Term Time Series Forecasting |
title_sort | local global and multi scale lg ms mixer architecture for long term time series forecasting |
topic | Deep learning long-term time series forecasting information extraction local-global multi-scale decomposition |
url | https://ieeexplore.ieee.org/document/10818690/ |
work_keys_str_mv | AT zhennanpeng localglobalandmultiscalelgmsmixerarchitectureforlongtermtimeseriesforecasting AT boyonggao localglobalandmultiscalelgmsmixerarchitectureforlongtermtimeseriesforecasting AT ziqixia localglobalandmultiscalelgmsmixerarchitectureforlongtermtimeseriesforecasting AT jieliu localglobalandmultiscalelgmsmixerarchitectureforlongtermtimeseriesforecasting |