MD-Former: Multiscale Dual Branch Transformer for Multivariate Time Series Classification


Bibliographic Details
Main Authors: Yanling Du, Shuhao Chu, Jintao Wang, Manli Shi, Dongmei Huang, Wei Song
Format: Article
Language: English
Published: MDPI AG, 2025-02-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/25/5/1487
Description
Summary: Multivariate Time Series Classification (MTSC) is a challenging task in real-world applications. Current approaches emphasize modeling multiscale relationships over time. However, a Multivariate Time Series (MTS) also exhibits multiscale cross-channel relationships, and its long-term temporal relationships are difficult to capture. In this paper, we introduce MD-Former, a Multiscale Dual-Branch Attention network leveraging the Transformer architecture to capture multiscale relationships across time and channels for MTSC. In MD-Former, the MTS is embedded into 2D vectors using Channel-Patching (CP) to retain channel information. We then develop two branches: the Interlaced Attention Branch (IAB) and the Channel-Independent Attention Branch (CIAB). The IAB fuses information across channels and time, while the CIAB prevents the information loss that excessive fusion would cause. Both the IAB and CIAB consist of multiple layers, each representing a distinct time scale. Finally, we feed the features from each layer of both the IAB and CIAB into the Multiscale Classification Head (MCH) for feature fusion and classification. Experimental results show that MD-Former achieves performance comparable to state-of-the-art (SOTA) methods in MTSC.
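The Channel-Patching (CP) embedding described in the summary can be illustrated with a minimal sketch: each channel of the multivariate series is sliced independently into fixed-length patches, so every patch token retains its channel identity, yielding a 2D (channel x patch) grid. The function name, patch length, and list-based representation below are illustrative assumptions, not details taken from the paper.

```python
def channel_patching(series, patch_len):
    """Sketch of CP embedding (illustrative, not the paper's implementation).

    series: list of C channels, each a list of T values, with T divisible
            by patch_len.
    Returns a C x (T // patch_len) grid of length-patch_len patches, so
    each patch keeps its channel index before any attention is applied.
    """
    return [
        [channel[i:i + patch_len] for i in range(0, len(channel), patch_len)]
        for channel in series
    ]

# Example: 2 channels, 6 time steps, patch length 3 -> a 2 x 2 patch grid.
mts = [[1, 2, 3, 4, 5, 6],
       [7, 8, 9, 10, 11, 12]]
patches = channel_patching(mts, patch_len=3)
# patches[0] == [[1, 2, 3], [4, 5, 6]]      (patches of channel 0)
# patches[1] == [[7, 8, 9], [10, 11, 12]]   (patches of channel 1)
```

In a full model, each patch in this grid would then be projected to an embedding vector; keeping the channel axis separate is what lets the IAB attend across channels while the CIAB processes each channel independently.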
ISSN: 1424-8220