A Lightweight Transformer Edge Intelligence Model for RUL Prediction Classification


Bibliographic Details
Main Authors: Lilu Wang, Yongqi Li, Haiyuan Liu, Taihui Liu
Format: Article
Language: English
Published: MDPI AG 2025-07-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/25/13/4224
Description
Summary: Remaining Useful Life (RUL) prediction is a crucial task in predictive maintenance. Currently, the gated recurrent networks, hybrid models, and attention-enhanced models used for predictive maintenance struggle to balance prediction accuracy against model size when extracting complex degradation features, which hinders their deployment on resource-constrained edge devices. To address this issue, we propose TBiGNet, a lightweight Transformer-based classification network for RUL prediction. TBiGNet features an encoder–decoder architecture that outperforms traditional Transformer models, achieving over 15% higher accuracy while reducing computational load, memory access, and parameter count by more than 98%. In the encoder, we optimize the attention mechanism by fusing the separate linear mappings of queries, keys, and values into a single efficient operation, reducing memory access overhead by 60%. Additionally, an adaptive feature pruning module dynamically selects critical features according to their importance, reducing redundancy and improving accuracy by 6%. The decoder fuses two different types of features and leverages a BiGRU to compensate for the attention mechanism's limitations in capturing degradation features, yielding a further 7% accuracy improvement. Extensive experiments on the C-MAPSS dataset demonstrate that TBiGNet surpasses existing methods in prediction accuracy, model size, and memory access, showing significant technical advantages and application potential.
ISSN: 1424-8220
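Note: the abstract mentions fusing the separate query, key, and value projections into one operation. The paper's exact implementation is not given in this record; the sketch below is only a generic illustration of the underlying idea, using NumPy and made-up dimensions, showing that one concatenated weight matrix reproduces three separate linear maps with a single pass over the input.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, seq_len = 64, 10          # illustrative sizes, not from the paper
x = rng.standard_normal((seq_len, d_model))

# Separate projections: three matmuls, three passes over x.
Wq = rng.standard_normal((d_model, d_model))
Wk = rng.standard_normal((d_model, d_model))
Wv = rng.standard_normal((d_model, d_model))
q, k, v = x @ Wq, x @ Wk, x @ Wv

# Fused projection: one concatenated weight, one matmul, one pass over x.
W_qkv = np.concatenate([Wq, Wk, Wv], axis=1)   # (d_model, 3 * d_model)
qkv = x @ W_qkv
q2, k2, v2 = np.split(qkv, 3, axis=1)

# The fused result matches the three separate projections exactly.
assert np.allclose(q, q2) and np.allclose(k, k2) and np.allclose(v, v2)
```

Reading the activation `x` once instead of three times is where the memory-access saving of such a fusion comes from; the 60% figure reported above is specific to TBiGNet's implementation.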