Auto-embedding transformer under multi-source information fusion for few-shot fault diagnosis

Abstract Data-driven intelligent fault diagnosis methods have become essential for ensuring the reliability and stability of mechanical systems. However, their practical application is often hindered by the scarcity of labeled samples and the absence of effective multi-source information fusion strategies, which collectively limit the accuracy of existing fault diagnosis frameworks. To address these challenges, we propose a novel auto-embedding transformer named EDformer, tailored for multi-source information under few-shot fault diagnosis. First, the multi-source information is fed into a novel encoder–decoder to extract high-quality embeddings, thereby mitigating the challenges posed by limited samples in real-world engineering applications. Subsequently, an innovative cross-attention architecture leveraging Transformer neural networks is proposed to facilitate efficient multi-modal data integration by highlighting key correlations between sensing devices while minimizing superfluous information. In the final stage, the architecture integrates global max pooling and global average pooling operations to optimize feature abstraction and improve resilience to data variations. The effectiveness of the proposed framework is validated through comprehensive evaluations on two heterogeneous datasets. Diagnostic results demonstrate that EDformer surpasses contemporary approaches in both classification accuracy and stability, particularly under conditions of data scarcity. Visualization tools such as t-SNE and ROC curves further confirm its ability to effectively distinguish fault categories and capture critical fault-related features.
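The fusion stage the abstract describes (cross-attention between embeddings from different sensing sources, followed by global max pooling and global average pooling before classification) can be pictured with a minimal sketch. The PyTorch code below is an illustrative assumption, not the authors' EDformer implementation: the module name, layer sizes, two-source setup, and classifier head are hypothetical, and the upstream encoder–decoder that produces the embeddings is omitted.

```python
# Illustrative sketch only (assumed shapes and sizes, not the published EDformer code).
import torch
import torch.nn as nn


class CrossAttentionFusion(nn.Module):
    def __init__(self, embed_dim: int = 64, num_heads: int = 4, num_classes: int = 10):
        super().__init__()
        # Cross-attention: queries from source A, keys/values from source B,
        # so the block emphasizes correlations between the two sensors.
        self.cross_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)
        # Concatenated max-pooled and average-pooled features feed the classifier.
        self.classifier = nn.Linear(2 * embed_dim, num_classes)

    def forward(self, emb_a: torch.Tensor, emb_b: torch.Tensor) -> torch.Tensor:
        # emb_a, emb_b: (batch, seq_len, embed_dim) embeddings from two sources,
        # assumed to come from a separate encoder-decoder (not shown here).
        fused, _ = self.cross_attn(query=emb_a, key=emb_b, value=emb_b)
        fused = self.norm(fused + emb_a)        # residual connection
        gmp = fused.max(dim=1).values           # global max pooling over the sequence
        gap = fused.mean(dim=1)                 # global average pooling over the sequence
        return self.classifier(torch.cat([gmp, gap], dim=-1))


if __name__ == "__main__":
    model = CrossAttentionFusion()
    a = torch.randn(8, 128, 64)   # dummy embeddings from sensor A
    b = torch.randn(8, 128, 64)   # dummy embeddings from sensor B
    print(model(a, b).shape)      # torch.Size([8, 10])
```

Combining max and average pooling, as the abstract suggests, keeps both peak responses and overall trends of the fused features, which is one plausible way to improve robustness to sample-to-sample variation.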


Bibliographic Details
Main Authors: Bo Wang, Shuai Zhao, Qian Zhao, Yang Bai
Format: Article
Language: English
Published: Nature Portfolio 2025-07-01
Series: Scientific Reports
Subjects: Fault diagnosis; Encoder–decoder; Multi-source information fusion; Transformer
Online Access:https://doi.org/10.1038/s41598-025-10124-9
Collection: DOAJ
Record ID: doaj-art-7ba0eccfc05f47248c42debfdf68b0ba
Institution: Kabale University
ISSN: 2045-2322
Author affiliations:
Bo Wang: School of Information and Artificial Intelligence, Nanchang Institute of Science & Technology
Shuai Zhao: School of Information and Artificial Intelligence, Nanchang Institute of Science & Technology
Qian Zhao: Department of Aerospace Science and Technology, Politecnico di Milano
Yang Bai: School of Education, Nanchang Institute of Science & Technology