Research on equipment fault diagnosis model based on GAN and inverse PINN: Solutions for data imbalance and rare faults.

Bibliographic Details
Main Authors: Jian Deng, Zheng Cheng, Aiming Gu, Shibohua Zhang
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0324180
Description
Summary: In the field of medical imaging equipment, fault diagnosis plays a vital role in guaranteeing stable operation and prolonging service life. Traditional diagnostic approaches, however, are confronted with issues such as intricate fault modes and scarce, imbalanced data. This paper puts forward a fault diagnosis model integrating digital twin technology and Inverse Physics-Informed Neural Networks (Inverse PINN). The practical significance of this research lies in its potential to revolutionize the engineering aspects of medical imaging equipment management. By constructing a physical model of equipment operation and leveraging the Inverse PINN to handle imbalanced datasets, the model can accurately identify and predict potential faults. This not only optimizes the full lifecycle management of the equipment but also has the potential to reduce maintenance costs, improve equipment availability, and enhance the overall efficiency of medical imaging services. Experimental results show that the proposed model outperforms existing approaches in fault detection and prediction for medical imaging equipment, with particular gains in data generation and fault detection accuracy. Finally, the paper discusses the model's limitations and future development directions.
ISSN: 1932-6203
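
Illustrative sketch (not from the article): the summary describes combining a physical model of equipment operation with an Inverse PINN that learns unknown physical quantities from sparse, imbalanced sensor data. The minimal PyTorch example below shows the general inverse-PINN pattern under an assumed toy first-order decay model du/dt + k*u = 0, where a state network and the unknown coefficient k are trained jointly from a data loss plus a physics residual. The network shape, the toy ODE, and all names are assumptions for illustration, not the authors' implementation.

    # Minimal inverse-PINN sketch (illustrative only, not the paper's code):
    # fit u_theta(t) to a few noisy observations while recovering an unknown
    # physical parameter k of an assumed decay model du/dt + k*u = 0.
    import torch
    import torch.nn as nn

    class StateNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(1, 32), nn.Tanh(),
                nn.Linear(32, 32), nn.Tanh(),
                nn.Linear(32, 1),
            )
        def forward(self, t):
            return self.net(t)

    model = StateNet()
    log_k = nn.Parameter(torch.zeros(1))   # unknown physical parameter (the "inverse" part)
    opt = torch.optim.Adam(list(model.parameters()) + [log_k], lr=1e-3)

    # Sparse, noisy "sensor" observations (synthetic here, true k = 1.5).
    t_obs = torch.rand(20, 1)
    u_obs = torch.exp(-1.5 * t_obs) + 0.01 * torch.randn_like(t_obs)

    for step in range(2000):
        opt.zero_grad()
        # Data loss on the few labelled points.
        loss_data = ((model(t_obs) - u_obs) ** 2).mean()

        # Physics residual on unlabelled collocation points.
        t_col = torch.rand(200, 1, requires_grad=True)
        u = model(t_col)
        du_dt = torch.autograd.grad(u, t_col, torch.ones_like(u), create_graph=True)[0]
        k = torch.exp(log_k)               # keep the recovered parameter positive
        loss_phys = ((du_dt + k * u) ** 2).mean()

        (loss_data + loss_phys).backward()
        opt.step()

    print("recovered k:", torch.exp(log_k).item())  # should move toward the true value 1.5

Because the physics residual is evaluated on unlabelled collocation points, the physical model constrains the network even where labelled fault examples are rare, which is the role the summary attributes to the Inverse PINN component when dealing with imbalanced data.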