Automated detection of weld defects in TOFD images for steel bridges using generative adversarial networks

Bibliographic Details
Main Authors: Yanfeng Gong, Zihao Chen, Hong Zhang, Meng Xu, Wen Deng
Format: Article
Language: English
Published: Elsevier 2025-07-01
Series: Case Studies in Construction Materials
Online Access: http://www.sciencedirect.com/science/article/pii/S2214509525006394
Description
Summary: Time-of-flight diffraction (TOFD) has been widely adopted for weld defect detection in steel bridge quality assurance because it is harmless to the human body, operates in real time, and achieves satisfactory detection accuracy. Most deep learning-based methods for automated weld defect recognition rely heavily on defect samples; in practice, such samples are scarce in steel bridge applications because modern welding technologies have greatly reduced the occurrence of defects. To address this limitation, we propose a two-stage defect detection method for TOFD weld images based on an enhanced generative adversarial network (GAN) that requires no defect-containing training samples. In the first stage, the region of interest (ROI) containing potential defects is localized with YOLOv8. In the second stage, the extracted ROI is sliced into patches and analyzed by a GAN augmented with a self-attention mechanism, which improves the encoding and aggregation of local defect features. Combining the self-attention GAN with the slicing strategy further improves defect recognition performance. The proposed method is evaluated on a self-constructed TOFD dataset of steel bridge welds. Experimental results show that our approach achieves an AUC of 86%, outperforming existing state-of-the-art methods by 5%.
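The two-stage pipeline described in the abstract (YOLOv8 ROI localization, then patch-wise anomaly scoring with a self-attention GAN) can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the ultralytics YOLO API is assumed for stage one; the generator layout, the 64-pixel patch size, the weight-file names, and the max-over-patches aggregation are all assumptions; the paper's exact self-attention GAN and training losses are not reproduced.

```python
# Illustrative sketch only; architecture, patch size, and file names are assumptions.
import torch
import torch.nn as nn
import torchvision.io as tio
from ultralytics import YOLO  # assumed YOLOv8 implementation for stage one


class SelfAttention2d(nn.Module):
    """SAGAN-style self-attention over spatial positions of a feature map."""
    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (b, hw, c//8)
        k = self.key(x).flatten(2)                     # (b, c//8, hw)
        attn = torch.softmax(q @ k, dim=-1)            # (b, hw, hw)
        v = self.value(x).flatten(2)                   # (b, c, hw)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x


class PatchGenerator(nn.Module):
    """Toy encoder-decoder generator. In a reconstruction-based anomaly scheme
    it would be trained on defect-free patches only, so a large reconstruction
    error flags a potentially defective patch."""
    def __init__(self):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            SelfAttention2d(64),
        )
        self.decode = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.decode(self.encode(x))


def slice_patches(roi: torch.Tensor, size: int = 64) -> torch.Tensor:
    """Slice a (1, H, W) ROI into non-overlapping size x size patches;
    trailing partial windows are dropped. Assumes the ROI is at least one
    patch tall and wide."""
    patches = roi.unfold(1, size, size).unfold(2, size, size)
    return patches.reshape(1, -1, size, size).transpose(0, 1)  # (n, 1, s, s)


@torch.no_grad()
def anomaly_score(image_path: str, detector: YOLO, gen: PatchGenerator) -> float:
    """Stage 1: locate the weld ROI with YOLOv8 (first box is assumed to be
    the weld). Stage 2: score patches by reconstruction error and take the
    maximum as the image-level anomaly score."""
    img = tio.read_image(image_path, mode=tio.ImageReadMode.GRAY).float()
    img = img / 127.5 - 1.0  # scale to [-1, 1] to match the Tanh output range
    x1, y1, x2, y2 = map(int, detector(image_path)[0].boxes.xyxy[0].tolist())
    patches = slice_patches(img[:, y1:y2, x1:x2])
    errors = ((gen(patches) - patches) ** 2).mean(dim=(1, 2, 3))
    return errors.max().item()


# Hypothetical usage; "weld_roi.pt" stands in for a trained ROI detector.
# score = anomaly_score("tofd_weld.png", YOLO("weld_roi.pt"), PatchGenerator())
```

Under these assumptions, per-image scores collected over a labeled test set could then be fed to a standard ROC routine (e.g. scikit-learn's roc_auc_score) to obtain the kind of AUC figure the abstract reports.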
ISSN:2214-5095