Attention-based multi-residual network for lung segmentation in diseased lungs with custom data augmentation

Bibliographic Details
Main Authors: Md. Shariful Alam, Dadong Wang, Yulia Arzhaeva, Jesse Alexander Ende, Joanna Kao, Liz Silverstone, Deborah Yates, Olivier Salvado, Arcot Sowmya
Format: Article
Language: English
Published: Nature Portfolio, 2024-11-01
Series: Scientific Reports
Online Access:https://doi.org/10.1038/s41598-024-79494-w
Description
Summary: Lung disease analysis in chest X-rays (CXR) using deep learning presents significant challenges due to the wide variation in lung appearance caused by disease progression and differing X-ray settings. While deep learning models have shown remarkable success in segmenting lungs from CXR images with normal or mildly abnormal findings, their performance declines when faced with complex structures such as pulmonary opacifications. In this study, we propose AMRU++, an attention-based multi-residual UNet++ network designed for robust and accurate lung segmentation in CXR images with both normal findings and severe abnormalities. The model incorporates attention modules to capture relevant spatial information and multi-residual blocks to extract rich contextual and discriminative features of lung regions. To further enhance segmentation performance, we introduce a data augmentation technique that simulates the features and characteristics of CXR pathologies, addressing the issue of limited annotated data. Extensive experiments on public and private datasets comprising 350 cases of pneumoconiosis, COVID-19, and tuberculosis validate the effectiveness of our proposed framework and data augmentation technique.
ISSN: 2045-2322
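
The abstract names two architectural ingredients, attention modules and multi-residual blocks, but this record gives no implementation detail. As a rough illustration only, the PyTorch sketch below shows one common way such components are assembled: a MultiRes-style block (chained 3x3 convolutions whose outputs are concatenated, plus a 1x1 residual shortcut) and an additive attention gate on a skip connection, in the spirit of Attention U-Net. Every class name and parameter (MultiResBlock, AttentionGate, in_ch, inter_ch, and so on) is an assumption for illustration, not the authors' code.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MultiResBlock(nn.Module):
        # Chained 3x3 convs whose outputs are concatenated, plus a 1x1
        # residual shortcut (MultiResUNet-style; an assumed stand-in for
        # the paper's multi-residual block).
        def __init__(self, in_ch, out_ch):
            super().__init__()
            c = out_ch // 3
            def conv_bn_relu(ci, co):
                return nn.Sequential(nn.Conv2d(ci, co, 3, padding=1),
                                     nn.BatchNorm2d(co), nn.ReLU(inplace=True))
            self.conv1 = conv_bn_relu(in_ch, c)
            self.conv2 = conv_bn_relu(c, c)
            self.conv3 = conv_bn_relu(c, out_ch - 2 * c)
            self.shortcut = nn.Conv2d(in_ch, out_ch, 1)
            self.bn = nn.BatchNorm2d(out_ch)

        def forward(self, x):
            a = self.conv1(x)
            b = self.conv2(a)
            d = self.conv3(b)
            multi = torch.cat([a, b, d], dim=1)               # multi-scale features
            return F.relu(self.bn(multi + self.shortcut(x)))  # residual add

    class AttentionGate(nn.Module):
        # Additive attention gate that re-weights skip features with a
        # coarser gating signal (Attention U-Net style).
        def __init__(self, skip_ch, gate_ch, inter_ch):
            super().__init__()
            self.w_x = nn.Conv2d(skip_ch, inter_ch, 1)
            self.w_g = nn.Conv2d(gate_ch, inter_ch, 1)
            self.psi = nn.Conv2d(inter_ch, 1, 1)

        def forward(self, x, g):
            # Upsample the gating signal to the skip feature's size,
            # then compute a spatial attention map in [0, 1].
            g = F.interpolate(g, size=x.shape[2:], mode='bilinear',
                              align_corners=False)
            alpha = torch.sigmoid(self.psi(F.relu(self.w_x(x) + self.w_g(g))))
            return x * alpha

The abstract also mentions a data augmentation technique that simulates the features and characteristics of CXR pathologies. The paper's actual method is not described in this record; the sketch below is a generic stand-in that blends soft Gaussian blobs into a normalized image (values in [0, 1]) to roughly mimic pulmonary opacification. The function name and all parameters are hypothetical.

    import numpy as np

    def add_synthetic_opacity(img, n_blobs=3, max_strength=0.35, rng=None):
        # Brighten random elliptical regions to imitate opacification.
        rng = rng or np.random.default_rng()
        h, w = img.shape
        yy, xx = np.mgrid[0:h, 0:w]
        out = img.astype(np.float32).copy()
        for _ in range(n_blobs):
            cy, cx = rng.integers(0, h), rng.integers(0, w)
            sy, sx = rng.uniform(h / 16, h / 6), rng.uniform(w / 16, w / 6)
            blob = np.exp(-(((yy - cy) / sy) ** 2 + ((xx - cx) / sx) ** 2) / 2)
            out += rng.uniform(0.1, max_strength) * blob
        return np.clip(out, 0.0, 1.0)

In training, images modified this way would typically be mixed into batches alongside unmodified ones; whether AMRU++ follows that exact recipe is not stated in this record.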