Interpretable Deep Learning for Pneumonia Detection Using Chest X-Ray Images
Main Authors:
Format: Article
Language: English
Published: MDPI AG, 2025-01-01
Series: Information
Subjects:
Online Access: https://www.mdpi.com/2078-2489/16/1/53
Summary: Pneumonia remains a global health issue, creating the need for accurate detection methods that enable effective treatment. Deep learning models such as ResNet50 show promise in detecting pneumonia from chest X-rays; however, their black-box nature limits their transparency, which falls short of what is required for clinical trust. This study aims to improve model interpretability by comparing four interpretability techniques: Layer-wise Relevance Propagation (LRP), Adversarial Training, Class Activation Maps (CAMs), and the Spatial Attention Mechanism, and by determining which best fits the model, enhancing its transparency with minimal impact on its performance. Each technique was evaluated for its impact on accuracy, sensitivity, specificity, AUC-ROC, Mean Relevance Score (MRS), and a calculated trade-off score that balances interpretability and performance. The results indicate that LRP was the most effective in enhancing interpretability, achieving high scores across all metrics without sacrificing diagnostic accuracy. The model achieved 0.91 accuracy and 0.85 interpretability (MRS), demonstrating its potential for clinical integration. In contrast, Adversarial Training, CAMs, and the Spatial Attention Mechanism showed trade-offs between interpretability and performance, each highlighting unique image features but with some impact on specificity and accuracy.
ISSN: 2078-2489
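The record does not define the trade-off score; a plausible form for such a composite metric is a weighted mean, e.g. T = α · MRS + (1 − α) · Accuracy with α in [0, 1], though the paper's exact definition may differ.

Of the four techniques, CAMs are the simplest to illustrate in code. The sketch below is a hedged illustration, not the authors' implementation (the record includes none): it computes a class activation map from a stock torchvision ResNet50 following the original CAM formulation, where the map for class c is the classifier-weighted sum of the final convolutional feature maps. The pretrained ImageNet weights, the hooked layer (layer4), and the class_idx argument are assumptions for demonstration; a ResNet50 fine-tuned on chest X-rays would slot in identically.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet50, ResNet50_Weights

# CAM (Zhou et al., 2016): CAM_c(x, y) = sum_k w_k^c * f_k(x, y),
# where f_k are the final convolutional feature maps and w_k^c are the
# classifier weights for class c. ResNet50 matches this recipe because
# global average pooling feeds a single fully connected layer.
model = resnet50(weights=ResNet50_Weights.DEFAULT).eval()

def class_activation_map(image: torch.Tensor, class_idx: int) -> torch.Tensor:
    """Return a CAM in [0, 1] for one image of shape (1, 3, H, W)."""
    feats = {}

    def hook(_module, _inputs, output):
        feats["maps"] = output  # (1, 2048, h, w) for ResNet50

    handle = model.layer4.register_forward_hook(hook)
    with torch.no_grad():
        model(image)
    handle.remove()

    fmap = feats["maps"].squeeze(0)        # (2048, h, w)
    weights = model.fc.weight[class_idx]   # (2048,)
    cam = F.relu(torch.einsum("k,khw->hw", weights, fmap))
    return (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
```

Upsampling the returned map to the input resolution (e.g. with F.interpolate) and overlaying it on the X-ray yields the familiar heatmap; for a two-class pneumonia model, class_idx would select the pneumonia logit.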