SCRED-Distillation: Improving Low-Dose CT Image Quality via Feature Fusion and Mutual Learning

The substantial noise inherent in low-dose CT (LDCT) significantly impedes diagnostic accuracy. Although deep learning techniques, particularly CNNs, have shown promise for LDCT denoising, their focus on local features and the scarcity of large training datasets limit their performance and generalization. To address these shortcomings, we introduce SCRED-Distillation, a novel denoising method that integrates the global contextual awareness of Transformer architectures with the efficiency and regularization benefits of knowledge distillation. By leveraging both local and global image characteristics, SCRED-Distillation achieves superior denoising results. To further strengthen generalization across diverse datasets, we employ a mutual learning framework during training. Quantitative evaluations on the Mayo Clinic LDCT Grand Challenge dataset show clear improvements in key image quality metrics: Peak Signal-to-Noise Ratio (PSNR) increased from 29.2489 to 33.2103, the Structural Similarity Index Measure (SSIM) rose from 0.8759 to 0.9132, and the Root Mean Squared Error (RMSE) fell from 14.2416 to 8.9377. SCRED-Distillation suppresses noise artifacts while preserving fine diagnostic details, yielding clearer, more reliable medical images and ultimately supporting more accurate clinical diagnoses.

Bibliographic Details
Main Authors: Yanqing Wang, Xinru Zhan, Wanquan Liu, Yingying Li, Kexin Guo, Huafeng Wang
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/11062456/
author Yanqing Wang
Xinru Zhan
Wanquan Liu
Yingying Li
Kexin Guo
Huafeng Wang
collection DOAJ
description The substantial noise inherent in low-dose CT (LDCT) significantly impedes diagnostic accuracy. Although deep learning techniques, particularly CNNs, have shown promise for LDCT denoising, their focus on local features and the scarcity of large training datasets limit their performance and generalization. To address these shortcomings, we introduce SCRED-Distillation, a novel denoising method that integrates the global contextual awareness of Transformer architectures with the efficiency and regularization benefits of knowledge distillation. By leveraging both local and global image characteristics, SCRED-Distillation achieves superior denoising results. To further strengthen generalization across diverse datasets, we employ a mutual learning framework during training. Quantitative evaluations on the Mayo Clinic LDCT Grand Challenge dataset show clear improvements in key image quality metrics: Peak Signal-to-Noise Ratio (PSNR) increased from 29.2489 to 33.2103, the Structural Similarity Index Measure (SSIM) rose from 0.8759 to 0.9132, and the Root Mean Squared Error (RMSE) fell from 14.2416 to 8.9377. SCRED-Distillation suppresses noise artifacts while preserving fine diagnostic details, yielding clearer, more reliable medical images and ultimately supporting more accurate clinical diagnoses.
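The PSNR, SSIM, and RMSE figures quoted in the abstract follow standard definitions. As a minimal illustrative sketch (not the paper's evaluation code), the following computes RMSE, PSNR, and a simplified global (non-windowed) SSIM, assuming images are given as flat lists of pixel values and a nominal `data_range`:

```python
import math

def rmse(x, y):
    # Root Mean Squared Error between two equally sized images (flat pixel lists)
    assert len(x) == len(y)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / len(x))

def psnr(x, y, data_range=255.0):
    # Peak Signal-to-Noise Ratio in dB; data_range is the maximum possible pixel value
    e = rmse(x, y)
    return float("inf") if e == 0 else 20.0 * math.log10(data_range / e)

def ssim_global(x, y, data_range=255.0):
    # Simplified single-window SSIM with the standard stabilizing constants;
    # real evaluations compute SSIM over sliding local windows and average
    n = len(x)
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / ((mx * mx + my * my + c1) * (vx + vy + c2))
```

Note that published results such as those above are typically computed with windowed SSIM (e.g. an 11x11 Gaussian window) over the CT display range; the global variant here only illustrates the formula.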
format Article
id doaj-art-4bc0ac968b7a41bc9209d63d504092f2
institution Kabale University
issn 2169-3536
language English
publishDate 2025-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling Record doaj-art-4bc0ac968b7a41bc9209d63d504092f2 (indexed 2025-08-20T03:49:55Z). English. IEEE Access (IEEE), ISSN 2169-3536, published 2025-01-01, vol. 13, pp. 118224–118236. DOI: 10.1109/ACCESS.2025.3585001; IEEE document 11062456.
Title: SCRED-Distillation: Improving Low-Dose CT Image Quality via Feature Fusion and Mutual Learning
Authors and affiliations:
Yanqing Wang (https://orcid.org/0009-0003-2765-4434), Department of Radiology, Changzhi People’s Hospital, Changzhi, Shanxi, China
Xinru Zhan, School of Information Technology, North China University of Technology, Beijing, China
Wanquan Liu (https://orcid.org/0000-0003-4910-353X), School of Intelligent Systems Engineering, Sun Yat-sen University, Guangzhou, China
Yingying Li, School of Information Technology, North China University of Technology, Beijing, China
Kexin Guo (https://orcid.org/0000-0002-0459-5027), Hangzhou Innovation Institute, Beihang University, Hangzhou, China
Huafeng Wang (https://orcid.org/0000-0002-8267-672X), School of Information Technology, North China University of Technology, Beijing, China
Abstract: as given in the description field above.
Online access: https://ieeexplore.ieee.org/document/11062456/
Keywords: Image denoising; deep learning; transformer; mutual learning
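The mutual learning framework mentioned in the abstract pairs networks that teach each other during training. The record does not give the paper's exact loss formulation, so the sketch below shows only a generic two-network mutual-learning objective for a denoising (regression) setting; the function names and the `alpha` mimicry weight are illustrative assumptions:

```python
def mse(p, q):
    # Mean squared error between two equally sized prediction lists
    assert len(p) == len(q)
    return sum((a - b) ** 2 for a, b in zip(p, q)) / len(p)

def mutual_learning_losses(out_a, out_b, target, alpha=0.5):
    # Each network minimizes its own reconstruction error plus a mimicry
    # term pulling it toward its peer's current prediction (generic form,
    # not necessarily the SCRED-Distillation loss)
    loss_a = mse(out_a, target) + alpha * mse(out_a, out_b)
    loss_b = mse(out_b, target) + alpha * mse(out_b, out_a)
    return loss_a, loss_b
```

In practice each loss would be backpropagated through its own network only, with the peer's output treated as a fixed target for that step.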
title SCRED-Distillation: Improving Low-Dose CT Image Quality via Feature Fusion and Mutual Learning
topic Image denoising
deep learning
transformer
mutual learning
url https://ieeexplore.ieee.org/document/11062456/