Rapid eigenpatch utility classifier for image denoising
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-05-01 |
| Series: | Scientific Reports |
| Online Access: | https://doi.org/10.1038/s41598-025-96859-x |
| Summary: | Abstract Under low-illumination conditions, images inevitably contain both Poisson and Gaussian noise. In electron microscopy, there is the added complication that increasing the dose rate to improve the signal-to-noise ratio damages the specimen being imaged, making certain materials impossible to characterise. Conventional data-smoothing techniques may dampen usable image contrast, and deep-neural-network (DNN) based approaches risk introducing artefacts. In this work, the complementary strengths of patch-based and DNN approaches are combined into a lightweight denoising architecture such that experimental data integrity is preserved while noise is effectively removed. Our approach, the Rapid Eigenpatch Utility Classifier for Image Denoising (REUCID), leverages the speed and data integrity of a non-local patch-based SVD step to identify key image components, followed by a convolutional neural network (CNN) acting strictly in a classification capacity on the SVD eigenvectors. This classification-only approach to DNN integration mitigates the risk of DNN overreach while maintaining denoising effectiveness. We demonstrate superior performance on high-angle annular dark-field images, where our hybrid method outperforms conventional techniques in enhancing image contrast while preserving genuine structural features. |
| ISSN: | 2045-2322 |
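The summary describes a two-stage pipeline: an SVD over image patches identifies principal "eigenpatch" components, and a classifier then decides which components to keep before reconstruction. The sketch below illustrates that idea only in outline, with assumptions: the function name `patch_svd_denoise`, the patch/stride parameters, and the simple singular-value-threshold stand-in for the paper's CNN eigenvector classifier are all illustrative inventions, not the authors' actual REUCID implementation.

```python
import numpy as np

def patch_svd_denoise(image, patch=8, stride=4, classify=None):
    """Patch-based SVD denoising with a pluggable component classifier.

    `classify(s)` maps the singular values to a boolean keep-mask; here a
    plain threshold stands in for the CNN classifier described in the paper.
    """
    H, W = image.shape
    # Collect overlapping patches as rows of a single matrix
    coords = [(i, j)
              for i in range(0, H - patch + 1, stride)
              for j in range(0, W - patch + 1, stride)]
    P = np.stack([image[i:i + patch, j:j + patch].ravel() for i, j in coords])
    # SVD exposes the principal patch components ("eigenpatches")
    U, s, Vt = np.linalg.svd(P, full_matrices=False)
    if classify is None:
        # Stand-in classifier: keep components above 10% of the top singular value
        keep = s > 0.1 * s[0]
    else:
        keep = classify(s)
    P_dn = (U[:, keep] * s[keep]) @ Vt[keep]
    # Reconstruct by averaging the overlapping denoised patches
    out = np.zeros_like(image, dtype=float)
    wgt = np.zeros_like(image, dtype=float)
    for row, (i, j) in zip(P_dn, coords):
        out[i:i + patch, j:j + patch] += row.reshape(patch, patch)
        wgt[i:i + patch, j:j + patch] += 1.0
    return out / np.maximum(wgt, 1.0)
```

In the paper's design the keep/discard decision is made by a CNN acting on the SVD eigenvectors themselves, which is what confines the network to a classification role and keeps it from synthesising pixel values directly.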