Nested U-Net-Based GAN Model for Super-Resolution of Stained Light Microscopy Images
The purpose of this study was to propose a deep learning-based model for the super-resolution reconstruction of stained light microscopy images. To achieve this, perceptual loss was applied to the generator to reflect multichannel signal intensity, distribution, and structural similarity. A nested U-Net architecture was employed to address the representational limitations of the conventional U-Net. For quantitative evaluation, the peak signal-to-noise ratio (PSNR), structural similarity index (SSIM), and correlation coefficient (CC) were calculated. In addition, intensity profile analysis was performed to assess the model’s ability to restore boundary signals more precisely. The experimental results demonstrated that the proposed model outperformed both the single U-Net and the U-Net-based generative adversarial network (GAN) models in signal and structural restoration. The PSNR, SSIM, and CC values showed relative improvements of approximately 1.017, 1.023, and 1.010 times, respectively, compared to the input images. In particular, the intensity profile analysis confirmed the effectiveness of the nested U-Net-based generator in restoring cellular boundaries and structures in the stained microscopy images. In conclusion, the proposed model effectively enhanced the resolution of stained light microscopy images acquired in a multichannel format.
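The abstract evaluates restoration quality with three standard metrics: PSNR, SSIM, and the correlation coefficient (CC) between restored and reference images. The snippet below is a minimal sketch, not the authors' code, of how such metrics could be computed channel-wise for multichannel stained images using scikit-image and NumPy; the function name and the per-channel averaging scheme are illustrative assumptions.

```python
# Sketch only: channel-wise PSNR, SSIM, and Pearson CC for HxWxC uint8 images.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate_channels(reference: np.ndarray, restored: np.ndarray) -> dict:
    """Average PSNR, SSIM, and CC over the channels of two same-shaped images."""
    psnr_vals, ssim_vals, cc_vals = [], [], []
    for c in range(reference.shape[-1]):
        ref_c, res_c = reference[..., c], restored[..., c]
        psnr_vals.append(peak_signal_noise_ratio(ref_c, res_c, data_range=255))
        ssim_vals.append(structural_similarity(ref_c, res_c, data_range=255))
        # Pearson correlation between the flattened channel intensities
        cc_vals.append(np.corrcoef(ref_c.ravel(), res_c.ravel())[0, 1])
    return {
        "PSNR": float(np.mean(psnr_vals)),
        "SSIM": float(np.mean(ssim_vals)),
        "CC": float(np.mean(cc_vals)),
    }
```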
| Main Authors: | Seong-Hyeon Kang, Ji-Youn Kim |
|---|---|
| Affiliations: | Department of Radiological Science, Gachon University, Incheon 21936, Republic of Korea; Department of Dental Hygiene, Gachon University, Incheon 21936, Republic of Korea |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-07-01 |
| Series: | Photonics, Vol. 12, Issue 7, Article 665 |
| ISSN: | 2304-6732 |
| DOI: | 10.3390/photonics12070665 |
| Subjects: | stained light microscopy; super-resolution; multichannel image reconstruction; generative adversarial network; nested U-Net |
| Online Access: | https://www.mdpi.com/2304-6732/12/7/665 |