Automated detection of hard exudates in retinal fundus images for diabetic retinopathy screening using textural-based radon transform and morphology reconstruction

Bibliographic Details
Main Authors: Esmat Ramezanzadeh, Naser Shoeibi, Akram Feizabadi, Touka Banaee, Mohammad Hossein Bahreyni Tussi, Meysam Tavakoli
Format: Article
Language: English
Published: Elsevier 2025-06-01
Series: Biomedical Engineering Advances
Online Access:http://www.sciencedirect.com/science/article/pii/S2667099225000362
Description
Summary: Background: Diabetic retinopathy (DR) screening requires accurate detection of hard exudates (HEs) in retinal images. This study presents a novel method that integrates a textural-based Radon transform (RT) with morphological image processing techniques to automate the detection and segmentation of HEs in color fundus photography (CFP) images. By enhancing the diagnostic capabilities for DR, this approach aims to provide ophthalmologists with a reliable and efficient tool for identifying early signs of vascular damage associated with diabetes. Method: The proposed algorithm was evaluated on two datasets, DIARETDB1 (89 images) and MUMS-DB (32 images). We developed an automated method for detecting HEs in CFP images. The approach involves a comprehensive framework comprising preprocessing, main processing, feature extraction, and post-processing. Key techniques include the Radon transform for optic disc, vessel, and soft/hard exudate feature extraction, and morphological reconstruction for enhancing detection accuracy. We employed Kirsch edge detection to distinguish HEs based on edge sharpness and used the Top-Hat transformation to highlight small-scale features. The method integrates clinical expertise with computational techniques to differentiate between morphologically similar lesions. Performance was assessed through lesion-based and pixel-based classification metrics. Results: The proposed algorithm demonstrated high performance in pixel-based classification, achieving a best-case sensitivity of 92% and specificity of 100%. In lesion-based classification, the model achieved 100% sensitivity and 100% specificity on the MUMS-DB dataset in the best case. Conclusion: This integrated methodology successfully addresses the challenging task of differentiating between morphologically similar lesions, representing a significant advancement in automated DR screening. While performance varied between datasets, the results demonstrate strong potential for clinical application.
ISSN: 2667-0992
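The abstract names the building blocks of the pipeline (Top-Hat enhancement, Kirsch edge detection, morphological reconstruction, Radon-transform features) without detailing their implementation. The following is a minimal sketch of how such a pipeline could be assembled with NumPy, SciPy, and scikit-image; the function names, the green-channel preprocessing, the disk radii, the Otsu threshold, and the 90th-percentile edge cut-off are illustrative assumptions, not the authors' code.

```python
# Hedged sketch: top-hat enhancement, Kirsch edge response,
# morphological reconstruction, and a Radon-transform feature.
import numpy as np
from scipy.ndimage import convolve
from skimage import filters, morphology, transform


def kirsch_edge_response(gray):
    """Maximum response over the eight Kirsch compass kernels."""
    # Ring of the eight border positions of a 3x3 kernel, clockwise.
    ring = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    vals = np.array([5, 5, 5, -3, -3, -3, -3, -3], dtype=float)
    responses = []
    for shift in range(8):
        kernel = np.zeros((3, 3))
        for (r, c), v in zip(ring, np.roll(vals, shift)):
            kernel[r, c] = v
        responses.append(convolve(gray, kernel, mode="reflect"))
    return np.max(responses, axis=0)


def detect_bright_lesions(rgb, tophat_radius=8, min_area=10):
    """Candidate hard-exudate mask from an RGB fundus image (illustrative only)."""
    # Green channel is commonly used for bright-lesion work; an assumption here.
    green = rgb[..., 1].astype(float) / 255.0

    # Top-Hat transform highlights small bright structures against the background.
    tophat = morphology.white_tophat(green, morphology.disk(tophat_radius))

    # Otsu threshold on the enhanced image gives candidate bright regions.
    bright = tophat > filters.threshold_otsu(tophat)

    # Morphological reconstruction by dilation regrows lesions from eroded
    # seeds, suppressing isolated noise while preserving lesion extent.
    masked = green * bright
    seed = morphology.erosion(masked, morphology.disk(1))
    recon = morphology.reconstruction(seed, masked, method="dilation")

    # Hard exudates have sharp edges; keep candidates with a strong Kirsch
    # response (the 90th-percentile cut-off is an arbitrary placeholder).
    edges = kirsch_edge_response(green)
    candidates = (recon > 0) & (edges > np.percentile(edges, 90))
    return morphology.remove_small_objects(candidates, min_size=min_area)


def radon_sinogram(gray, angles=None):
    """Radon-transform sinogram, usable as an orientation/texture feature."""
    theta = np.arange(0.0, 180.0) if angles is None else angles
    return transform.radon(gray, theta=theta, circle=False)
```

In the framework the abstract describes, a sinogram like the one above would feed the feature-extraction stage (optic disc, vessel, and exudate features), while the candidate mask would be refined in post-processing; both steps are only gestured at in this sketch, and the published paper should be consulted for the actual method.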