ECgMLP: A novel gated MLP model for enhanced endometrial cancer diagnosis

Bibliographic Details
Main Authors: Md. Alif Sheakh, Sami Azam, Mst. Sazia Tahosin, Asif Karim, Sidratul Montaha, Kayes Uddin Fahim, Niusha Shafiabady, Mirjam Jonkman, Friso De Boer
Format: Article
Language: English
Published: Elsevier 2025-01-01
Series: Computer Methods and Programs in Biomedicine Update
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S2666990025000059
Description
Summary: Endometrial cancer is the fourth fastest-growing cancer among women worldwide, affecting the lining of the uterus. This research proposes a novel approach called ECgMLP for the automated diagnosis of endometrial cancer from histopathological images. Several preprocessing techniques are employed to improve image quality, including normalization, Non-Local Means denoising, and alpha-beta enhancement. Effective segmentation is achieved through a combination of Otsu thresholding, morphological operations, distance transformation, and the watershed approach to identify the major regions of interest. The ECgMLP architecture processes input images through a sequence of gated MLP blocks that filter out uninformative patterns. Model hyperparameters are tuned through ablation studies. Evaluations show a maximum accuracy of 99.26% in identifying multi-class histopathological categories of endometrial tissue, exceeding the previous best technique. The proposed model provides automated, accurate diagnosis, supporting clinical workflows, and could complement existing tools for early detection of endometrial cancer, leading to better patient outcomes.
ISSN:2666-9900
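
Note: the abstract names a preprocessing and segmentation pipeline (normalization, Non-Local Means denoising, alpha-beta enhancement, Otsu thresholding, morphological operations, distance transform, watershed). The sketch below is an illustrative reconstruction of such a pipeline using OpenCV, not the authors' implementation; the parameter values (NLM filter strengths, alpha/beta contrast factors, the 0.5 distance-transform cutoff) and the input file name tile.png are assumptions made for the example.

# Hypothetical sketch of the preprocessing + watershed segmentation steps
# described in the abstract. Not the authors' code; parameters are illustrative.
import cv2
import numpy as np

def preprocess(image_bgr):
    # Normalize intensities to the full 8-bit range.
    norm = cv2.normalize(image_bgr, None, 0, 255, cv2.NORM_MINMAX)
    # Non-Local Means denoising for a colour histopathology tile.
    denoised = cv2.fastNlMeansDenoisingColored(norm, None, 10, 10, 7, 21)
    # Alpha-beta enhancement: out = alpha * in + beta (contrast/brightness).
    return cv2.convertScaleAbs(denoised, alpha=1.3, beta=10)

def segment(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding; stained tissue is darker, so invert to make it foreground.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # Morphological opening to remove small artefacts.
    kernel = np.ones((3, 3), np.uint8)
    opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel, iterations=2)
    # Sure background via dilation; sure foreground via distance transform.
    sure_bg = cv2.dilate(opened, kernel, iterations=3)
    dist = cv2.distanceTransform(opened, cv2.DIST_L2, 5)
    _, sure_fg = cv2.threshold(dist, 0.5 * dist.max(), 255, 0)
    sure_fg = sure_fg.astype(np.uint8)
    unknown = cv2.subtract(sure_bg, sure_fg)
    # Marker labelling, then watershed to delineate regions of interest.
    _, markers = cv2.connectedComponents(sure_fg)
    markers = markers + 1            # reserve 0 for the unknown region
    markers[unknown == 255] = 0
    return cv2.watershed(image_bgr, markers)

if __name__ == "__main__":
    img = cv2.imread("tile.png")     # hypothetical input tile
    if img is None:
        raise SystemExit("could not read tile.png")
    markers = segment(preprocess(img))
    print("segmented regions (excluding background):", markers.max() - 1)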