Multiscale transformers and multi-attention mechanism networks for pathological nuclei segmentation

Bibliographic Details
Main Authors: Yongzhao Du, Xin Chen, Yuqing Fu
Format: Article
Language: English
Published: Nature Portfolio 2025-04-01
Series: Scientific Reports
Online Access:https://doi.org/10.1038/s41598-025-90397-2
Description
Summary: Pathology nuclei segmentation is crucial for computer-aided diagnosis in pathology. However, the high density of nuclei, complex backgrounds, and blurred cell boundaries make pathology cell segmentation a challenging problem. In this paper, we propose a network model for pathology image segmentation based on a multi-scale Transformer multi-attention mechanism. To address the difficulty of feature extraction caused by the high density of cell nuclei and the complexity of the background, a dense attention module is embedded in the encoder, which improves the learning of target cell information and minimizes target information loss. Additionally, to address the poor segmentation accuracy caused by blurred cell boundaries, a multi-scale Transformer attention module is embedded between the encoder and decoder, which improves the transfer of boundary feature information and makes the segmented cell boundaries more accurate. Experimental results on the MoNuSeg, GlaS, and CoNSeP datasets demonstrate the network's superior accuracy.
ISSN: 2045-2322
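
To make the abstract's architectural idea concrete, the sketch below shows one plausible reading of a multi-scale Transformer attention block placed on the encoder-decoder skip connections: the skip feature map is attended to at several spatial scales, and the results are fused before being passed to the decoder. This is a minimal illustrative sketch only; the class name, channel sizes, pooling strategy, and fusion layer are assumptions and not the authors' published implementation.

```python
# Hedged sketch of a multi-scale Transformer attention module on a skip
# connection. All design choices below (scales, pooling, 1x1 fusion conv,
# residual add) are illustrative assumptions, not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleTransformerAttention(nn.Module):
    """Applies multi-head self-attention to a skip-connection feature map
    at several spatial scales and fuses the results before the decoder."""

    def __init__(self, channels: int, num_heads: int = 4, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.attn = nn.ModuleList(
            nn.MultiheadAttention(channels, num_heads, batch_first=True)
            for _ in scales
        )
        self.norm = nn.LayerNorm(channels)
        self.fuse = nn.Conv2d(channels * len(scales), channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        outs = []
        for scale, attn in zip(self.scales, self.attn):
            # Downsample to the current scale, then flatten to a token sequence.
            feat = F.avg_pool2d(x, scale) if scale > 1 else x
            tokens = self.norm(feat.flatten(2).transpose(1, 2))  # (B, N, C)
            attended, _ = attn(tokens, tokens, tokens)            # self-attention
            # Restore the spatial layout and upsample back to full resolution.
            attended = attended.transpose(1, 2).reshape(b, c, h // scale, w // scale)
            attended = F.interpolate(attended, size=(h, w),
                                     mode="bilinear", align_corners=False)
            outs.append(attended)
        # Fuse all scales and keep a residual path to the original skip features.
        return x + self.fuse(torch.cat(outs, dim=1))


if __name__ == "__main__":
    skip = torch.randn(1, 64, 32, 32)   # hypothetical encoder skip feature
    block = MultiScaleTransformerAttention(channels=64)
    print(block(skip).shape)            # torch.Size([1, 64, 32, 32])
```

In this reading, attention at coarser scales captures context around densely packed nuclei while the full-resolution branch preserves boundary detail, which is one way the described module could sharpen blurred cell boundaries before decoding.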