MDEU-Net: Medical Image Segmentation Network Based on Multi-Head Multi-Scale Cross-Axis Attention


Bibliographic Details
Main Authors: Shengxian Yan, Yuyang Lei, Jing Zhang, Xiao Gao, Xiang Li, Penghui Wang, Hui Cao
Format: Article
Language: English
Published: MDPI AG 2025-05-01
Series: Sensors
Subjects:
Online Access: https://www.mdpi.com/1424-8220/25/9/2917
Description
Summary: Significant advances have been made in the application of attention mechanisms to medical image segmentation, notably driven by the development of the cross-axis attention mechanism. However, challenges remain in handling complex images, particularly in multi-scale feature extraction and fine-detail capture. To address these limitations, this paper presents a novel network architecture, MDEU-Net, based on multi-head multi-scale cross-axis attention, which leverages a multi-head attention mechanism to process input features in parallel. The proposed architecture enables the model to attend to both local and global information while capturing features at various spatial scales. Additionally, a gated attention mechanism facilitates efficient feature fusion by selectively emphasizing key features rather than relying on simple concatenation, improving the model’s ability to capture critical details at multiple scales. Furthermore, the incorporation of residual connections mitigates the vanishing-gradient problem while enhancing the model’s capacity to capture complex structures and fine details. This approach accelerates computation and improves processing efficiency, and experimental results demonstrate that the proposed network outperforms traditional architectures.
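The gated fusion the abstract contrasts with simple concatenation can be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: a sigmoid gate is computed from both input branches, then used as a per-element convex combination. The weight matrices `w_a`, `w_b` and the `bias` stand in for learned parameters and are hypothetical names.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(a, b, w_a, w_b, bias):
    """Fuse two feature maps with a learned gate instead of concatenation.

    g in (0, 1) decides, per element, how much of each branch to keep,
    so the output is a convex combination of the two inputs.
    """
    g = sigmoid(a @ w_a + b @ w_b + bias)  # gate driven by both branches
    return g * a + (1.0 - g) * b

# Toy example: 4-dim features for 3 spatial positions.
rng = np.random.default_rng(0)
a = rng.standard_normal((3, 4))          # e.g. encoder features
b = rng.standard_normal((3, 4))          # e.g. decoder features
w_a = rng.standard_normal((4, 4)) * 0.1  # placeholder learned weights
w_b = rng.standard_normal((4, 4)) * 0.1
bias = np.zeros(4)

fused = gated_fusion(a, b, w_a, w_b, bias)
print(fused.shape)  # (3, 4)
```

Because the gate forms a convex combination, every fused element lies between the corresponding elements of the two branches, which is what lets the network suppress one branch and emphasize the other instead of carrying both forward as a concatenated tensor.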
ISSN: 1424-8220