TuSegNet: A Transformer-Based and Attention-Enhanced Architecture for Brain Tumor Segmentation

Brain tumor segmentation is crucial in medical imaging, enabling informed diagnosis and treatment planning. In this study, we propose TuSegNet, a new transformer-based and attention-enhanced architecture for robust brain tumor segmentation. The model combines convolutional layers with transformer blocks for global context awareness, incorporates Atrous Spatial Pyramid Pooling (ASPP) for multi-scale feature extraction, and employs channel attention mechanisms to focus on tumor-relevant regions. Evaluated on three datasets (Dataset A, Dataset B, and a combined dataset), TuSegNet achieves state-of-the-art performance, with Dice Similarity Coefficients (DSC) of 0.895, 0.910, and 0.930, respectively, and Intersection over Union (IoU) scores of 0.820, 0.835, and 0.860. Ablation studies confirm the importance of the ASPP and attention modules, while comparative analysis shows that TuSegNet outperforms existing state-of-the-art (SOTA) models such as Swin UNet and TransUNet. The proposed methodology improves segmentation accuracy and highlights the value of hybrid architectures for complex medical imaging tasks. These results underscore the potential of TuSegNet for real-world healthcare applications in brain tumor diagnosis.

Bibliographic Details
Main Authors: Mir Nafiul Nagib, Rahat Pervez, Afsana Alam Nova, Hadiur Rahman Nabil, Zeyar Aung, M. F. Mridha
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Open Journal of the Computer Society
Subjects: Brain tumor segmentation, transformer-based architecture, attention mechanisms, medical image analysis, deep learning, computer vision
Online Access: https://ieeexplore.ieee.org/document/11002687/
_version_ 1849734700431572992
author Mir Nafiul Nagib
Rahat Pervez
Afsana Alam Nova
Hadiur Rahman Nabil
Zeyar Aung
M. F. Mridha
author_facet Mir Nafiul Nagib
Rahat Pervez
Afsana Alam Nova
Hadiur Rahman Nabil
Zeyar Aung
M. F. Mridha
author_sort Mir Nafiul Nagib
collection DOAJ
description Brain tumor segmentation is crucial in medical imaging, enabling informed diagnosis and treatment planning. In this study, we propose TuSegNet, a new transformer-based and attention-enhanced architecture for robust brain tumor segmentation. The model combines convolutional layers with transformer blocks for global context awareness, incorporates Atrous Spatial Pyramid Pooling (ASPP) for multi-scale feature extraction, and employs channel attention mechanisms to focus on tumor-relevant regions. Evaluated on three datasets (Dataset A, Dataset B, and a combined dataset), TuSegNet achieves state-of-the-art performance, with Dice Similarity Coefficients (DSC) of 0.895, 0.910, and 0.930, respectively, and Intersection over Union (IoU) scores of 0.820, 0.835, and 0.860. Ablation studies confirm the importance of the ASPP and attention modules, while comparative analysis shows that TuSegNet outperforms existing state-of-the-art (SOTA) models such as Swin UNet and TransUNet. The proposed methodology improves segmentation accuracy and highlights the value of hybrid architectures for complex medical imaging tasks. These results underscore the potential of TuSegNet for real-world healthcare applications in brain tumor diagnosis.
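The description above outlines a hybrid design (a convolutional encoder, transformer blocks for global context, an ASPP module for multi-scale features, and channel attention) and reports results as DSC and IoU. As a minimal sketch of how such building blocks and metrics are commonly implemented, the PyTorch code below is illustrative only and is not the authors' TuSegNet implementation; the module names (ChannelAttention, ASPPBlock), dilation rates, reduction ratio, and channel sizes are assumptions made for the example.

import torch
import torch.nn as nn

# Illustrative sketch only; not the published TuSegNet code.

def dice_coefficient(pred, target, eps=1e-6):
    # pred, target: binary masks of shape (N, H, W)
    inter = (pred * target).sum(dim=(1, 2))
    total = pred.sum(dim=(1, 2)) + target.sum(dim=(1, 2))
    return ((2 * inter + eps) / (total + eps)).mean()

def iou_score(pred, target, eps=1e-6):
    inter = (pred * target).sum(dim=(1, 2))
    union = pred.sum(dim=(1, 2)) + target.sum(dim=(1, 2)) - inter
    return ((inter + eps) / (union + eps)).mean()

class ChannelAttention(nn.Module):
    # Squeeze-and-excitation style channel gate; the reduction ratio is an assumption.
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(self.pool(x))  # reweight channels, keep spatial size

class ASPPBlock(nn.Module):
    # Parallel atrous (dilated) convolutions for multi-scale context; the rates are assumptions.
    def __init__(self, in_ch, out_ch, rates=(1, 6, 12, 18)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=r, dilation=r) for r in rates]
        )
        self.project = nn.Conv2d(out_ch * len(rates), out_ch, kernel_size=1)
        self.attn = ChannelAttention(out_ch)

    def forward(self, x):
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.attn(self.project(feats))

# Usage on dummy data: spatial size is preserved, channels change from 64 to 128.
x = torch.randn(2, 64, 32, 32)
y = ASPPBlock(64, 128)(x)  # shape: (2, 128, 32, 32)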
format Article
id doaj-art-8aacc01bf30c464f9725f426a4d6a019
institution DOAJ
issn 2644-1268
language English
publishDate 2025-01-01
publisher IEEE
record_format Article
series IEEE Open Journal of the Computer Society
spelling doaj-art-8aacc01bf30c464f9725f426a4d6a019
record timestamp: 2025-08-20T03:07:44Z
language: eng
publisher: IEEE
journal: IEEE Open Journal of the Computer Society (ISSN 2644-1268)
published: 2025-01-01, vol. 6, pp. 750-761
doi: 10.1109/OJCS.2025.3569758
ieee document: 11002687
title: TuSegNet: A Transformer-Based and Attention-Enhanced Architecture for Brain Tumor Segmentation
authors and affiliations:
Mir Nafiul Nagib, Department of Information Technology, Washington University of Science and Technology, Alexandria, VA, USA
Rahat Pervez, Bay Atlantic University, Washington, DC, USA
Afsana Alam Nova, Department of Information Technology, Washington University of Science and Technology, Alexandria, VA, USA
Hadiur Rahman Nabil (https://orcid.org/0009-0005-4311-2875), Department of Computer Science and Engineering, American International University, Dhaka, Bangladesh
Zeyar Aung (https://orcid.org/0000-0001-5990-9305), Department of Computer Science, Khalifa University, Abu Dhabi, UAE
M. F. Mridha (https://orcid.org/0000-0001-5738-1631), Department of Computer Science and Engineering, American International University, Dhaka, Bangladesh
description: Brain tumor segmentation is crucial in medical imaging, enabling informed diagnosis and treatment planning. In this study, we propose TuSegNet, a new transformer-based and attention-enhanced architecture for robust brain tumor segmentation. The model combines convolutional layers with transformer blocks for global context awareness, incorporates Atrous Spatial Pyramid Pooling (ASPP) for multi-scale feature extraction, and employs channel attention mechanisms to focus on tumor-relevant regions. Evaluated on three datasets (Dataset A, Dataset B, and a combined dataset), TuSegNet achieves state-of-the-art performance, with Dice Similarity Coefficients (DSC) of 0.895, 0.910, and 0.930, respectively, and Intersection over Union (IoU) scores of 0.820, 0.835, and 0.860. Ablation studies confirm the importance of the ASPP and attention modules, while comparative analysis shows that TuSegNet outperforms existing state-of-the-art (SOTA) models such as Swin UNet and TransUNet. The proposed methodology improves segmentation accuracy and highlights the value of hybrid architectures for complex medical imaging tasks. These results underscore the potential of TuSegNet for real-world healthcare applications in brain tumor diagnosis.
url: https://ieeexplore.ieee.org/document/11002687/
keywords: Brain tumor segmentation; transformer-based architecture; attention mechanisms; medical image analysis; deep learning; computer vision
spellingShingle Mir Nafiul Nagib
Rahat Pervez
Afsana Alam Nova
Hadiur Rahman Nabil
Zeyar Aung
M. F. Mridha
TuSegNet: A Transformer-Based and Attention-Enhanced Architecture for Brain Tumor Segmentation
IEEE Open Journal of the Computer Society
Brain tumor segmentation
transformer-based architecture
attention mechanisms
medical image analysis
deep learning
computer vision
title TuSegNet: A Transformer-Based and Attention-Enhanced Architecture for Brain Tumor Segmentation
title_full TuSegNet: A Transformer-Based and Attention-Enhanced Architecture for Brain Tumor Segmentation
title_fullStr TuSegNet: A Transformer-Based and Attention-Enhanced Architecture for Brain Tumor Segmentation
title_full_unstemmed TuSegNet: A Transformer-Based and Attention-Enhanced Architecture for Brain Tumor Segmentation
title_short TuSegNet: A Transformer-Based and Attention-Enhanced Architecture for Brain Tumor Segmentation
title_sort tusegnet a transformer based and attention enhanced architecture for brain tumor segmentation
topic Brain tumor segmentation
transformer-based architecture
attention mechanisms
medical image analysis
deep learning
computer vision
url https://ieeexplore.ieee.org/document/11002687/
work_keys_str_mv AT mirnafiulnagib tusegnetatransformerbasedandattentionenhancedarchitectureforbraintumorsegmentation
AT rahatpervez tusegnetatransformerbasedandattentionenhancedarchitectureforbraintumorsegmentation
AT afsanaalamnova tusegnetatransformerbasedandattentionenhancedarchitectureforbraintumorsegmentation
AT hadiurrahmannabil tusegnetatransformerbasedandattentionenhancedarchitectureforbraintumorsegmentation
AT zeyaraung tusegnetatransformerbasedandattentionenhancedarchitectureforbraintumorsegmentation
AT mfmridha tusegnetatransformerbasedandattentionenhancedarchitectureforbraintumorsegmentation