TDA SegUNet: Topological Data Analysis-Based Shape-Aware Brain Tumor Segmentation

Bibliographic Details
Main Authors: Ansar Rahman, Ayesha Satti, Ahmad Raza Shahid, Qaisar M. Shafi, Khadija Farooq, Asad Ali Safi
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10891573/
Description
Summary: Brain tumors are a pressing concern for the medical community. Advanced MRI techniques are essential for the accurate detection and segmentation of tumors and their sub-regions, especially the enhancing tumor. However, due to the complex structure of the human brain, brain tumor segmentation remains a challenging task. This study introduces TDA-SegUNet, a novel 2D segmentation model that combines topological data analysis (TDA) with a U-Net backbone for the precise segmentation of brain tumors and their sub-regions. TDA-SegUNet uses TDA to extract shape-based local and global features from MRI scans: 0- and 1-dimensional homology-based persistence images (PIs) capture local and global features, respectively. The BraTS20 MRI CCA-Patch dataset, together with its corresponding PIs, was used for model evaluation. The proposed model was trained for 2D segmentation on the BraTS20 CCA-Patch dataset and outperformed state-of-the-art techniques in segmenting the whole tumor, core tumor, and enhancing tumor regions, achieving validation Dice scores of 90.37%, 89.56%, and 81.33%, respectively.
ISSN: 2169-3536
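
To illustrate the kind of pipeline the abstract describes, the sketch below computes 0- and 1-dimensional persistence diagrams from a 2D intensity patch and vectorizes them as persistence images. This is not the authors' code: the library (gudhi), the cubical intensity filtration, the function name persistence_images_from_patch, and all parameter values (resolution, bandwidth) are assumptions made for illustration only.

# Illustrative sketch (assumed implementation, not from the paper):
# cubical persistence on a 2D patch -> H0/H1 persistence images.
import numpy as np
import gudhi
from gudhi.representations import PersistenceImage

def persistence_images_from_patch(patch: np.ndarray, resolution=(20, 20)):
    """Return (PI_H0, PI_H1) persistence images for a 2D intensity patch."""
    # Build a cubical complex filtered by voxel intensity.
    cc = gudhi.CubicalComplex(top_dimensional_cells=patch)
    cc.compute_persistence()

    pis = []
    for dim in (0, 1):  # H0: connected components; H1: loops/holes
        dgm = cc.persistence_intervals_in_dimension(dim)
        if len(dgm) == 0:
            pis.append(np.zeros(resolution))
            continue
        dgm = np.array(dgm, dtype=float)
        # Replace infinite deaths of essential classes with the max intensity.
        dgm[np.isinf(dgm[:, 1]), 1] = patch.max()
        # Vectorize the diagram as a persistence image (bandwidth is assumed).
        pi = PersistenceImage(bandwidth=1.0, resolution=list(resolution))
        pis.append(pi.fit_transform([dgm])[0].reshape(resolution))
    return pis[0], pis[1]

# Usage on a synthetic patch standing in for a BraTS CCA patch:
patch = np.random.rand(64, 64).astype(np.float32)
pi_h0, pi_h1 = persistence_images_from_patch(patch)
print(pi_h0.shape, pi_h1.shape)  # (20, 20) (20, 20)

In a setup like the one the abstract outlines, such per-patch persistence images would be supplied alongside the MRI patches as additional shape-aware inputs to the U-Net; how the paper fuses them with the image channels is not specified here.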