MM-BiFPN: Multi-Modality Fusion Network With Bi-FPN for MRI Brain Tumor Segmentation

For medical imaging tasks, it is common practice to work with a multi-modality image dataset, as experts prefer using multiple medical devices to diagnose a disease. Each device highlights different aspects of the segmentation target, which in our case is magnetic resonance imaging (MRI) brain tumor segmentation. For such tasks, researchers tend to combine all modalities into a single network input for feature extraction and neglect the complex relationships between the different modalities. Encoder-decoder models with residual connections that pass information from high-resolution to lower-resolution feature maps are no longer novel in medical segmentation. In this work, we propose a multi-modality fusion network with a bi-directional feature pyramid network (MM-BiFPN) that uses an individual encoder to extract the features of each of the four modalities (FLAIR, T1-weighted, T1-c, and T2-weighted), focusing on the exploitation of the complex relationships among the modalities. In addition, a bi-directional feature pyramid network (Bi-FPN) layer aggregates the multiple modalities to capture cross-modality relationships and multi-scale features. Our experiments were conducted on the MICCAI BraTS2018 and MICCAI BraTS2020 brain tumor segmentation challenge datasets. We also performed two ablation studies: one comparing different cross-scale modality fusion networks, and one on different modality settings to examine the contribution of each modality to detecting tumor content. With missing modalities, our method achieves comparable results, demonstrating that it is robust for brain tumor segmentation.
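The abstract describes the architecture only at a high level: one encoder per MRI modality, with a Bi-FPN layer aggregating the per-modality features across scales. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration of that idea under assumed details (2D toy inputs, illustrative channel sizes, a single simplified BiFPN pass, and hypothetical class names such as ModalityEncoder, SimpleBiFPNLayer, and MMBiFPNSketch). The paper's actual layer configuration, loss, and training setup are not reproduced here.

# Minimal PyTorch sketch (assumed, not the authors' code): one small encoder per
# MRI modality (FLAIR, T1, T1c, T2) and a simplified BiFPN-style cross-scale
# fusion layer. Channel sizes, depths, and class names are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with BatchNorm and ReLU.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class ModalityEncoder(nn.Module):
    # Per-modality encoder returning features at three scales (1, 1/2, 1/4 resolution).
    def __init__(self, channels=(32, 64, 128)):
        super().__init__()
        self.stage1 = conv_block(1, channels[0])
        self.stage2 = conv_block(channels[0], channels[1])
        self.stage3 = conv_block(channels[1], channels[2])
        self.pool = nn.MaxPool2d(2)

    def forward(self, x):
        f1 = self.stage1(x)
        f2 = self.stage2(self.pool(f1))
        f3 = self.stage3(self.pool(f2))
        return [f1, f2, f3]


class SimpleBiFPNLayer(nn.Module):
    # One top-down + bottom-up pass with learnable, normalized fusion weights.
    def __init__(self, ch=64, num_levels=3):
        super().__init__()
        self.td_convs = nn.ModuleList([conv_block(ch, ch) for _ in range(num_levels - 1)])
        self.bu_convs = nn.ModuleList([conv_block(ch, ch) for _ in range(num_levels - 1)])
        self.w_td = nn.Parameter(torch.ones(num_levels - 1, 2))
        self.w_bu = nn.Parameter(torch.ones(num_levels - 1, 2))

    @staticmethod
    def _fuse(a, b, w):
        w = F.relu(w)                      # keep fusion weights non-negative
        return (w[0] * a + w[1] * b) / (w.sum() + 1e-4)

    def forward(self, feats):              # feats ordered high-res -> low-res
        n = len(feats)
        td = list(feats)
        for i in range(n - 2, -1, -1):     # top-down: coarse context to finer levels
            up = F.interpolate(td[i + 1], size=feats[i].shape[-2:],
                               mode="bilinear", align_corners=False)
            td[i] = self.td_convs[i](self._fuse(feats[i], up, self.w_td[i]))
        out = [td[0]]
        for i in range(1, n):              # bottom-up: fine detail to coarser levels
            down = F.max_pool2d(out[-1], kernel_size=2)
            out.append(self.bu_convs[i - 1](self._fuse(td[i], down, self.w_bu[i - 1])))
        return out


class MMBiFPNSketch(nn.Module):
    # Four modality encoders -> per-scale concatenation -> BiFPN fusion -> segmentation head.
    def __init__(self, num_classes=4, enc_channels=(32, 64, 128), fpn_ch=64):
        super().__init__()
        self.encoders = nn.ModuleList([ModalityEncoder(enc_channels) for _ in range(4)])
        self.merge = nn.ModuleList([nn.Conv2d(4 * c, fpn_ch, 1) for c in enc_channels])
        self.bifpn = SimpleBiFPNLayer(fpn_ch, num_levels=len(enc_channels))
        self.head = nn.Conv2d(fpn_ch, num_classes, 1)

    def forward(self, flair, t1, t1c, t2):
        per_modality = [enc(x) for enc, x in zip(self.encoders, (flair, t1, t1c, t2))]
        fused = [m(torch.cat([per_modality[k][s] for k in range(4)], dim=1))
                 for s, m in enumerate(self.merge)]
        fused = self.bifpn(fused)
        return self.head(fused[0])          # logits at the finest scale


if __name__ == "__main__":
    slices = [torch.randn(1, 1, 64, 64) for _ in range(4)]  # toy 2D slices, one per modality
    print(MMBiFPNSketch()(*slices).shape)   # torch.Size([1, 4, 64, 64])

Running the toy example prints torch.Size([1, 4, 64, 64]), i.e. per-pixel logits for four illustrative classes; the learnable, normalized weights in SimpleBiFPNLayer are the BiFPN-style element that lets the network weight each scale's contribution during cross-scale aggregation.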

Bibliographic Details
Main Authors: Nur Suriza Syazwany (ORCID: 0000-0001-8073-7974), Ju-Hyeon Nam, Sang-Chul Lee (ORCID: 0000-0002-6973-2416), all with the Department of Computer Science and Engineering, Inha University, Incheon, South Korea
Format: Article
Language: English
Published: IEEE, 2021-01-01
Series: IEEE Access, vol. 9, pp. 160708-160720
DOI: 10.1109/ACCESS.2021.3132050
ISSN: 2169-3536
Subjects: Bi-FPN; brain tumor; MRI; multimodality fusion
Online Access:https://ieeexplore.ieee.org/document/9632555/