FDGSNet: A Multimodal Gated Segmentation Network for Remote Sensing Image Based on Frequency Decomposition


Bibliographic Details
Main Authors: Jian Cui, Jiahang Liu, Yue Ni, Jinjin Wang, Manchun Li
Format: Article
Language: English
Published: IEEE, 2024-01-01
Series: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Online Access: https://ieeexplore.ieee.org/document/10700993/
Description
Summary: Multimodal data fusion can provide valuable and diverse information for remote sensing image segmentation. However, existing fusion methods often lose features when fusing data from different modalities, and the complementarity among multimodal features is insufficiently exploited. To address these problems, we propose a multimodal gated segmentation network for remote sensing images based on frequency decomposition. Complementary information is extracted from multimodal features by establishing long-distance correlations between the low-frequency components of the different modalities. In addition, the high-frequency detail features of each modality are preserved through residual connections. An adaptive gated fusion method then controls the information flow between the complementary information and each modality's feature map, enabling adaptive fusion of multimodal features. These operations effectively improve the method's adaptability to varied scenes and data changes. Extensive experiments demonstrate that the proposed method is effective, robust, and generalizable, achieving state-of-the-art performance on several remote sensing semantic segmentation tasks.
ISSN: 1939-1404, 2151-1535
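The adaptive gated fusion described in the summary weights each modality's feature map with a learned gate so the network decides, per element, how much of each modality to keep. The paper does not give the exact formulation here, so the following is only a minimal NumPy sketch of one common gating scheme (a sigmoid gate over concatenated features producing a convex combination); the function and weight names are illustrative, not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(feat_a, feat_b, w, b):
    """Fuse two modality feature maps with an adaptive gate.

    A gate in (0, 1) is computed from the concatenated features and
    blends the two modalities element-wise: gate * A + (1 - gate) * B.
    Shapes: feat_a, feat_b are (H, W, C); w is (2C, C); b is (C,).
    """
    stacked = np.concatenate([feat_a, feat_b], axis=-1)  # (H, W, 2C)
    gate = sigmoid(stacked @ w + b)                      # (H, W, C)
    return gate * feat_a + (1.0 - gate) * feat_b

# Toy example: two 4x4 feature maps with C = 2 channels.
rng = np.random.default_rng(0)
fa = rng.standard_normal((4, 4, 2))
fb = rng.standard_normal((4, 4, 2))
w = rng.standard_normal((4, 2)) * 0.1  # maps 2C = 4 -> C = 2
b = np.zeros(2)
fused = gated_fusion(fa, fb, w, b)
print(fused.shape)  # (4, 4, 2)
```

Because the gate lies strictly between 0 and 1, every fused value stays between the corresponding values of the two inputs, so neither modality's signal is discarded outright.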