Foreground and Background Interaction Fusion Network for Camouflaged Object Detection

Bibliographic Details
Main Authors: WEI Mingjun, LIU Ming, LIU Yazhi, LI Hui
Format: Article
Language: Chinese (zho)
Published: Harbin University of Science and Technology Publications, 2025-04-01
Series: Journal of Harbin University of Science and Technology
Online Access: https://hlgxb.hrbust.edu.cn/#/digest?ArticleID=2413
Description
Summary: To address the incomplete detection results and blurred edge details of current camouflaged object detection (COD) methods, a novel Foreground and Background Interactive Fusion Network (FBIFNet) is proposed, which improves COD performance through joint exploration of foreground and background regions. FBIFNet is built around a key Bilateral Interactive Fusion (BIF) module, which uses a pair of complementary attentions to guide the network to jointly reason about camouflaged objects from both the foreground and the background direction; an interaction strategy based on a bidirectional attention mechanism, together with a weighted fusion strategy, lets the network learn the complementary information between foreground and background. In addition, an Attentional Cascaded Positioning (ACP) module localizes camouflaged objects from a global perspective and provides more accurate foreground and background guidance for BIF. With these two modules, FBIFNet detects camouflaged objects more accurately. Extensive experiments on three public datasets (CAMO, COD10K, and NC4K) demonstrate that the proposed network outperforms state-of-the-art methods in related fields on four evaluation metrics.
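The abstract's central idea — complementary foreground/background attentions derived from a coarse location prior, followed by a weighted fusion of the two branches — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function name `bilateral_fusion`, the fixed fusion weights `w_fg`/`w_bg`, and the use of a sigmoid over the prior are all assumptions for illustration (in the actual network these would be learned and operate on multi-channel feature tensors).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bilateral_fusion(feat, prior, w_fg=0.6, w_bg=0.4):
    """Hypothetical sketch of bilateral foreground/background fusion.

    feat  : feature map (H, W)
    prior : coarse localization logits (H, W), e.g. from a positioning module
    """
    # Foreground attention from the coarse prior; background attention
    # is its complement, giving the pair of complementary attentions.
    fg_att = sigmoid(prior)
    bg_att = 1.0 - fg_att
    # Each attention reweights the shared feature map, producing a
    # foreground-guided and a background-guided branch.
    fg_feat = feat * fg_att
    bg_feat = feat * bg_att
    # Weighted fusion combines the complementary branches.
    return w_fg * fg_feat + w_bg * bg_feat

feat = np.ones((4, 4))
prior = np.zeros((4, 4))   # uninformative prior -> both attentions are 0.5
fused = bilateral_fusion(feat, prior)
```

With an uninformative prior both attention maps equal 0.5 everywhere, so the fused output is simply a rescaled copy of the input features; a confident prior would instead emphasize foreground regions in one branch and background regions in the other before fusion.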
ISSN: 1007-2683