MG6D: A Deep Fusion Approach for 6D Pose Estimation With Mamba and Graph Convolution Network


Bibliographic Details
Main Authors: Jiaqi Zhu, Bin Li, Xinhua Zhao
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/11021472/
Description
Summary: Accurate and efficient 6D pose estimation is a fundamental technology in many industrial applications. While existing dense correspondence methods have made progress, they struggle with multimodal feature fusion in complex scenarios involving occlusions, illumination variations, and sensor noise. This paper proposes a novel 6D pose estimation framework that addresses these limitations through a hybrid Mamba-Graph architecture. The algorithm first introduces a panoramic attention fusion Mamba module, which leverages state-space modeling to capture long-range dependencies in multimodal data while establishing cross-dimensional interactions between channel and spatial features to emphasize critical information. A dynamic graph convolutional adaptive fusion module is then designed to enable cross-modal geometric consistency modeling via multimodal feature integration. Finally, a texture-geometry co-driven keypoint selection mechanism is proposed to ensure that keypoint distributions satisfy both spatial uniformity and discriminability requirements. Experimental results on three benchmark datasets demonstrate that the proposed algorithm achieves ADD(-S) metrics of 99.82%, 80.26%, and 97.2%, respectively. Notably, it exhibits significant advantages in pose estimation for objects with repetitive textures and high symmetry.
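To make the "texture-geometry co-driven keypoint selection" idea concrete, the sketch below shows one plausible way such a mechanism could work: each candidate point gets a combined texture/geometry score (discriminability), and points are then picked greedily so that each new keypoint is both high-scoring and far from those already chosen (spatial uniformity). The function name, the `alpha` weighting, and the score-times-distance gain are illustrative assumptions, not the paper's exact formulation.

```python
import math

def select_keypoints(points, texture_score, geometry_score, k=8, alpha=0.5):
    """Greedy keypoint selection balancing discriminability and spread.

    points         : list of (x, y, z) candidate coordinates
    texture_score  : per-point texture distinctiveness in [0, 1]
    geometry_score : per-point geometric distinctiveness in [0, 1]
    alpha          : texture-vs-geometry trade-off (assumed, not from the paper)
    Returns indices of the k selected keypoints.
    """
    # Combined per-point discriminability score.
    score = [alpha * t + (1 - alpha) * g
             for t, g in zip(texture_score, geometry_score)]

    # Seed with the single most discriminative point.
    chosen = [max(range(len(points)), key=lambda i: score[i])]

    while len(chosen) < k:
        def gain(i):
            if i in chosen:
                return -math.inf  # never re-pick a selected point
            # Distance to the nearest already-chosen keypoint enforces
            # spatial uniformity (farthest-point-sampling style).
            d = min(math.dist(points[i], points[j]) for j in chosen)
            return score[i] * d
        chosen.append(max(range(len(points)), key=gain))
    return chosen
```

In this sketch, a point with a mediocre score can still be selected if it lies far from all current keypoints, which is what keeps the distribution spread over the object surface rather than clustered on the most textured region.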
ISSN: 2169-3536