Vahagn: VisuAl Haptic Attention Gate Net for slip detection

Introduction: Slip detection is crucial for achieving stable grasping and subsequent operational tasks. A grasp action is a continuous process that requires information from multiple sources. The success of a specific grasping maneuver is contingent upon the confluence of two factors: the spatial accu...

Bibliographic Details
Main Authors: Jinlin Wang, Yulong Ji, Hongyu Yang
Format: Article
Language:English
Published: Frontiers Media S.A. 2024-11-01
Series:Frontiers in Neurorobotics
Subjects:
Online Access:https://www.frontiersin.org/articles/10.3389/fnbot.2024.1484751/full
collection DOAJ
description Introduction: Slip detection is crucial for achieving stable grasping and subsequent operational tasks. A grasp action is a continuous process that requires information from multiple sources. The success of a specific grasping maneuver is contingent upon the confluence of two factors: the spatial accuracy of the contact and the stability of the continuous process. Methods: In this paper, for the task of perceiving grasping results from visual-haptic information, we propose a new slip-detection method that synergizes visual and haptic information across both the spatial and temporal dimensions. Specifically, the method takes as input a sequence of visual images from a first-person perspective and a sequence of haptic images from a gripper. It then extracts time-dependent features of the whole process, along with spatial features weighted by the importance of different regions, using distinct attention mechanisms. Inspired by neurological studies, during information fusion we adjust the temporal and spatial information from vision and haptics through a combination of two-step fusion and gate units. Results and discussion: To validate the effectiveness of the method, we compared it with traditional CNN networks and with attention-based models. Our method achieves a classification accuracy of 93.59%, which is higher than that of previous works. Attention visualizations are further presented to support its validity.
format Article
id doaj-art-0d3d07fdd65e48b8bc4c5107533602bd
institution OA Journals
issn 1662-5218
language English
publishDate 2024-11-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Neurorobotics
spelling doaj-art-0d3d07fdd65e48b8bc4c5107533602bd (indexed 2025-08-20T02:12:30Z)
doi 10.3389/fnbot.2024.1484751
volume 18
Author affiliations:
Jinlin Wang — National Key Laboratory of Fundamental Science on Synthetic Vision, Sichuan University, Chengdu, China
Yulong Ji — School of Aeronautics and Astronautics, Sichuan University, Chengdu, China
Hongyu Yang — National Key Laboratory of Fundamental Science on Synthetic Vision, Sichuan University, Chengdu, China
title Vahagn: VisuAl Haptic Attention Gate Net for slip detection
topic multimodal perception
multimodal deep learning
attention mechanism
haptic
robot perception
url https://www.frontiersin.org/articles/10.3389/fnbot.2024.1484751/full
work_keys_str_mv AT jinlinwang vahagnvisualhapticattentiongatenetforslipdetection
AT yulongji vahagnvisualhapticattentiongatenetforslipdetection
AT hongyuyang vahagnvisualhapticattentiongatenetforslipdetection