SoftGrasp: Adaptive grasping for dexterous hand based on multimodal imitation learning

Biomimetic grasping is crucial for robots to interact with the environment and perform complex tasks, making it a key focus in robotics and embodied intelligence. However, achieving human-level finger coordination and force control remains challenging due to the need for multimodal perception, including visual, kinesthetic, and tactile feedback. Although some recent approaches have demonstrated remarkable performance in grasping diverse objects, they often rely on expensive tactile sensors or are restricted to rigid objects. To address these challenges, we introduce SoftGrasp, a novel multimodal imitation learning approach for adaptive, multi-stage grasping of objects with varying sizes, shapes, and hardness. First, we develop an immersive demonstration platform with force feedback to collect rich, human-like grasping datasets. Inspired by human proprioceptive manipulation, this platform gathers multimodal signals, including visual images, robot finger joint angles, and joint torques, during demonstrations. Next, we utilize a multi-head attention mechanism to align and integrate multimodal features, dynamically allocating attention to ensure comprehensive learning. On this basis, we design a behavior cloning method based on an angle-torque loss function, enabling multimodal imitation learning. Finally, we validate SoftGrasp in extensive experiments across various scenarios, demonstrating its ability to adaptively adjust joint forces and finger angles based on real-time inputs. These capabilities result in a 98% success rate in real-world experiments, achieving dexterous and stable grasping. Source code and demonstration videos are available at https://github.com/nubot-nudt/SoftGrasp.
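The abstract's multi-head attention fusion of visual, joint-angle, and joint-torque features can be pictured with the minimal sketch below: each modality is projected into a shared embedding space, the three embeddings are treated as a short token sequence, and self-attention weights them dynamically. All dimensions, layer sizes, and names here are illustrative assumptions rather than the authors' implementation; see the linked repository for the actual code.

# Minimal sketch (assumed dimensions and single-layer design), not the paper's architecture.
import torch
import torch.nn as nn


class MultimodalFusion(nn.Module):
    def __init__(self, img_dim=512, angle_dim=16, torque_dim=16, d_model=128, n_heads=4):
        super().__init__()
        # Per-modality projections into a shared embedding space (sizes are assumptions).
        self.proj_img = nn.Linear(img_dim, d_model)
        self.proj_angle = nn.Linear(angle_dim, d_model)
        self.proj_torque = nn.Linear(torque_dim, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, img_feat, joint_angles, joint_torques):
        # Stack the three modality embeddings as a length-3 token sequence: (B, 3, d_model).
        tokens = torch.stack(
            [self.proj_img(img_feat),
             self.proj_angle(joint_angles),
             self.proj_torque(joint_torques)], dim=1)
        fused, attn_weights = self.attn(tokens, tokens, tokens)  # dynamic per-modality weighting
        fused = self.norm(fused + tokens)                        # residual connection + layer norm
        return fused.flatten(1), attn_weights                    # fused feature of shape (B, 3*d_model)


if __name__ == "__main__":
    fusion = MultimodalFusion()
    out, w = fusion(torch.randn(2, 512), torch.randn(2, 16), torch.randn(2, 16))
    print(out.shape, w.shape)  # torch.Size([2, 384]) torch.Size([2, 3, 3])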

Bibliographic Details
Main Authors: Yihong Li, Ce Guo, Junkai Ren, Bailiang Chen, Chuang Cheng, Hui Zhang, Huimin Lu
Format: Article
Language: English
Published: Elsevier, 2025-06-01
Series: Biomimetic Intelligence and Robotics
Subjects: Adaptive grasping; Dexterous hand; Multimodal fusion; Imitation learning
Online Access: http://www.sciencedirect.com/science/article/pii/S2667379725000087
author Yihong Li
Ce Guo
Junkai Ren
Bailiang Chen
Chuang Cheng
Hui Zhang
Huimin Lu
collection DOAJ
description Biomimetic grasping is crucial for robots to interact with the environment and perform complex tasks, making it a key focus in robotics and embodied intelligence. However, achieving human-level finger coordination and force control remains challenging due to the need for multimodal perception, including visual, kinesthetic, and tactile feedback. Although some recent approaches have demonstrated remarkable performance in grasping diverse objects, they often rely on expensive tactile sensors or are restricted to rigid objects. To address these challenges, we introduce SoftGrasp, a novel multimodal imitation learning approach for adaptive, multi-stage grasping of objects with varying sizes, shapes, and hardness. First, we develop an immersive demonstration platform with force feedback to collect rich, human-like grasping datasets. Inspired by human proprioceptive manipulation, this platform gathers multimodal signals, including visual images, robot finger joint angles, and joint torques, during demonstrations. Next, we utilize a multi-head attention mechanism to align and integrate multimodal features, dynamically allocating attention to ensure comprehensive learning. On this basis, we design a behavior cloning method based on an angle-torque loss function, enabling multimodal imitation learning. Finally, we validate SoftGrasp in extensive experiments across various scenarios, demonstrating its ability to adaptively adjust joint forces and finger angles based on real-time inputs. These capabilities result in a 98% success rate in real-world experiments, achieving dexterous and stable grasping. Source code and demonstration videos are available at https://github.com/nubot-nudt/SoftGrasp.
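The description above also mentions a behavior cloning method built on an angle-torque loss. A hedged sketch of such an objective is given below: a weighted sum of imitation errors on the demonstrated joint angles and joint torques. The MSE form, the weights, the joint count, and the function name are assumptions for illustration; the actual loss used by the authors may differ.

# Illustrative angle-torque behavior cloning loss (assumed form), not the paper's exact objective.
import torch
import torch.nn.functional as F


def angle_torque_loss(pred_angles, pred_torques, demo_angles, demo_torques,
                      w_angle=1.0, w_torque=1.0):
    """Weighted sum of joint-angle and joint-torque imitation errors (hypothetical helper)."""
    loss_angle = F.mse_loss(pred_angles, demo_angles)      # match demonstrated finger joint angles
    loss_torque = F.mse_loss(pred_torques, demo_torques)   # match demonstrated joint torques
    return w_angle * loss_angle + w_torque * loss_torque


if __name__ == "__main__":
    B, J = 8, 16  # batch size and number of finger joints (assumed)
    loss = angle_torque_loss(torch.randn(B, J), torch.randn(B, J),
                             torch.randn(B, J), torch.randn(B, J))
    print(loss.item())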
format Article
id doaj-art-1f08e2347dfb425eb1bb2439b94cb001
institution DOAJ
issn 2667-3797
language English
publishDate 2025-06-01
publisher Elsevier
record_format Article
series Biomimetic Intelligence and Robotics
spelling doaj-art-1f08e2347dfb425eb1bb2439b94cb001 2025-08-20T02:55:05Z
doi 10.1016/j.birob.2025.100217
citation Biomimetic Intelligence and Robotics, Elsevier, vol. 5, no. 2, art. 100217, 2025-06-01
affiliation (all authors) The College of Intelligence Science and Technology, National University of Defense Technology, Changsha 410073, China
corresponding_authors Junkai Ren; Hui Zhang
title SoftGrasp: Adaptive grasping for dexterous hand based on multimodal imitation learning
topic Adaptive grasping
Dexterous hand
Multimodal fusion
Imitation learning
url http://www.sciencedirect.com/science/article/pii/S2667379725000087