Reach&Grasp: a multimodal dataset of the whole upper-limb during simple and complex movements
Main Authors: | , , , , , , , , , , , , |
---|---|
Format: | Article |
Language: | English |
Published: | Nature Portfolio, 2025-02-01 |
Series: | Scientific Data |
Online Access: | https://doi.org/10.1038/s41597-025-04552-5 |
Summary: | Abstract Upper-limb movement characterization is crucial for many applications, from research on motor control to the extraction of relevant features for driving active prostheses. While this is usually performed using electrophysiological and/or kinematic measurements only, the collection of tactile data during grasping movements could enrich the overall information about interaction with the external environment. We provide a dataset collected from 10 healthy volunteers performing 16 tasks, including simple movements (i.e., hand opening/closing, wrist pronation/supination and flexion/extension, tridigital grasping, thumb abduction, cylindrical and spherical grasping) and more complex ones (i.e., reaching and grasping). The novelty lies in the inclusion of several types of recordings, namely electromyographic (both bipolar and high-density configurations), kinematic (both a motion-capture system and a sensorized glove), and tactile. The data are organized following the Brain Imaging Data Structure standard format and have been validated to ensure their reliability. They can be used to investigate upper-limb movements in physiological conditions, and to test sensor-fusion approaches and control algorithms for prosthetics and robotic applications. |
ISSN: | 2052-4463 |
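The summary notes that the dataset is organized following the Brain Imaging Data Structure (BIDS) convention, which prescribes a `sub-<label>/.../sub-<label>_task-<label>_<suffix>.<ext>` naming scheme. As a rough illustration only (the subject, task, and modality labels below are hypothetical and not taken from the actual Reach&Grasp release), a minimal BIDS-style tree can be sketched and enumerated like this:

```python
# Minimal sketch of a BIDS-style layout, assuming hypothetical task and
# modality labels; the real dataset's entities may differ.
from pathlib import Path
import tempfile


def make_demo_layout(root: Path) -> list[Path]:
    """Create a tiny BIDS-like tree for one subject and two tasks."""
    files = []
    sub = root / "sub-01"
    # Hypothetical task labels echoing the simple/complex tasks in the abstract.
    for task in ("handopenclose", "reachgrasp"):
        # Hypothetical modality folders for EMG and kinematic recordings.
        for modality, suffix in (("emg", "_emg.tsv"), ("motion", "_motion.tsv")):
            d = sub / modality
            d.mkdir(parents=True, exist_ok=True)
            f = d / f"sub-01_task-{task}{suffix}"
            f.write_text("onset\tvalue\n0.0\t0.0\n")  # placeholder content
            files.append(f)
    return files


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        for f in make_demo_layout(Path(tmp)):
            print(f.relative_to(tmp))
```

This kind of predictable layout is what lets generic BIDS tooling discover subjects, tasks, and modalities without dataset-specific code.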