RehabHand—A New Physical Rehabilitation Training Dataset: Construction and Benchmark Performances of the Relevant Hand Tasks

Bibliographic Details
Main Authors: Sinh Huy Nguyen, Hoang Bach Nguyen, Thi Thu Hong Le, Chi Thanh Nguyen, Thi Thanh Huyen Nguyen, Quynh Tho Chu, Thi Lan Le, Thanh Hai Tran, Hai Vu
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/11029240/
Description
Summary: Physical training for a person with an impaired hand generally requires extensive guidance and supervision from rehabilitation therapists. Currently, the advantages of wearable sensing technologies and the breakthrough performance of deep neural networks offer feasible solutions for developing assistive tools, particularly for automatically monitoring the functional use of a patient's hand. Such applications help report the results of rehabilitation training, such as the interaction time between hands and objects and the number of interactions during each practice session. This reduces the burden on physiotherapists and occupational therapists and benefits related research in areas such as rehabilitation assessment and cognitive science. However, the lack of datasets, in particular a dedicated dataset of rehabilitation exercises collected with body-worn sensors, hinders these potential advantages. In this paper, we construct such a multimodal dataset and present a comprehensive study of the relevant hand tasks using state-of-the-art deep neural models. The dataset is collected during a series of rehabilitation exercises performed by ten patients undergoing treatment after injuries or strokes. It includes several wearable modalities, such as cameras, accelerometers, and gyroscopes. Table-based rehabilitation exercises are designed with a ball, water bottles, wooden blocks, and cylindrical blocks. In total, 56 exercise videos comprising more than 54 GB of data are collected. These videos are segmented into 431 sequences, each presenting an instance of an exercise at a given time. To evaluate the performance of recent neural networks, we extract 4,500 still images labeled with eight classes, including the left and right hands and the other objects involved in the exercises. In addition, the patient's hands are labeled in consecutive frames across 40 shots, yielding approximately 29K annotated images. These labeled datasets are used to evaluate three primary tasks related to assessing physical hand function: hand detection with one- and two-stage neural networks, identification of the patient's left and right hands, and hand tracking with the SORT and DeepSORT algorithms. We report a thorough performance analysis of the neural network models on the constructed dataset and discuss the results in terms of the technical issues and challenges of developing a robust clinical tool. The dataset and benchmark performances are available for download online: http://mica.edu.vn:50211/rehabhand/
ISSN: 2169-3536
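
The summary above mentions hand tracking with the SORT and DeepSORT algorithms. The following is a minimal illustrative sketch, in Python, of the IoU-based detection-to-track association step that SORT-style trackers rely on; the box format, the greedy matching (standing in for the Kalman prediction plus Hungarian assignment used by SORT proper), and the 0.3 IoU threshold are assumptions for illustration, not the paper's implementation.

# Minimal sketch of IoU-based detection-to-track association, the core step
# in SORT-style trackers mentioned in the benchmark tasks. Box format
# (x1, y1, x2, y2) and the 0.3 IoU threshold are illustrative assumptions.

from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def associate(tracks: List[Box], detections: List[Box],
              iou_thresh: float = 0.3) -> List[Tuple[int, int]]:
    """Greedily match each track to the highest-IoU unassigned detection."""
    matches = []
    matched_tracks, matched_dets = set(), set()
    # Consider candidate (track, detection) pairs from best to worst overlap.
    pairs = sorted(
        ((iou(t, d), ti, di)
         for ti, t in enumerate(tracks)
         for di, d in enumerate(detections)),
        reverse=True,
    )
    for score, ti, di in pairs:
        if score < iou_thresh:
            break  # remaining pairs overlap too little to be the same hand
        if ti in matched_tracks or di in matched_dets:
            continue
        matches.append((ti, di))
        matched_tracks.add(ti)
        matched_dets.add(di)
    return matches

if __name__ == "__main__":
    prev_tracks = [(10, 10, 60, 60), (100, 100, 150, 150)]  # e.g. hand boxes at frame t-1
    new_dets = [(12, 11, 62, 61), (300, 300, 340, 340)]     # detector output at frame t
    print(associate(prev_tracks, new_dets))                 # -> [(0, 0)]

In a full SORT or DeepSORT pipeline, the track boxes would first be propagated by a Kalman filter (and, for DeepSORT, compared by appearance embeddings) before association, and unmatched detections would spawn new tracks; this sketch only shows the overlap-based matching idea.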