IPN HandS: Efficient Annotation Tool and Dataset for Skeleton-Based Hand Gesture Recognition

Bibliographic Details
Main Authors: Gibran Benitez-Garcia, Jesus Olivares-Mercado, Gabriel Sanchez-Perez, Hiroki Takahashi
Format: Article
Language: English
Published: MDPI AG, 2025-06-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/15/11/6321
Description
Summary: Hand gesture recognition (HGR) heavily relies on high-quality annotated datasets. However, annotating hand landmarks in video sequences is a time-intensive challenge. In this work, we introduce IPN HandS, an enhanced version of our IPN Hand dataset, which now includes approximately 700,000 hand skeleton annotations and corrected gesture boundaries. To generate these annotations efficiently, we propose a novel annotation tool that combines automatic detection, inter-frame interpolation, copy–paste capabilities, and manual refinement. This tool reduces annotation time from 70 min to just 27 min per video, enabling scalable and precise annotation of large datasets. We validate the advantages of the IPN HandS dataset by training a lightweight LSTM-based model on these annotations and comparing its performance against models trained with annotations from the widely used MediaPipe hand pose estimators. Our model achieves an accuracy 12% higher than the model trained with MediaPipe Hands annotations and 8% higher than the one trained with MediaPipe Holistic annotations. These results underscore the importance of annotation quality for model generalization and overall recognition performance. Both the IPN HandS dataset and the annotation tool will be released to support reproducible research and future work in HGR and related fields.
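
The summary names two mechanisms: inter-frame interpolation, which the annotation tool uses to fill landmark positions between manually verified keyframes, and a lightweight LSTM classifier trained on the resulting skeleton sequences. The record does not include the paper's implementation, so the following Python sketch only illustrates the first idea under assumptions: 21 landmarks per hand with 2D coordinates (as in MediaPipe Hands) and plain linear interpolation; the function name and array shapes are hypothetical, not taken from the IPN HandS tool.

```python
# Hypothetical sketch of keyframe interpolation for hand-landmark annotation.
# Assumes 21 landmarks per hand with (x, y) coordinates; names and shapes
# are illustrative, not from the IPN HandS tool.
import numpy as np

def interpolate_keyframes(kp_start, kp_end, num_frames):
    """Linearly fill `num_frames` frames between two verified keyframes.

    kp_start, kp_end: (21, 2) landmark arrays; returns (num_frames, 21, 2).
    """
    t = np.linspace(0.0, 1.0, num_frames + 2)[1:-1]   # interior time steps only
    return (1.0 - t)[:, None, None] * kp_start[None] + t[:, None, None] * kp_end[None]

# An annotator verifies frames 10 and 15; the tool fills frames 11-14.
filled = interpolate_keyframes(np.zeros((21, 2)), np.ones((21, 2)), 4)
print(filled.shape)   # (4, 21, 2)
```

For the second mechanism, a minimal PyTorch sketch of an LSTM-based classifier over per-frame skeleton features is shown below. The input size assumes 21 landmarks × 2 coordinates = 42 features per frame; the hidden size and class count are placeholders, not the paper's architecture.

```python
# Minimal skeleton-sequence classifier; all architecture details here are
# assumptions, not the model described in the paper.
import torch
import torch.nn as nn

class SkeletonLSTM(nn.Module):
    def __init__(self, num_classes, input_size=42, hidden_size=128):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):                  # x: (batch, frames, 42)
        _, (h_n, _) = self.lstm(x)         # h_n: (1, batch, hidden)
        return self.fc(h_n[-1])            # logits: (batch, num_classes)

model = SkeletonLSTM(num_classes=13)       # placeholder class count
logits = model(torch.randn(8, 60, 42))     # 8 clips of 60 frames each
print(logits.shape)                        # torch.Size([8, 13])
```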
ISSN: 2076-3417