Gesture-controlled omnidirectional autonomous vehicle: A web-based approach for gesture recognition

Bibliographic Details
Main Authors: Huma Zia, Bara Fteiha, Maha Abdulnasser, Tafleh Saleh, Fatima Suliemn, Kawther Alagha, Jawad Yousaf, Mohammed Ghazal
Format: Article
Language: English
Published: Elsevier 2025-07-01
Series: Array
Online Access: http://www.sciencedirect.com/science/article/pii/S2590005625000359
Description
Summary: This paper presents a novel web-based hand/thumb gesture recognition model, validated through the implementation of a gesture-controlled omnidirectional autonomous vehicle. Utilizing a custom-trained YOLOv5s model, the system efficiently translates user gestures into precise control signals, facilitating real-time vehicle operation under five commands: forward, backward, left, right, and stop. Integration with Raspberry Pi hardware, including a camera and peripherals, enables rapid live video processing with a latency of 150–300 ms and stable frame rates of 12–18 FPS. The system demonstrates reliable performance with a classification accuracy of 94.2%, validated across multiple gesture classes through statistical analysis, including confusion matrices and ANOVA testing. A user-friendly web interface, built using TensorFlow.js, Node.js, and WebSocket, enhances usability by providing seamless live video streaming and real-time, device-agnostic control directly in the browser without requiring wearable sensors or external processing. The system’s key contributions include: (1) robust real-time hand gesture recognition using YOLOv5s; (2) seamless Raspberry Pi–Arduino integration; (3) a browser-based interface enabling accessible, scalable deployment; and (4) empirical validation across functional, environmental, and statistical performance metrics. This innovation marks a significant advancement in the practical application of hand gesture control within robotics. It offers a flexible and cost-effective alternative to sensor-based systems and serves as a foundation for future developments in autonomous vehicles, human-machine interaction, assistive technologies, automation, and AI-driven interfaces. By eliminating the existing systems’ need for wearable technology, specialized hardware, or complex setups, this work expands the potential for deploying intuitive, touch-free control systems across diverse real-world domains.
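The record does not reproduce the authors' code. As a rough illustration of the pipeline the abstract describes (YOLOv5s gesture labels mapped to the five drive commands, sent from the Node.js layer over WebSocket), here is a minimal sketch; all identifiers, gesture label names, and thresholds are assumptions, not taken from the paper:

```javascript
// Hypothetical mapping from YOLOv5s gesture labels to the five drive
// commands named in the abstract. The label names are illustrative only.
const GESTURE_TO_COMMAND = {
  thumb_up: "forward",
  thumb_down: "backward",
  thumb_left: "left",
  thumb_right: "right",
  open_palm: "stop",
};

// Convert one detection ({ label, confidence }) into a command message,
// falling back to "stop" for unknown or low-confidence detections so the
// vehicle fails safe.
function toCommand(detection, minConfidence = 0.5) {
  if (!detection || detection.confidence < minConfidence) {
    return { command: "stop" };
  }
  return { command: GESTURE_TO_COMMAND[detection.label] ?? "stop" };
}

console.log(toCommand({ label: "thumb_up", confidence: 0.92 }).command); // "forward"
console.log(toCommand({ label: "unknown", confidence: 0.9 }).command);   // "stop"
```

In a setup like the one described, such a message would then be serialized and pushed over the WebSocket connection to the Raspberry Pi, which relays the motor command to the Arduino.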
ISSN:2590-0056