Lightweight Deep Learning for sEMG-Based Fingers Position Classification and Embedded System Deployment
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10904232/ |
| Summary: | This paper presents a lightweight deep learning approach for classifying and tracking finger positions from surface electromyography (sEMG) signals, designed with a view toward deployment on embedded systems. Unlike traditional studies that focus on static hand gestures, this research addresses the continuous tracking of finger positions during dynamic hand movements, such as the transition between open and closed states. A 1D Convolutional Neural Network was developed and validated on experimental data, achieving a classification accuracy of 97% in controlled scenarios. The model's architecture balances computational efficiency and classification performance, making it deployable on resource-constrained embedded hardware and highlighting its potential for real-time applications in prosthetics, robotics, and human-computer interaction. Although further optimization is needed for better generalization to unseen data, this study emphasizes the importance of developing deployable algorithms that perform well beyond simulation environments, focusing on enhancing model robustness and validating real-time performance through hardware-based implementations. The findings indicate notable progress in connecting sophisticated machine-learning techniques with efficient embedded solutions for complex, dynamic tasks. |
| ISSN: | 2169-3536 |
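The record above does not specify the network's layer configuration, so as a rough illustration of what a "lightweight" 1D CNN for windowed sEMG classification can look like, here is a minimal PyTorch sketch. The channel count (8 electrodes), window length (200 samples), and number of finger-position classes (6) are assumptions for illustration only, not the paper's reported setup:

```python
import torch
import torch.nn as nn


class Lightweight1DCNN(nn.Module):
    """Illustrative small 1D CNN for sEMG windows.

    NOTE: 8 input channels, 200-sample windows, and 6 output classes
    are hypothetical values, not taken from the paper.
    """

    def __init__(self, in_channels: int = 8, n_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolutions over each window of raw sEMG samples.
            nn.Conv1d(in_channels, 16, kernel_size=5, padding=2),
            nn.BatchNorm1d(16),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=3, padding=1),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            # Global average pooling keeps the classifier head tiny,
            # which helps when targeting embedded deployment.
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        z = self.features(x).squeeze(-1)  # (batch, 32)
        return self.classifier(z)          # (batch, n_classes)


model = Lightweight1DCNN()
logits = model(torch.randn(2, 8, 200))  # two 200-sample sEMG windows
n_params = sum(p.numel() for p in model.parameters())
```

With this configuration the model has only a few thousand parameters, which is the kind of footprint that makes export to microcontroller-class hardware plausible; the actual architecture and parameter count reported in the paper may differ.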