Intelligent biosensors for human movement rehabilitation and intention recognition

Bibliographic Details
Main Authors: Mehrab Rafiq, Nouf Abdullah Almujally, Asaad Algarni, Mohammed Alshehri, Yahya AlQahtani, Ahmad Jalal, Hui Liu
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-07-01
Series: Frontiers in Bioengineering and Biotechnology
Subjects:
Online Access: https://www.frontiersin.org/articles/10.3389/fbioe.2025.1558529/full
Description
Summary:
Introduction: Advancements in sensing technologies have enabled the integration of inertial sensors, such as accelerometers and gyroscopes, into everyday devices like smartphones and wearables. These sensors, initially intended to enhance device functionality, are now pivotal in applications such as Human Locomotion Recognition (HLR), with relevance to sports, healthcare, rehabilitation, and context-aware systems. This study presents a robust system for accurately recognizing human movement and localization characteristics from sensor data.
Methods: Two datasets were used: the Extrasensory dataset and the KU-HAR dataset. The Extrasensory dataset includes multimodal sensor data (IMU, GPS, and audio) from 60 participants, while the KU-HAR dataset provides accelerometer and gyroscope data from 90 participants performing 18 distinct activities. Raw sensor signals were first denoised using a second-order Butterworth filter, and segmentation was performed using Hamming windows. Feature extraction included skewness, energy, kurtosis, Linear Prediction Cepstral Coefficients (LPCC), and Dynamic Time Warping (DTW) for locomotion, as well as step count and step length for localization. A Yeo-Johnson power transformation was applied to optimize the extracted features.
Results: The proposed system achieved 90% accuracy on the Extrasensory dataset and 91% on the KU-HAR dataset, surpassing several existing state-of-the-art methods. Statistical analysis and additional testing confirmed the robustness and generalization of the model across both datasets.
Discussion: The developed system performs strongly in recognizing human locomotion and localization across different sensor environments, even when dealing with noisy data. Its effectiveness in real-world scenarios highlights its potential for integration into healthcare monitoring, physical rehabilitation, and intelligent wearable systems. The model's scalability and high accuracy support deployment on embedded platforms in future implementations.
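The Methods paragraph above describes a conventional denoise, segment, extract, and transform pipeline. The following Python sketch illustrates how such a pipeline can be assembled with NumPy, SciPy, and scikit-learn; the sampling rate, cutoff frequency, window and hop lengths, and the reduced feature set (skewness, kurtosis, and energy only, omitting LPCC, DTW, and the step-based localization features) are illustrative assumptions rather than the authors' implementation.

# Minimal sketch of the abstract's preprocessing/feature pipeline.
# All numeric parameters below are assumed values, not taken from the paper.
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.stats import skew, kurtosis
from sklearn.preprocessing import PowerTransformer

FS = 50.0       # assumed IMU sampling rate (Hz)
CUTOFF = 10.0   # assumed low-pass cutoff (Hz)
WIN = 128       # assumed samples per segment
STEP = 64       # assumed hop size (50% overlap)

def denoise(signal, fs=FS, cutoff=CUTOFF):
    """Second-order Butterworth low-pass filter, applied forward-backward."""
    b, a = butter(2, cutoff, btype="low", fs=fs)
    return filtfilt(b, a, signal)

def segments(signal, win=WIN, step=STEP):
    """Slide a Hamming window over the signal and yield tapered frames."""
    taper = np.hamming(win)
    for start in range(0, len(signal) - win + 1, step):
        yield signal[start:start + win] * taper

def frame_features(frame):
    """Per-frame statistical features: skewness, kurtosis, and energy."""
    return [skew(frame), kurtosis(frame), np.sum(frame ** 2)]

# Example with a synthetic single accelerometer axis
raw = np.random.randn(1000)
clean = denoise(raw)
feats = np.array([frame_features(f) for f in segments(clean)])

# Yeo-Johnson power transformation to normalize the feature distributions
feats_opt = PowerTransformer(method="yeo-johnson").fit_transform(feats)
print(feats_opt.shape)  # (number of frames, number of features)

Zero-phase filtering (filtfilt) is used here so denoising does not shift the signal in time before windowing; in a real train/test setup the Yeo-Johnson transform should be fit on training data only rather than on the full feature matrix as in this toy example.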
ISSN: 2296-4185