Multimodal intelligent biosensors framework for fall disease detection and healthcare monitoring
| Main Authors: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A., 2025-06-01 |
| Series: | Frontiers in Bioengineering and Biotechnology |
| Subjects: | |
| Online Access: | https://www.frontiersin.org/articles/10.3389/fbioe.2025.1544968/full |
| Summary: | Introduction: In the field of human action recognition, the fusion of multimodal data from RGB and inertial modalities provides an effective technique for identifying activities of daily living (ADL) and falls. Methods: Our approach uses two reference datasets, UR-Fall Detection and UMA_Fall Detection, for ADL and fall events. First, data preprocessing is performed for each sensor modality individually; the signals are then windowed and segmented. Key features are then extracted: from the RGB data we derive 2.5D point clouds, kinetic energy, angles, curve points, and ridge features, while the inertial signals yield GCC, GMM, LPCC, and SSCE coefficients. Adam optimization is then applied to improve the discriminability of the selected features. For classification, we employed a Deep Neural Network (DNN) for ADL and fall detection on both the UR-Fall and UMA_Fall datasets. Results: The classification accuracy achieved on the UMA_Fall dataset is 97% for ADL activities and 96% for fall activities, while on the UR-Fall dataset it is 94% for ADL activities and 92% for fall activities. This classifier configuration compensates for the variety of the data and optimizes the system for differentiating between ADL and fall events. Discussion: The proposed system delivers strong recognition results on both datasets and illustrates that multimodal data fusion can improve human activity recognition for health and safety purposes. |
|---|---|
| ISSN: | 2296-4185 |
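
The methods summary above describes a pipeline of per-sensor preprocessing, windowing/segmentation, and feature extraction. The snippet below is a minimal Python sketch of that idea for the inertial stream only, assuming a tri-axial accelerometer; the 50 Hz sampling rate, 2 s window, 50% overlap, and the simple mean/std/energy statistics are illustrative assumptions, not the paper's actual parameters (the GCC, GMM, LPCC, and SSCE coefficients named in the abstract are more involved features).

```python
import numpy as np

def sliding_windows(signal, window_size, step):
    """Segment a (T, C) multichannel signal into overlapping windows of shape (N, window_size, C)."""
    n = 1 + (len(signal) - window_size) // step
    return np.stack([signal[i * step : i * step + window_size] for i in range(n)])

def window_features(window):
    """Simple per-window statistics: mean, std, and a kinetic-energy proxy per channel."""
    mean = window.mean(axis=0)
    std = window.std(axis=0)
    energy = (window ** 2).mean(axis=0)  # mean squared amplitude per channel
    return np.concatenate([mean, std, energy])

# Example: 10 s of tri-axial accelerometer data at an assumed 50 Hz sampling rate
fs = 50
acc = np.random.randn(10 * fs, 3)                            # stand-in for a real recording
windows = sliding_windows(acc, window_size=2 * fs, step=fs)  # 2 s windows, 50% overlap
X = np.array([window_features(w) for w in windows])          # (n_windows, 9) feature matrix
```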
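
For the classification stage, the abstract names a Deep Neural Network trained with Adam. The following is a minimal Keras sketch of such a binary ADL-vs-fall classifier under those stated ingredients; the layer sizes, dropout rate, and learning rate are assumptions, and the random arrays stand in for the real extracted features and labels.

```python
import numpy as np
import tensorflow as tf

n_features = 9  # matches the toy feature vector in the sketch above

# Small fully connected DNN for the binary ADL-vs-fall decision
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_features,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # 1 = fall, 0 = ADL
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),  # Adam, as named in the abstract
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Toy training call on random data, standing in for the extracted features
X = np.random.randn(200, n_features).astype("float32")
y = np.random.randint(0, 2, size=(200,))
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```

A sigmoid output suits the binary fall/no-fall decision; a softmax over activity classes would be the natural extension for multi-class ADL recognition.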