LightGBM-Based Human Action Recognition Using Sensors
| Main Authors: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-06-01 |
| Series: | Sensors |
| Online Access: | https://www.mdpi.com/1424-8220/25/12/3704 |
| Summary: | In recent years, research on human activity recognition (HAR) on smartphones has received extensive attention due to the portability of these devices. However, distinguishing between similar activities, such as leaning forward versus walking forward, or going up versus down stairs, remains difficult. This paper performs HAR using smartphone sensors, i.e., accelerometers and gyroscopes. First, a feature extraction method covering both the time domain and the frequency domain is designed to obtain more than 300 features, aiming to enhance the accuracy and stability of recognition. Then, the LightGBM (version 4.5.0) algorithm is used to analyze the extracted features comprehensively, with the goal of improving the recognition accuracy for similar activities. Simulation experiments demonstrate that the proposed feature extraction method improves HAR accuracy. Compared with classical machine learning algorithms such as random forest (version 1.5.2) and XGBoost (version 2.1.3), LightGBM achieves a higher accuracy rate of 94.98%. Moreover, after tuning the model parameters with grid search, the prediction accuracy of LightGBM increases to 95.35%. Finally, with feature selection and dimensionality reduction, the efficiency of the model is further improved, achieving a 70.14% gain in time efficiency without reducing the accuracy rate. |
| ISSN: | 1424-8220 |