LightGBM-Based Human Action Recognition Using Sensors


Bibliographic Details
Main Authors: Yinuo Liu, Ziwei Chen
Format: Article
Language: English
Published: MDPI AG, 2025-06-01
Series: Sensors
Subjects: human activity recognition; feature selection; LightGBM
Online Access: https://www.mdpi.com/1424-8220/25/12/3704
Collection: DOAJ
Description: In recent years, research on human activity recognition (HAR) with smartphones has received extensive attention owing to the portability of these devices. However, distinguishing between similar activities, such as leaning forward versus walking forward, or going up versus down stairs, remains difficult. This paper performs HAR based on smartphone sensors, i.e., accelerometers and gyroscopes. First, a feature extraction method covering both the time domain and the frequency domain is designed to obtain more than 300 features from the sensor data, aiming to enhance the accuracy and stability of recognition. Then, the LightGBM (version 4.5.0) algorithm is applied to comprehensively analyze the extracted features, with the goal of improving the recognition of similar activities. Simulation experiments demonstrate that the proposed feature extraction method improves HAR accuracy. Compared with classical machine learning algorithms such as random forest (version 1.5.2) and XGBoost (version 2.1.3), LightGBM achieves the best accuracy rate, reaching 94.98%. Moreover, after tuning the model parameters with grid search, the prediction accuracy of LightGBM increases to 95.35%. Finally, feature selection and dimensionality reduction further improve the model's efficiency, achieving a 70.14% gain in time efficiency without reducing the accuracy rate.
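The time- and frequency-domain feature extraction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the window length, sampling rate, and feature names are assumptions, and the paper's full pipeline produces over 300 such features per window.

```python
import numpy as np

def extract_features(window, fs=50.0):
    """Compute a few time- and frequency-domain features from one
    windowed 1-D sensor signal (accelerometer or gyroscope axis)."""
    feats = {}
    # Time-domain statistics
    feats["mean"] = np.mean(window)
    feats["std"] = np.std(window)
    feats["min"] = np.min(window)
    feats["max"] = np.max(window)
    feats["rms"] = np.sqrt(np.mean(window ** 2))
    # Frequency-domain features via the real FFT
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    feats["dominant_freq"] = freqs[np.argmax(spectrum)]
    psd = spectrum ** 2
    p = psd / np.sum(psd)  # normalized power distribution
    feats["spectral_entropy"] = -np.sum(p * np.log2(p + 1e-12))
    feats["spectral_energy"] = np.sum(psd) / len(window)
    return feats

# Example: a 2 Hz sinusoid sampled at 50 Hz for 2.56 s (128 samples)
t = np.arange(128) / 50.0
window = np.sin(2 * np.pi * 2.0 * t)
features = extract_features(window)
```

Concatenating such feature dictionaries across all axes of both sensors, per sliding window, yields the feature matrix that the classifier consumes.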
ISSN: 1424-8220
DOI: 10.3390/s25123704
Author affiliations: Yinuo Liu, Department of Computer Science and Technology, College of Information Engineering, Northwest A&F University, Yangling, Xianyang 712100, China; Ziwei Chen, Department of Electronics, Beijing Jiaotong University, Beijing 100044, China