Efficient and Explainable Human Activity Recognition Using Deep Residual Network with Squeeze-and-Excitation Mechanism
Wearable sensors for human activity recognition (HAR) have gained significant attention across multiple domains, such as personal health monitoring and intelligent home systems. Despite notable advancements in deep learning for HAR, understanding the decision-making process of complex models remains challenging. This study introduces an advanced deep residual network integrated with a squeeze-and-excitation (SE) mechanism to improve recognition accuracy and model interpretability. The proposed model, ConvResBiGRU-SE, was tested using the UCI-HAR and WISDM datasets. It achieved remarkable accuracies of 99.18% and 98.78%, respectively, surpassing existing state-of-the-art methods. The SE mechanism enhanced the model’s ability to focus on essential features, while gradient-weighted class activation mapping (Grad-CAM) increased interpretability by highlighting essential sensory data influencing predictions. Additionally, ablation experiments validated the contribution of each component to the model’s overall performance. This research advances HAR technology by offering a more transparent and efficient recognition system. The enhanced transparency and predictive accuracy may increase user trust and facilitate smoother integration into real-world applications.
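The abstract centers on a squeeze-and-excitation (SE) mechanism inside a deep residual network for wearable-sensor HAR. The record does not include the authors’ code, so the following is only a minimal sketch, assuming a PyTorch implementation, of how an SE block recalibrates the channels of 1D convolutional feature maps; the channel count, reduction ratio, and tensor shapes are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch (not the authors' released code) of a squeeze-and-excitation
# block for 1D wearable-sensor feature maps. Channel count and reduction ratio
# below are illustrative assumptions.
import torch
import torch.nn as nn


class SEBlock1D(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool1d(1)            # squeeze: global temporal average
        self.fc = nn.Sequential(                       # excitation: channel-wise gating
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        b, c, _ = x.shape
        w = self.pool(x).view(b, c)                    # (batch, channels)
        w = self.fc(w).view(b, c, 1)                   # per-channel weights in (0, 1)
        return x * w                                   # recalibrate the feature maps


if __name__ == "__main__":
    # e.g. a window of 128 timesteps from 9 inertial channels,
    # already projected to 64 convolutional feature maps
    feats = torch.randn(8, 64, 128)
    print(SEBlock1D(64)(feats).shape)                  # torch.Size([8, 64, 128])
```

In a residual architecture of this kind, such a block is typically placed inside each residual branch so that informative sensor channels are amplified before the skip connection is added back.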
| Main Authors: | Sakorn Mekruksavanich, Anuchit Jitpattanakul |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-04-01 |
| Series: | Applied System Innovation |
| Subjects: | human activity recognition (HAR); explainable AI (XAI); wearable sensors; squeeze-and-excitation mechanism; deep residual network |
| Online Access: | https://www.mdpi.com/2571-5577/8/3/57 |
| _version_ | 1850157001799106560 |
|---|---|
| author | Sakorn Mekruksavanich; Anuchit Jitpattanakul |
| author_facet | Sakorn Mekruksavanich; Anuchit Jitpattanakul |
| author_sort | Sakorn Mekruksavanich |
| collection | DOAJ |
| description | Wearable sensors for human activity recognition (HAR) have gained significant attention across multiple domains, such as personal health monitoring and intelligent home systems. Despite notable advancements in deep learning for HAR, understanding the decision-making process of complex models remains challenging. This study introduces an advanced deep residual network integrated with a squeeze-and-excitation (SE) mechanism to improve recognition accuracy and model interpretability. The proposed model, ConvResBiGRU-SE, was tested using the UCI-HAR and WISDM datasets. It achieved remarkable accuracies of 99.18% and 98.78%, respectively, surpassing existing state-of-the-art methods. The SE mechanism enhanced the model’s ability to focus on essential features, while gradient-weighted class activation mapping (Grad-CAM) increased interpretability by highlighting essential sensory data influencing predictions. Additionally, ablation experiments validated the contribution of each component to the model’s overall performance. This research advances HAR technology by offering a more transparent and efficient recognition system. The enhanced transparency and predictive accuracy may increase user trust and facilitate smoother integration into real-world applications. |
| format | Article |
| id | doaj-art-cfb778137b4345c394704cffb69cf0f3 |
| institution | OA Journals |
| issn | 2571-5577 |
| language | English |
| publishDate | 2025-04-01 |
| publisher | MDPI AG |
| record_format | Article |
| series | Applied System Innovation |
| spelling | Applied System Innovation 8(3):57 (2025-04-01); doi:10.3390/asi8030057; Sakorn Mekruksavanich (Department of Computer Engineering, School of Information and Communication Technology, University of Phayao, Phayao 56000, Thailand); Anuchit Jitpattanakul (Department of Mathematics, Faculty of Applied Science, King Mongkut’s University of Technology North Bangkok, Bangkok 10800, Thailand); record doaj-art-cfb778137b4345c394704cffb69cf0f3, indexed 2025-08-20T02:24:18Z; https://www.mdpi.com/2571-5577/8/3/57 |
| spellingShingle | Sakorn Mekruksavanich; Anuchit Jitpattanakul; Efficient and Explainable Human Activity Recognition Using Deep Residual Network with Squeeze-and-Excitation Mechanism; Applied System Innovation; human activity recognition (HAR); explainable AI (XAI); wearable sensors; squeeze-and-excitation mechanism; deep residual network |
| title | Efficient and Explainable Human Activity Recognition Using Deep Residual Network with Squeeze-and-Excitation Mechanism |
| title_full | Efficient and Explainable Human Activity Recognition Using Deep Residual Network with Squeeze-and-Excitation Mechanism |
| title_fullStr | Efficient and Explainable Human Activity Recognition Using Deep Residual Network with Squeeze-and-Excitation Mechanism |
| title_full_unstemmed | Efficient and Explainable Human Activity Recognition Using Deep Residual Network with Squeeze-and-Excitation Mechanism |
| title_short | Efficient and Explainable Human Activity Recognition Using Deep Residual Network with Squeeze-and-Excitation Mechanism |
| title_sort | efficient and explainable human activity recognition using deep residual network with squeeze and excitation mechanism |
| topic | human activity recognition (HAR); explainable AI (XAI); wearable sensors; squeeze-and-excitation mechanism; deep residual network |
| url | https://www.mdpi.com/2571-5577/8/3/57 |
| work_keys_str_mv | AT sakornmekruksavanich efficientandexplainablehumanactivityrecognitionusingdeepresidualnetworkwithsqueezeandexcitationmechanism AT anuchitjitpattanakul efficientandexplainablehumanactivityrecognitionusingdeepresidualnetworkwithsqueezeandexcitationmechanism |
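The abstract and description above also credit gradient-weighted class activation mapping (Grad-CAM) with making predictions interpretable by highlighting the sensor readings that influence them. As a rough, hedged illustration only, the sketch below shows how Grad-CAM is commonly applied to a 1D convolutional HAR classifier: the last convolutional feature maps are weighted by the gradient of the predicted class score and summed into a temporal saliency curve. The toy model, layer sizes, and class count are invented for the sketch and are not the paper’s ConvResBiGRU-SE.

```python
# Illustrative Grad-CAM for a 1D convolutional HAR classifier; the network here
# is a deliberately small stand-in, not the architecture from the article.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyHARNet(nn.Module):
    def __init__(self, in_ch: int = 9, n_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_ch, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):
        fmap = self.features(x)                     # (batch, 64, time)
        logits = self.head(fmap.mean(dim=-1))       # global average pool + classifier
        return logits, fmap


def grad_cam_1d(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Return a (batch, time) saliency map for each window's predicted class."""
    model.zero_grad()
    logits, fmap = model(x)
    fmap.retain_grad()                              # keep gradients of the feature maps
    cls = logits.argmax(dim=1)
    logits.gather(1, cls.unsqueeze(1)).sum().backward()
    weights = fmap.grad.mean(dim=-1, keepdim=True)  # channel importance: (batch, C, 1)
    cam = F.relu((weights * fmap).sum(dim=1))       # weighted sum over channels
    return (cam / (cam.amax(dim=-1, keepdim=True) + 1e-8)).detach()


if __name__ == "__main__":
    window = torch.randn(1, 9, 128)                 # one 128-step window, 9 sensor channels
    saliency = grad_cam_1d(TinyHARNet(), window)
    print(saliency.shape)                           # torch.Size([1, 128])
```

The resulting curve can be plotted over the raw accelerometer and gyroscope traces to show which portions of a window drove a given activity prediction, which is the kind of visual explanation the abstract attributes to Grad-CAM.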