EDGE-BASED VIDEO RECOGNITION: ADVANCING DEEP LEARNING FOR EFFICIENT VISUAL
| Main Authors: | , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | University of Kragujevac, 2025-03-01 |
| Series: | Proceedings on Engineering Sciences |
| Online Access: | https://pesjournal.net/journal/v7-n1/62.pdf |
| Summary: | The escalating prevalence of video surveillance underscores the critical necessity for precise and efficient Human Action Recognition (HAR) systems. In this study, we introduce an innovative deep learning architecture tailored for real-time HAR at the edge. Our approach involves monitoring video streams, identifying specific events, and logging their timestamps using a compact Recurrent Neural Network (RNN). We benchmark the performance of our RNN against well-established deep learning models such as Convolutional Neural Networks (CNNs) and Deep Neural Networks (DNNs). Notably, we integrate YOLOv8 to enhance the model's capability to recognize human actions within designated frames. Additionally, we employ a Generative Adversarial Network (GAN) for image denoising to improve feature selection. Thorough evaluations validate our system's exceptional F1 score, accuracy, and precision, establishing its utility for edge-based HAR applications. The model achieves remarkable accuracy across various action identification tasks, scoring 97.36% for jumping, 98.72% for strolling, 98.09% for running, and 98.26% for hand gestures. It also demonstrates high precision for walking, running, jumping, and hand motions (99.57%, 98.09%, 97.36%, and 99.67% respectively), highlighting its reliability across a spectrum of human behaviors. |
| ISSN: | 2620-2832, 2683-4111 |
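The summary describes a compact RNN that classifies human actions from per-frame information (with YOLOv8 supplying person detections). As a rough illustration only, not the authors' implementation, the following NumPy sketch shows how a single-layer Elman RNN could run over hypothetical per-frame feature vectors and score the four action classes named in the abstract; all dimensions, weights, and the feature extraction step are assumptions.

```python
import numpy as np

# Action classes named in the abstract
ACTIONS = ["walking", "running", "jumping", "hand_gesture"]

def rnn_classify(frames, Wx, Wh, Wo, bh, bo):
    """Single-layer Elman RNN: fold over per-frame feature vectors,
    then classify the whole clip from the final hidden state."""
    h = np.zeros(Wh.shape[0])
    for x in frames:                       # one feature vector per video frame
        h = np.tanh(Wx @ x + Wh @ h + bh)  # recurrent state update
    logits = Wo @ h + bo
    e = np.exp(logits - logits.max())      # numerically stable softmax
    return e / e.sum()

# Hypothetical sizes: 32-dim frame features, 16 hidden units, 4 classes
rng = np.random.default_rng(0)
D, H, C = 32, 16, len(ACTIONS)
params = (rng.normal(0, 0.1, (H, D)), rng.normal(0, 0.1, (H, H)),
          rng.normal(0, 0.1, (C, H)), np.zeros(H), np.zeros(C))

clip = rng.normal(size=(8, D))             # stand-in for an 8-frame clip
probs = rnn_classify(clip, *params)
print(ACTIONS[int(probs.argmax())])
```

In a real edge deployment the random weights would come from training, and `clip` would be built from detector output (e.g., normalized bounding-box trajectories) rather than noise; the recurrence itself is the part the sketch is meant to convey.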