A Lightweight Direction-Aware Network for Vehicle Detection
| Main Authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10877820/ |
| Summary: | Vehicle detection algorithms, which are essential to intelligent traffic management and control systems, have attracted growing attention. However, most high-precision vehicle detection algorithms suffer from high computational cost and slow detection speeds, making them challenging to deploy on mobile devices. In this paper, we propose a lightweight direction-aware network (LDAN) based on YOLOv8 for vehicle detection on mobile devices. First, a lightweight C2f-GSP module is proposed to optimize the backbone network, which enhances the interaction of local features and fully extracts vehicle information. Then, a triple efficient coordinate attention mechanism (TECA) is designed. The mechanism can fully perceive the details and salient information of input features in multiple directions, thus improving the ability of the model to capture critical features. Moreover, to further reduce model parameters and computational requirements, a lightweight shared convolutional detection head (SCL-Head) is devised using a parameter-sharing mechanism. Finally, experimental results on the KITTI dataset show that the proposed method not only reduces resource consumption but also improves the accuracy of vehicle detection, providing a novel technical path toward real-time and accurate vehicle detection on mobile devices. |
|---|---|
| ISSN: | 2169-3536 |
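The summary's parameter-sharing claim can be illustrated with a back-of-the-envelope count. The sketch below is a hypothetical example, not the paper's SCL-Head: the channel width (256), kernel size (3x3), and number of pyramid scales (3) are assumptions chosen only to show why reusing one convolution across all detection scales shrinks the head's parameter count.

```python
# Illustrative sketch (assumed numbers, not from the paper): compare the
# parameter count of one independent 3x3 conv per detection scale against
# a single conv whose weights are shared across all scales.

def conv_params(k: int, c_in: int, c_out: int, bias: bool = True) -> int:
    """Parameter count of a single 2-D convolution layer."""
    return k * k * c_in * c_out + (c_out if bias else 0)

scales = 3                                  # e.g. three pyramid levels
per_head = conv_params(k=3, c_in=256, c_out=256)

separate = scales * per_head                # one independent conv per scale
shared = per_head                           # one conv reused at every scale

print(separate, shared)                     # -> 1770240 590080
```

Under these assumptions, weight sharing cuts the head's convolution parameters by a factor equal to the number of scales (3x here), which is the general effect a shared-convolution detection head exploits.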