PFW-YOLO Lightweight Helmet Detection Algorithm
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10965682/ |
| Summary: | Helmet recognition, an important means of ensuring personnel safety in high-risk operating environments, requires deploying recognition models on edge devices to achieve real-time performance and portability. However, edge devices have limited computing and storage resources, and traditional detection algorithms suffer from large parameter counts and high computational cost, making them difficult to deploy in practice. Therefore, this paper proposes a lightweight helmet detection algorithm, PFW-YOLO. First, a multi-scale feature fusion module is designed to reconstruct the Bottleneck structure in C2f, forming the C2f-PMSFF module, which enhances the model's feature representation and improves computational efficiency. Second, to further reduce model size while maintaining detection accuracy, a Feature Interaction Shared detection Head (FISH) is introduced. Finally, Wise-Inner-Shape-IoU is used to optimize the bounding-box regression loss, improving detection accuracy and accelerating convergence. Experimental results show that, compared with the original YOLOv8n algorithm, PFW-YOLO reduces parameters by 53%, computation by 42%, and model size by 51%, while improving mean average precision (mAP) by 0.4%. |
|---|---|
| ISSN: | 2169-3536 |
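The Wise-Inner-Shape-IoU loss named in the summary builds on the standard intersection-over-union between a predicted and a ground-truth box. As a point of reference, here is a minimal sketch of that base IoU quantity for axis-aligned boxes; the wise/inner/shape weighting terms that the paper adds on top are defined in the article itself and are not reproduced here.

```python
def iou(box_a, box_b):
    """Plain IoU between two axis-aligned boxes given as (x1, y1, x2, y2).

    This is only the base quantity; Wise-Inner-Shape-IoU (per the paper)
    adds further weighting and shape-aware terms on top of it.
    """
    # Intersection rectangle (empty if the boxes do not overlap)
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    # Union = sum of areas minus the overlap
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A regression loss is then typically formed as `1 - iou(pred, target)` (plus the paper's extra terms), so that perfectly overlapping boxes incur zero loss.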