Traffic sign detection method based on improved YOLOv8
| Main Authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-06-01 |
| Series: | Scientific Reports |
| Subjects: | |
| Online Access: | https://doi.org/10.1038/s41598-025-03792-0 |
| Tags: | |
| Summary: | Abstract Traffic sign detection is crucial in intelligent transportation and assisted driving, providing favourable support for driving safety and the prevention of traffic accidents. To address the current traffic sign detection problems of missed detections, false detections, and low detection accuracy on small targets, a traffic sign detection method based on an improved YOLOv8n is proposed. Firstly, the Neck part of YOLOv8 is improved by designing a module that combines Attention Scale Sequence Fusion with the P2 small-target detection layer (AFP) to enhance the feature extraction capability of the YOLOv8 network, enabling it to capture more small-target features. Secondly, a lightweight convolution module, LWConv, is designed, based on which the Bottleneck structure of Cross-convolution with two filters (C2f) in YOLOv8 is reconstructed and named LW_C2f, effectively reducing the model size and parameter count. Finally, the loss function of the original YOLOv8 is replaced with the Wise-IoU loss function, which improves the network's bounding box regression performance and reduces the negative impact of low-quality samples. The experimental results show that the mean average precision (mAP50) of the improved model on the TT100K dataset is increased by 5.7% compared to the YOLOv8 model, while the number of parameters and the model size are reduced by 0.6 M and 0.8 MB, respectively. |
| ISSN: | 2045-2322 |
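The summary notes that the original YOLOv8 loss is swapped for Wise-IoU to improve bounding-box regression. As a rough illustration only (not the paper's implementation), the sketch below computes the published Wise-IoU v1 formulation in NumPy: the plain IoU loss scaled by a distance-based focusing factor R_WIoU = exp(d²/c²), where d is the centre distance between boxes and c the diagonal of the smallest enclosing box. The function name and box format are assumptions for this example.

```python
import numpy as np

def wise_iou_v1(pred, gt):
    """Wise-IoU v1 loss for axis-aligned boxes in (x1, y1, x2, y2) format.

    L_WIoU = R_WIoU * (1 - IoU), with R_WIoU = exp(d^2 / c^2), where d is
    the centre distance and c the diagonal of the smallest enclosing box.
    (In a real training framework, c^2 is detached from the gradient graph;
    that detail has no analogue in this NumPy sketch.)
    """
    pred, gt = np.asarray(pred, float), np.asarray(gt, float)
    # Intersection rectangle (clipped to zero when boxes do not overlap)
    ix1 = np.maximum(pred[..., 0], gt[..., 0])
    iy1 = np.maximum(pred[..., 1], gt[..., 1])
    ix2 = np.minimum(pred[..., 2], gt[..., 2])
    iy2 = np.minimum(pred[..., 3], gt[..., 3])
    inter = np.clip(ix2 - ix1, 0, None) * np.clip(iy2 - iy1, 0, None)
    area_p = (pred[..., 2] - pred[..., 0]) * (pred[..., 3] - pred[..., 1])
    area_g = (gt[..., 2] - gt[..., 0]) * (gt[..., 3] - gt[..., 1])
    iou = inter / (area_p + area_g - inter + 1e-9)
    # Smallest enclosing box (for the diagonal c) and centre distance d
    cw = np.maximum(pred[..., 2], gt[..., 2]) - np.minimum(pred[..., 0], gt[..., 0])
    ch = np.maximum(pred[..., 3], gt[..., 3]) - np.minimum(pred[..., 1], gt[..., 1])
    dx = (pred[..., 0] + pred[..., 2]) / 2 - (gt[..., 0] + gt[..., 2]) / 2
    dy = (pred[..., 1] + pred[..., 3]) / 2 - (gt[..., 1] + gt[..., 3]) / 2
    r_wiou = np.exp((dx ** 2 + dy ** 2) / (cw ** 2 + ch ** 2 + 1e-9))
    return r_wiou * (1.0 - iou)

# Identical boxes: IoU = 1, so the loss is essentially zero
print(wise_iou_v1([0, 0, 10, 10], [0, 0, 10, 10]))
# A shifted box is penalised more than plain 1 - IoU, since R_WIoU >= 1
print(wise_iou_v1([0, 0, 10, 10], [5, 0, 15, 10]))
```

Because R_WIoU is at least 1 and grows with centre offset, the loss focuses training on boxes whose centres are far from the ground truth, which is one of the properties the record's summary credits for reducing the influence of low-quality samples.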