An Improved YOLOv8-Based Dense Pedestrian Detection Method with Multi-Scale Fusion and Linear Spatial Attention

Bibliographic Details
Main Authors: Han Gong, Tian Li, Lijuan Wang, Shucheng Huang, Mingxing Li
Format: Article
Language: English
Published: MDPI AG, 2025-05-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/15/10/5518
Description
Summary: To address missed detections of small-scale, occluded pedestrians in dense scenes, this paper proposes an improved YOLOv8 detection algorithm named Dense-YOLOv8. First, to cope with the difficulty of extracting features from small-scale pedestrians in dense environments, the backbone network is enhanced with deformable convolution and dynamic convolution to strengthen feature extraction. At the same time, a multi-scale linear spatial attention module is designed to amplify the features of the visible parts of occluded pedestrians while suppressing interference from complex backgrounds. Second, a small-scale pedestrian detection head is introduced in the neck of the YOLOv8 network to improve detection of small pedestrians. Finally, a weighted loss function named DFL-SIoU is developed to accelerate model convergence and improve training efficiency. Experimental results on two challenging dense pedestrian datasets, CrowdHuman and WiderPerson, show that the proposed algorithm significantly improves detection in dense scenarios, and comparative evaluations against other state-of-the-art pedestrian detection models confirm its strong competitiveness.
ISSN: 2076-3417
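
To give a concrete picture of the "linear spatial attention" idea mentioned in the abstract, the sketch below shows a minimal linear-complexity spatial attention block in PyTorch. It is an illustration only, not the authors' multi-scale module: the class name, the channel reduction ratio, the elu+1 kernel feature map, and the residual connection are all assumptions made for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LinearSpatialAttention(nn.Module):
    """Illustrative linear-complexity spatial attention over an H x W feature map.

    NOTE: a sketch under assumed details, not the Dense-YOLOv8 module itself.
    """

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        hidden = max(channels // reduction, 8)
        self.to_q = nn.Conv2d(channels, hidden, kernel_size=1)
        self.to_k = nn.Conv2d(channels, hidden, kernel_size=1)
        self.to_v = nn.Conv2d(channels, channels, kernel_size=1)
        self.proj = nn.Conv2d(channels, channels, kernel_size=1)

    @staticmethod
    def _feature_map(x: torch.Tensor) -> torch.Tensor:
        # elu(x) + 1 keeps query/key features positive, a common kernel trick
        # that lets attention be computed without the N x N similarity matrix.
        return F.elu(x) + 1.0

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self._feature_map(self.to_q(x)).flatten(2)   # (B, d, N), N = H*W
        k = self._feature_map(self.to_k(x)).flatten(2)   # (B, d, N)
        v = self.to_v(x).flatten(2)                      # (B, C, N)

        # Linear attention: aggregate K with V first (d x C), so the cost is
        # O(N * d * C) instead of the O(N^2) of standard spatial attention.
        kv = torch.einsum("bdn,bcn->bdc", k, v)                           # (B, d, C)
        norm = 1.0 / (torch.einsum("bdn,bd->bn", q, k.sum(dim=-1)) + 1e-6)  # (B, N)
        out = torch.einsum("bdn,bdc->bcn", q, kv) * norm.unsqueeze(1)       # (B, C, N)

        out = out.reshape(b, c, h, w)
        return x + self.proj(out)   # residual connection keeps the original features


if __name__ == "__main__":
    feat = torch.randn(2, 256, 40, 40)          # a neck-level feature map
    attn = LinearSpatialAttention(channels=256)
    print(attn(feat).shape)                      # torch.Size([2, 256, 40, 40])
```

Because the attention weights are computed through positive feature maps rather than an explicit N x N softmax, the block scales linearly with the number of spatial positions, which is what makes this style of attention attractive for the high-resolution feature maps used in small-pedestrian detection.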