Research of Pedestrian Detection Methods with Anchor Frame Based on Deep Learning

Bibliographic Details
Main Author: Yan Tao
Format: Article
Language: English
Published: EDP Sciences 2025-01-01
Series: ITM Web of Conferences
Online Access: https://www.itm-conferences.org/articles/itmconf/pdf/2025/04/itmconf_iwadi2024_02021.pdf
Description
Summary: Pedestrian detection has long been one of the hotspots in target detection. Deep learning-based theories and techniques perform exceptionally well in the pedestrian detection domain, and general-purpose target detectors are continually applied to it. This research surveys deep learning-based pedestrian detection techniques and classifies, contrasts, and analyzes the anchor-frame-based methods. The occlusion problem and the scale-change problem are among the main causes of missed and false detections in practical pedestrian detection applications. The paper first reviews conventional pedestrian detection techniques and then concentrates on the R-CNN detector within the anchor-frame-based two-stage pedestrian detection algorithms. In addressing the scale-change problem, the enhanced Faster R-CNN algorithm, built on R-CNN, greatly reduces redundant computation and improves recognition accuracy. Among the anchor-frame-based single-stage methods, the YOLOv3 model introduces notable changes to the overall detection architecture, greatly improving its ability to handle scale changes and occlusion.
ISSN: 2271-2097
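
The abstract centres on the anchor mechanism shared by Faster R-CNN and YOLOv3. As a rough illustration only (not code from the paper), the Python sketch below generates a grid of anchor boxes with several scales and pedestrian-like aspect ratios; the feature-map size, stride, scales, and ratios are assumed example values.

import numpy as np

def generate_anchors(feat_h, feat_w, stride, scales, aspect_ratios):
    """Return an (N, 4) array of anchors as (x1, y1, x2, y2) in image pixels."""
    anchors = []
    for y in range(feat_h):
        for x in range(feat_w):
            # Centre of the feature-map cell mapped back to image coordinates.
            cx, cy = (x + 0.5) * stride, (y + 0.5) * stride
            for s in scales:
                for r in aspect_ratios:          # r = width / height
                    w, h = s * np.sqrt(r), s / np.sqrt(r)
                    anchors.append([cx - w / 2, cy - h / 2,
                                    cx + w / 2, cy + h / 2])
    return np.array(anchors)

# Tall ratios (width:height of 1:2 and 1:3) roughly match pedestrian shapes;
# three scales per cell illustrate how multi-scale anchors address scale change.
anchors = generate_anchors(feat_h=16, feat_w=16, stride=16,
                           scales=(32, 64, 128), aspect_ratios=(0.5, 1 / 3))
print(anchors.shape)  # (1536, 4): 16 * 16 cells * 3 scales * 2 ratios

In a full anchor-based detector, each of these boxes would then be classified and regressed toward ground-truth pedestrian boxes; the sketch shows only the anchor layout itself.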