Research on pedestrian detection method based on multispectral intermediate fusion using YOLOv7

Bibliographic Details
Main Authors: Bo Jiang, Jingyu Wang, Guoyin Ren, Mobin Zhi, Zhijie Yu, Yang Zhang, Pengju Ren, Shidong Jia
Format: Article
Language: English
Published: Nature Portfolio, 2025-05-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-88871-y
Description
Summary: This study is based on the YOLOv7 object detection framework and conducts comparative experiments on early fusion, halfway fusion, and late fusion for multispectral pedestrian detection tasks. Traditional pedestrian detection typically uses image data from a single sensor or modality. However, in the field of multispectral remote sensing, fusing multi-source data is crucial for improving detection performance. This study aims to explore the impact of different fusion strategies on multispectral object detection performance and to identify the most suitable fusion approach for multispectral data. First, we implemented early fusion experiments by merging multispectral data with visible light data at the network's input layer. Next, halfway fusion experiments were conducted, merging multispectral data and visible light data at the network's middle layers. Finally, late fusion experiments were performed by merging multispectral data and visible light data at the network's high layers. A comprehensive comparison of the experimental results for the various fusion strategies shows that the halfway fusion strategy performs best in multispectral pedestrian detection, achieving high detection accuracy at relatively fast inference speed.
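The halfway fusion strategy described above merges modality-specific feature maps at a middle layer of the network rather than at the input (early fusion) or after detection heads (late fusion). The sketch below is not the authors' implementation; it is a minimal NumPy illustration of the general idea, with hypothetical layer sizes and random stand-in weights, showing each modality passing through its own backbone stage before channel-wise concatenation and a 1x1 projection.

```python
import numpy as np

def backbone_stage(x, out_channels, rng):
    """Stand-in for a mid-level backbone stage: a random 1x1
    projection followed by ReLU. Real detectors use learned
    convolutions; this only illustrates the data flow."""
    c_in = x.shape[0]
    w = rng.standard_normal((out_channels, c_in)) * 0.01
    return np.maximum(0.0, np.einsum('oc,chw->ohw', w, x))

def halfway_fusion(rgb, thermal, rng):
    """Run each modality through its own stage, concatenate the
    resulting feature maps along the channel axis, then reduce
    back to a single feature map with a 1x1 projection."""
    f_rgb = backbone_stage(rgb, 64, rng)            # (64, H, W)
    f_thm = backbone_stage(thermal, 64, rng)        # (64, H, W)
    fused = np.concatenate([f_rgb, f_thm], axis=0)  # (128, H, W)
    w = rng.standard_normal((64, 128)) * 0.01       # 1x1 conv as a matmul
    return np.einsum('oc,chw->ohw', w, fused)       # (64, H, W)

rng = np.random.default_rng(0)
rgb = rng.standard_normal((3, 32, 32))      # visible image, 3 channels
thermal = rng.standard_normal((1, 32, 32))  # thermal image, 1 channel
fused = halfway_fusion(rgb, thermal, rng)
print(fused.shape)  # (64, 32, 32)
```

By contrast, early fusion would concatenate the raw `rgb` and `thermal` inputs before any backbone stage, and late fusion would keep the two branches separate until after per-modality detection outputs are produced.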
ISSN: 2045-2322