A Crowd Counting Framework Combining with Crowd Location

Bibliographic Details
Main Authors: Jin Zhang, Sheng Chen, Sen Tian, Wenan Gong, Guoshan Cai, Ying Wang
Format: Article
Language: English
Published: Wiley 2021-01-01
Series: Journal of Advanced Transportation
Online Access: http://dx.doi.org/10.1155/2021/6664281
Summary: In the past ten years, crowd detection and counting have been applied in many fields, such as station crowd statistics, urban safety prevention, and people flow statistics. However, obtaining accurate positions and improving counting performance in dense scenes remain challenging and worth substantial effort. This paper proposes a new framework to address the problem. The framework consists of two parts. The first part is a fully convolutional neural network (CNN) made up of a backend and an upsampling module: the backend uses a residual network (ResNet) to encode the features of the input image, and the upsampling module uses deconvolution layers to decode that feature information. The output of the first part is fed into the second part, a peak confidence map (PCM), which is proposed as an improvement over the density map (DM). Compared with the DM, the PCM not only supports crowd counting but also accurately predicts the location of each person. Experimental results on several datasets (Beijing-BRT, Mall, ShanghaiTech, and UCF_CC_50) show that the proposed framework achieves higher counting performance in dense scenarios and accurately predicts crowd locations.
ISSN: 0197-6729, 2042-3195
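
The summary above describes an encoder-decoder pipeline (ResNet backend, deconvolution upsampling) whose output map is read for both a count and per-person locations. Below is a minimal, hypothetical PyTorch sketch of that idea, not the paper's reported design: the class name `CrowdLocator`, the helper `extract_peaks`, the ResNet-50 choice, the layer sizes, and the threshold/window values are all assumptions for illustration.

```python
# Hypothetical sketch of the encoder-decoder described in the abstract:
# a ResNet backend encodes image features, transposed-convolution layers
# decode them into a one-channel map, and head locations are read off
# the map as local maxima. All layer choices are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet50

class CrowdLocator(nn.Module):
    def __init__(self):
        super().__init__()
        # Backend: ResNet-50 up to its last convolutional stage (stride 32).
        backbone = resnet50(weights=None)
        self.backend = nn.Sequential(*list(backbone.children())[:-2])
        # Upsampling: deconvolution (transposed conv) layers decode the
        # 2048-channel features back toward input resolution (8x here).
        self.upsampling = nn.Sequential(
            nn.ConvTranspose2d(2048, 512, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(512, 128, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 32, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, 1),  # one-channel confidence map
        )

    def forward(self, x):
        # Sigmoid keeps per-pixel confidences in [0, 1].
        return torch.sigmoid(self.upsampling(self.backend(x)))

def extract_peaks(conf_map, threshold=0.5, window=3):
    """Return (y, x) coordinates of local maxima above `threshold`.

    A pixel is treated as a peak if it equals the maximum of its
    window x window neighborhood; the number of peaks is the count.
    Threshold and window are illustrative values, not tuned ones.
    """
    pooled = F.max_pool2d(conf_map, window, stride=1, padding=window // 2)
    peaks = (conf_map == pooled) & (conf_map > threshold)
    return peaks.squeeze().nonzero()

# Usage on a dummy RGB image; locations.shape[0] is the predicted count.
model = CrowdLocator().eval()
with torch.no_grad():
    conf = model(torch.randn(1, 3, 384, 384))
locations = extract_peaks(conf)
print(f"predicted count: {locations.shape[0]}")
```

Counting by enumerating peaks is what the abstract presents as the PCM's advantage over a density map, which is integrated to obtain a count but does not directly yield per-person coordinates.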