
Multi-camera combined large-scene crowd counting method

A crowd counting technology for large scenes, applied in the field of computer vision, addressing problems such as counting across multiple cameras in large-scale scenes, overlapping monitoring areas, and the complex distribution of monitoring cameras.

Status: Inactive; Publication Date: 2021-04-27
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

[0002] With the increasing number of surveillance cameras in modern society, the spatial relationships between cameras have become more and more complex, and accurately counting the number of pedestrians across the entire scene from complex, overlapping surveillance video data is a major challenge for surveillance personnel. At present, crowd gathering behavior in the whole monitored scene still lacks effective management and control, and slow pedestrian flow caused by crowd congestion may lead to serious mass incidents. On the other hand, owing to the growing number of surveillance cameras and the expansion of their coverage, traditional manual inspection and analysis of surveillance content requires considerable human resources; it is time-consuming, laborious, inefficient, and cannot be performed in real time.

[0003] At the same time, because surveillance cameras are distributed in complex ways in real scenes, their monitored areas overlap. Simply importing images from each individual camera and summing the per-camera people counts cannot yield an accurate count of pedestrians in the entire scene. Existing methods therefore struggle to meet the needs of large, multi-camera scenes, and crowd counting tasks in such scenes still face great challenges.



Examples


Embodiment 1

[0022] As shown in Figure 1, a multi-camera combined large-scene crowd counting method provided by an embodiment of the present invention includes the following steps:

[0023] Step S1: Obtain the base map and the monitoring images of the observed scene, calculate the homography matrix that relates each monitoring image to the base map, map the monitoring images onto the base map, and construct a scene model;
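The patent does not say how the homography is computed, so the following is only a minimal sketch of Step S1 using OpenCV, assuming manually annotated point correspondences between a monitoring image and the base map. The function names, the camera id, and the example coordinates are illustrative and not taken from the patent.

```python
# Sketch of Step S1: estimate, for one camera, the homography that maps its
# monitoring image onto the shared base map, and keep it in a simple scene model.
# Assumes at least four annotated point correspondences; all names are illustrative.
import cv2
import numpy as np


def estimate_homography(image_points, base_map_points):
    """Estimate the 3x3 homography mapping monitoring-image points to base-map points."""
    src = np.asarray(image_points, dtype=np.float32)
    dst = np.asarray(base_map_points, dtype=np.float32)
    H, _mask = cv2.findHomography(src, dst, method=cv2.RANSAC)
    return H


def warp_to_base_map(monitor_image, H, base_map_size):
    """Warp a monitoring image into base-map coordinates; base_map_size is (width, height)."""
    return cv2.warpPerspective(monitor_image, H, base_map_size)


if __name__ == "__main__":
    # Hypothetical correspondences for one camera, in pixel coordinates.
    img_pts = [(120, 640), (860, 600), (900, 200), (150, 180)]   # monitoring image
    map_pts = [(300, 900), (700, 880), (720, 520), (320, 500)]   # base map
    H = estimate_homography(img_pts, map_pts)
    # A simple scene model: one homography per camera, keyed by a camera id.
    scene_model = {"camera_0": H}
    print(scene_model["camera_0"])
```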

[0024] Step S2: Preprocess the monitoring images, obtain the head-point bounding box information in each monitoring image through a crowd counting neural network, and map the head-point bounding boxes onto the base map;
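Again as an illustration rather than the patent's own implementation, the sketch below shows how head-point bounding boxes returned by a crowd counting network could be projected onto the base map with the Step S1 homography. The detector is abstracted as a user-supplied callable because the patent does not name a specific network, and cv2.perspectiveTransform is assumed for the coordinate mapping.

```python
# Sketch of Step S2: take head-point boxes predicted for one monitoring image and
# project them into base-map coordinates using that camera's homography from Step S1.
# The network is abstracted as a callable returning (x1, y1, x2, y2) boxes; this
# interface is an assumption, not the patent's.
import cv2
import numpy as np


def map_boxes_to_base_map(boxes, H):
    """Project head boxes from monitoring-image pixels into base-map pixels.

    boxes: (N, 4) array of (x1, y1, x2, y2); H: 3x3 image-to-base-map homography.
    Only the two opposite corners of each box are mapped, which is an approximation.
    """
    boxes = np.asarray(boxes, dtype=np.float32).reshape(-1, 4)
    corners = boxes.reshape(-1, 2).reshape(-1, 1, 2)            # (2N, 1, 2) points
    mapped = cv2.perspectiveTransform(corners, H).reshape(-1, 4)
    # Re-order so that (x1, y1) is the top-left and (x2, y2) the bottom-right corner.
    x1 = np.minimum(mapped[:, 0], mapped[:, 2])
    y1 = np.minimum(mapped[:, 1], mapped[:, 3])
    x2 = np.maximum(mapped[:, 0], mapped[:, 2])
    y2 = np.maximum(mapped[:, 1], mapped[:, 3])
    return np.stack([x1, y1, x2, y2], axis=1)


def detect_and_map(monitor_image, head_detector, H):
    """Run the (user-supplied) crowd counting network, then map its boxes to the base map."""
    boxes = head_detector(monitor_image)                        # (N, 4) head-point boxes
    return map_boxes_to_base_map(boxes, H)


if __name__ == "__main__":
    H = np.eye(3)                                               # identity homography for the demo
    fake_detector = lambda img: np.array([[10, 10, 18, 18]])    # stands in for the network
    print(detect_and_map(None, fake_detector, H))
```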

[0025] Step S3: According to the non-maximum suppression algorithm, screen the head-point bounding boxes in the base map, delete repeated head-point bounding boxes, and calculate the total number of head-point bounding boxes as the total number of people in the observation scene.
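A minimal sketch of the Step S3 de-duplication follows, assuming each mapped head box carries a confidence score from the detector and using a greedy IoU-based non-maximum suppression with an assumed threshold of 0.3 (the patent does not state its threshold); the survivor count is reported as the scene-level total.

```python
# Sketch of Step S3: fuse head boxes from all cameras in base-map coordinates,
# suppress duplicates caused by overlapping camera views with greedy IoU-based
# non-maximum suppression, and report the survivor count as the crowd count.
import numpy as np


def nms_count(boxes, scores, iou_threshold=0.3):
    """Return (kept_boxes, count) after greedy non-maximum suppression."""
    boxes = np.asarray(boxes, dtype=np.float32).reshape(-1, 4)
    scores = np.asarray(scores, dtype=np.float32)
    if boxes.size == 0:
        return boxes, 0

    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]              # highest-confidence boxes first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # Intersection of the current box with all remaining boxes.
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        iou = inter / (areas[i] + areas[order[1:]] - inter + 1e-9)
        # Keep only boxes whose overlap with the current one is below the threshold.
        order = order[1:][iou < iou_threshold]
    kept = boxes[keep]
    return kept, len(kept)


if __name__ == "__main__":
    # Two cameras seeing the same person produce near-duplicate base-map boxes.
    boxes = [(100, 100, 120, 120), (102, 101, 121, 122), (300, 300, 318, 320)]
    scores = [0.9, 0.8, 0.95]
    _, total_people = nms_count(boxes, scores)
    print(total_people)                          # -> 2
```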


Abstract

The invention discloses a multi-camera combined large-scene crowd counting method, comprising the following steps: S1, obtaining a base map and monitoring images of an observation scene, computing the homography matrix relating each monitoring image to the base map, mapping the monitoring images onto the base map, and constructing a scene model; S2, preprocessing the monitoring images, obtaining the information of all human head point bounding boxes in each monitoring image through a crowd counting neural network, and mapping the head point bounding boxes onto the base map; and S3, screening the head point bounding boxes in the base map according to a non-maximum suppression algorithm, deleting repeated head point bounding boxes, and calculating the total number of remaining head point bounding boxes as the total number of people in the observation scene. The method adapts well to multi-camera crowd counting in large scenes and can obtain an accurate crowd count in real time.

Description

Technical field

[0001] The invention belongs to the technical field of computer vision, and in particular relates to a multi-camera joint large-scene crowd counting method.

Background technique

[0002] With the increasing number of surveillance cameras in modern society, the spatial relationships between cameras have become more and more complex, and accurately counting the number of pedestrians across the entire scene from complex, overlapping surveillance video data is a major challenge for surveillance personnel. At present, crowd gathering behavior in the whole monitored scene still lacks effective management and control, and slow pedestrian flow caused by crowd congestion may lead to serious mass incidents. On the other hand, owing to the growing number of surveillance cameras and the expansion of their coverage, traditional manual inspection and analysis of surveillance content requires considerable human resources; it is time-consuming, laborious, inefficient, and cannot be performed in real time.


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/04, G06N3/08, G06V20/46, G06V20/53
Inventor: 周忠张鑫高松楚程翔杨元哲
Owner: BEIHANG UNIV