
A Method of People Flow Statistics Based on Spatial-Temporal Context

A spatio-temporal context technology and statistical method, applied in the field of people flow statistics based on spatio-temporal context, which solves the problems of low accuracy, slow processing speed and poor robustness of existing methods, and achieves the effects of high accuracy, fast processing speed and a high detection rate.

Publication Date: 2016-05-25 (Inactive)
Assignee: HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0005] To address the defects of the prior art, the purpose of the present invention is to provide a people flow counting method based on spatio-temporal context constraints, which aims to solve the problems of low accuracy, slow processing speed and poor robustness in existing methods.



Embodiment Construction

[0042] In order to make the purpose, technical solution and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it.

[0043] The terms used in the present invention are first explained below.

[0044] HOG: Histograms of Oriented Gradients, a descriptor based on histograms of gradient orientations. Its idea is that, in an image, the appearance and shape of a local target can be well described by the density distribution of gradient or edge directions. The specific implementation is: first divide the image into small connected regions, called cell units; then collect the gradient or edge orientation histogram of the pixels in each cell unit; finally, these histograms are combined to form the feature descriptor.
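The HOG pipeline summarized above (cells, per-cell orientation histograms, block normalization, concatenation) can be reproduced with off-the-shelf tooling. Below is a minimal sketch using OpenCV's HOGDescriptor; the window, block, and cell sizes are illustrative assumptions and not the parameters used in the patent.

```python
# Minimal HOG feature extraction sketch (OpenCV). The window/block/cell
# sizes below are illustrative assumptions, not the patent's parameters.
import cv2
import numpy as np

win_size = (32, 32)       # assumed size of a candidate head patch
block_size = (16, 16)     # blocks over which cell histograms are normalized
block_stride = (8, 8)     # step between overlapping blocks
cell_size = (8, 8)        # cell unit for each gradient orientation histogram
nbins = 9                 # orientation bins per cell histogram

hog = cv2.HOGDescriptor(win_size, block_size, block_stride, cell_size, nbins)

# Stand-in grayscale patch; in practice this would be a region around an
# expanded moving edge in the current frame.
patch = np.random.randint(0, 256, size=(32, 32), dtype=np.uint8)
features = hog.compute(patch)   # concatenated, block-normalized cell histograms
print(features.shape)
```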



Abstract

The invention discloses a people flow statistics method based on spatio-temporal context. The method comprises the following steps: for each frame of image in a gray sequence G, an expanded moving edge is extracted using the Sobel operator and an inter-frame difference method, and head target detection based on HOG features is conducted on each expanded moving edge, so that an initial detection target queue head_list is obtained; according to spatial constraints, false targets are deleted from the queue head_list, and gray-level correlation matching is conducted between the targets in the queue head_list and the final targets detected in all previous frames; the targets in a statistical queue people_list are tracked; the target positions in the statistical queue people_list are updated; and the targets in the statistical queue people_list are counted. Because temporal and spatial information are incorporated, the method guarantees a high detection rate, effectively reduces false targets, achieves high accuracy, supports real-time video processing, and maintains good invariance even under geometric and optical deformation of the images, so the method is robust.
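For illustration, the "expanded moving edge" step described in the abstract (Sobel edges gated by an inter-frame difference, then dilated) could be sketched as follows. The thresholds, 3x3 kernel, and function name are illustrative assumptions, not values or identifiers given in the patent.

```python
# Hedged sketch of moving-edge extraction: Sobel edges of the current frame,
# gated by an inter-frame difference, then dilated ("expanded").
# diff_thresh, edge_thresh and the 3x3 kernel are assumptions for illustration.
import cv2
import numpy as np

def expanded_moving_edge(prev_gray, curr_gray, diff_thresh=15, edge_thresh=50):
    # Sobel gradient magnitude of the current frame
    gx = cv2.Sobel(curr_gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(curr_gray, cv2.CV_32F, 0, 1, ksize=3)
    edge_mask = cv2.magnitude(gx, gy) > edge_thresh

    # Inter-frame difference marks pixels that changed between frames
    motion_mask = cv2.absdiff(curr_gray, prev_gray) > diff_thresh

    # Keep only edges that are also moving, then dilate to "expand" them
    moving_edge = np.logical_and(edge_mask, motion_mask).astype(np.uint8) * 255
    kernel = np.ones((3, 3), np.uint8)
    return cv2.dilate(moving_edge, kernel, iterations=1)
```

In a pipeline of this kind, HOG-based head detection would then be restricted to regions of the returned mask, and surviving detections matched against targets from earlier frames by a gray-level correlation measure such as normalized cross-correlation; the exact matching and counting rules are specified in the patent's claims.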

Description

Technical Field

[0001] The invention belongs to the technical field of pattern recognition, and more specifically relates to a method for counting people flow based on spatio-temporal context.

Background Technique

[0002] With the rapid development of the economy, counting pedestrian or passenger flow has become particularly important. First, it is an important market research method, and almost all large shopping malls and chain commercial outlets abroad carry it out before making marketing and management decisions. In addition, a people counting system can be applied to places such as airports and subways, where the extracted pedestrian counts provide an important basis for formulating corresponding strategies.

[0003] Target recognition and target tracking, which are involved in people flow statistics, have always been hot topics in the field of pattern recognition. The existing methods for extracting moving targets are a...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/62
Inventors: 桑农, 沙芳华, 王岳环, 黄锐, 胡静, 高常鑫, 彭章祥, 陈张一
Owner: HUAZHONG UNIV OF SCI & TECH