
People flow statistical method based on spatio-temporal context

A people flow statistics technology based on spatio-temporal context, applied in the field of people flow counting. It addresses the problems of poor robustness, low accuracy, and slow processing speed in existing methods, and achieves high accuracy, fast processing speed, and good invariance.

Publication Date: 2014-02-12 (Inactive)
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0005] Aiming at the defects of the prior art, the purpose of the present invention is to provide a method for counting people flow based on spatio-temporal context constraints, which aims to solve the problems of low accuracy, slow processing speed, and poor robustness in existing methods.


Embodiment Construction

[0042] In order to make the purpose, technical solution, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and examples. It should be understood that the specific examples described here are only used to explain the present invention, not to limit it.

[0043] The terms used in the present invention are first explained below.

[0044] HOG: Histogram of Oriented Gradients, a descriptor built from histograms of gradient directions. Its idea is that, within an image, the appearance and shape of a local target can be well described by the distribution of gradient directions or edge orientations. The specific implementation is: first, the image is divided into small connected regions, called cell units; then a histogram of the gradient or edge directions of the pixels in each cell unit is collected; finally, these histograms are combined to form the feature descriptor.
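As an illustration only (not the patent's own code), the following minimal Python sketch computes an HOG descriptor for a grayscale patch with scikit-image; the patch size, cell size, block size, and number of orientation bins are assumed values, not parameters taken from the patent.

# Illustrative HOG feature extraction (scikit-image); parameters are assumptions.
import numpy as np
from skimage.feature import hog

# Hypothetical 64x64 grayscale patch; in the method this would be a window
# cropped from the expanded moving-edge region of a video frame.
patch = np.random.rand(64, 64)

descriptor = hog(
    patch,
    orientations=9,           # gradient-orientation bins per cell histogram
    pixels_per_cell=(8, 8),   # cell size
    cells_per_block=(2, 2),   # block used for local contrast normalization
    block_norm="L2-Hys",
    feature_vector=True,      # concatenate block histograms into one vector
)
print(descriptor.shape)

The concatenated vector is what a classifier (for example, an SVM-based head detector) would consume.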


Abstract

The invention discloses a people flow statistical method based on spatio-temporal context. The method comprises the following steps: for each frame of a gray-level sequence G, an expanded moving edge is extracted with the Sobel operator and the inter-frame difference method, and HOG-based head detection is performed on each expanded moving-edge region, yielding an initial detection target queue head_list; according to spatial constraints, false targets are deleted from head_list, and gray-level correlation matching is performed between the targets in head_list and the final targets detected in all previous frames; the targets in the statistical queue people_list are tracked; the target positions in people_list are updated; and the targets in people_list are counted. Because temporal and spatial information is incorporated, the method maintains high detection efficiency while effectively reducing false targets, achieves high accuracy, supports real-time video processing, and keeps good invariance even under geometric and photometric deformation of the images, so the method is robust.
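To make the order of the steps in the abstract concrete, below is a minimal Python/OpenCV sketch of the per-frame flow, written from the abstract alone. The head-detection, spatial-constraint, and matching/tracking routines are placeholder parameters, and the thresholds and dilation kernel are assumed values, not details taken from the patent.

# Sketch of the per-frame flow described in the abstract (assumptions noted inline).
import cv2
import numpy as np

def expanded_moving_edge(prev_gray, cur_gray, diff_thresh=15, edge_thresh=40):
    # Sobel edge magnitude of the current frame
    gx = cv2.Sobel(cur_gray, cv2.CV_16S, 1, 0)
    gy = cv2.Sobel(cur_gray, cv2.CV_16S, 0, 1)
    edges = cv2.addWeighted(cv2.convertScaleAbs(gx), 0.5,
                            cv2.convertScaleAbs(gy), 0.5, 0)
    _, edge_mask = cv2.threshold(edges, edge_thresh, 255, cv2.THRESH_BINARY)

    # Inter-frame difference marks the moving region
    diff = cv2.absdiff(cur_gray, prev_gray)
    _, motion_mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)

    # Keep edges inside the moving region, then dilate ("expand") the result
    moving_edge = cv2.bitwise_and(edge_mask, motion_mask)
    kernel = np.ones((5, 5), np.uint8)   # assumed structuring element
    return cv2.dilate(moving_edge, kernel, iterations=1)

def process_frame(prev_gray, cur_gray, people_list,
                  detect_heads, apply_spatial_constraint, match_and_track):
    # Skeleton of one frame of the pipeline; the three callables are placeholders.
    mask = expanded_moving_edge(prev_gray, cur_gray)
    head_list = detect_heads(cur_gray, mask)           # HOG-based head detection
    head_list = apply_spatial_constraint(head_list)    # remove false targets
    people_list = match_and_track(cur_gray, head_list, people_list)  # gray correlation + tracking
    return people_list

Restricting HOG detection to the expanded moving-edge mask is what keeps the per-frame cost low, while the temporal matching against previously confirmed targets suppresses false detections, as the abstract describes.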

Description

Technical Field

[0001] The invention belongs to the technical field of pattern recognition, and more specifically relates to a people flow counting method based on spatio-temporal context.

Background Art

[0002] With the rapid development of the economy, counting pedestrian or passenger flow has become particularly important. It is an important market research method, and a step that almost all large shopping malls and chain retail outlets abroad carry out before making market and management decisions. In addition, a people counting system can be applied to places such as airports and subways, where the extracted pedestrian counts provide an important basis for formulating corresponding strategies.

[0003] Target recognition and target tracking, which are involved in people flow statistics, have long been hot topics in the field of pattern recognition. The existing methods for extracting moving targets are...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62
Inventors: 桑农沙芳华王岳环黄锐胡静高常鑫彭章祥陈张一
Owner: HUAZHONG UNIV OF SCI & TECH