
Crowd counting method based on multi-scale space guide perception aggregation network

An aggregation-network technology for crowd counting, applied in the field of computer vision. It addresses the problems of weakened information propagation during feature fusion, insufficient capture of detailed features, and high model complexity, thereby better handling scattered target distributions and improving the network's feature-capture and representation ability.

Pending Publication Date: 2022-07-01
HANGZHOU DIANZI UNIV +1

AI Technical Summary

Problems solved by technology

However, such methods do not capture detailed features sufficiently and therefore under-perceive targets in this task.
[0006] 3. Complex background interference is also an important factor affecting crowd counting accuracy. Previous models addressed this problem through self-attention mechanisms or image segmentation, but at the cost of high model complexity and computational expense.
[0007] 4. In feature fusion, naive fusion weakens the effectiveness of information propagation. In crowd counting in particular, spatial location information and high-level semantic information must be fused carefully while suppressing background interference.




Embodiment Construction

[0090] It should be noted that the embodiments of the present invention and the features of the embodiments may be combined with each other provided there is no conflict.

[0091] In the description of the present invention, it should be understood that terms indicating orientation or positional relationships, such as "center", "longitudinal", "horizontal", "top", "bottom", "front", "rear", "left", "right", "vertical", "inner", and "outer", are based on the orientations or positional relationships shown in the drawings. They are used only for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they should therefore not be construed as limiting the invention. In addition, the terms "first", "second", etc. are used for descriptive purposes only and should not be construed as indicating or implying relative importance.



Abstract

The invention discloses a crowd counting method based on a multi-scale space-guided perception aggregation network. The method comprises the following steps: S1, establishing a multi-scale feature extraction network; S2, inputting an image of any resolution into the multi-scale feature extraction network; S3, inputting the multi-scale features captured by the extraction network into a spatial guidance network, which outputs context-guided perception features and a guidance perception map; S4, passing the context-guided perception features and the guidance perception map to an attention fusion network, which outputs a density map, and constructing a density map training set; S5, establishing an adaptive scale loss function and performing adaptive training with the density map training set; S6, taking the image to be predicted as input, repeating steps S2-S5, and outputting the crowd count for that image. By aggregating multi-scale information over an adaptively captured spatial context with an efficient guidance method, the method improves counting accuracy and robustness.
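Step S4 of the abstract constructs a density map training set. The patent does not spell out how the ground-truth maps are built, but the standard practice in CNN-based crowd counting (assumed here, not confirmed by this document) is to place one normalized Gaussian kernel at each annotated head position, so that the map integrates to the crowd count. A minimal NumPy sketch under that assumption:

```python
import numpy as np

def gaussian_density_map(points, shape, sigma=4.0, radius=12):
    """Build a ground-truth density map from annotated head positions.

    Each truncated Gaussian kernel is renormalized to sum to 1, so the
    whole map integrates to the number of annotations exactly.
    """
    h, w = shape
    density = np.zeros((h, w), dtype=np.float64)
    for (x, y) in points:
        x, y = int(round(x)), int(round(y))
        # Kernel window, clipped to the image bounds.
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        xs = np.arange(x0, x1) - x
        ys = np.arange(y0, y1) - y
        kernel = np.exp(-(ys[:, None] ** 2 + xs[None, :] ** 2) / (2 * sigma ** 2))
        density[y0:y1, x0:x1] += kernel / kernel.sum()
    return density

heads = [(20, 30), (100, 50), (60, 60)]  # hypothetical (x, y) annotations
dmap = gaussian_density_map(heads, shape=(128, 128))
print(round(dmap.sum(), 6))  # integrates to the head count: 3.0
```

A fixed `sigma` is the simplest choice; many counting datasets instead use a geometry-adaptive `sigma` derived from nearest-neighbor head distances, which this sketch omits.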

Description

technical field

[0001] The invention relates to the technical field of computer vision, in particular to a crowd counting method based on a multi-scale space-guided perception aggregation network.

Background technique

[0002] With the increasing concentration of the global urban population, computer-vision-based crowd counting and recognition technologies play an important role in public safety, abnormal event detection, and urban traffic management. In sparse scenes containing single or few objects, object localization and detection techniques can easily and accurately count and identify people. With the development of deep learning, methods based on convolutional neural networks (CNNs) have achieved remarkable success in tasks such as image classification, pedestrian detection, and speech recognition. Researchers therefore introduced CNNs into crowd counting, and achieved good crowd density estimation results in...
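In the density estimation approach described in the background, the predicted count is the integral (sum) of the density map. Counting networks typically output a map at a lower resolution than the input, so the ground truth is usually downsampled by sum-pooling, which preserves the count; averaging would not. This detail is a common convention, not something the patent specifies; a small illustration:

```python
import numpy as np

def sum_pool(density, factor):
    """Downsample a density map by summing non-overlapping factor x factor blocks.

    Summing (rather than averaging) keeps the map's integral, i.e. the
    crowd count, unchanged at the lower resolution.
    """
    h, w = density.shape
    assert h % factor == 0 and w % factor == 0
    return density.reshape(h // factor, factor, w // factor, factor).sum(axis=(1, 3))

rng = np.random.default_rng(0)
full = rng.random((64, 64))          # stand-in for a full-resolution density map
small = sum_pool(full, 8)            # 64x64 -> 8x8, as if matching a network's output stride
print(np.isclose(full.sum(), small.sum()))  # True: the count is preserved
```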

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V20/52, G06V10/80, G06V10/82, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045, G06F18/253
Inventor: 张硕, 郑小青, 俞勇, 孔亚广, 赵晓东
Owner: HANGZHOU DIANZI UNIV