Pedestrian retrieval method for carrying out multi-feature fusion on the basis of neural network

A multi-feature fusion and neural network technology, applied to neural learning methods, biological neural network models, special data processing applications, etc. It addresses the problems that a single feature distance cannot be used to represent similarity, that the similarity between the other pedestrians in the results and the pedestrian to be queried is low, and that the retrieval process and feature distances are complex, and achieves the effects of overcoming low accuracy and low retrieval similarity, improving convenience, and being easy to apply.

Publication Date: 2017-08-22 (Inactive)
JINGZHOU POWER SUPPLY COMPANY STATE GRID HUBEI ELECTRIC POWER +1

AI Technical Summary

Problems solved by technology

[0002] At present, there are many methods for image retrieval and pedestrian retrieval, such as the image retrieval method CEDD (CEDD: Color and Edge Directivity Descriptor. A Compact Descriptor for Image Indexing and Retrieval, Savvas A., 2008) and the pedestrian retrieval method WHOS (Person Re-Identification by Iterative Re-Weighted Sparse Ranking, Giuseppe Lisanti, 2015). These methods achieve good retrieval results on some research datasets, such as the pedestrian retrieval dataset ViPeR (https://vision.soe.ucsc.edu/node/178), but for pedestrians in actual surveillance video the retrieval results are not ideal, and the methods need to be fused to form a new retrieval feature.
[0003] From the perspective of retrieval results, some methods such as WHOS do succeed in that the retrieval results include the pedestrian to be queried, but the similarity between the other pedestrians in the results and the pedestrian to be queried is not high, so they cannot provide the user with more references. For example, if the pedestrian to be queried wears a blue upper garment and black trousers, many of the top-ranked pedestrians in the results are not "blue upper body, black lower body". Other methods retrieve by body parts, such as "A General Method for Appearance-based People Search Based on Textual Queries" (R. Satta, 2012); the similarity between their retrieval results and the pedestrian to be queried is relatively high, but the retrieval process and feature distances are relatively complicated, require more than one explicit feature distance as well as a retrieval filtering step, and cannot use a single feature distance to represent the similarity.




Embodiment Construction

[0030] The pedestrian retrieval method that performs multi-feature fusion on the basis of a neural network is further described below with reference to Figures 1 to 9.

[0031] (1) Dimensions of the foreground mask:

[0032] For a detected pedestrian, a GMM (Gaussian Mixture Model) is used to compute the GMM foreground mask within the pedestrian box. In the RGB image block contained in the box, the pixels corresponding to the background of the GMM foreground mask are changed to gray, so as to eliminate interference from the background area. The height and width are then scaled to a standard size P×Q. The CNN foreground mask is a P×Q-dimensional matrix in which each element takes only three values: background 0, upper body 1, and lower body 2. See Figure 4: the following data, all of them P×Q-dimensional matrices, constitute the combination feature of the pedestrian mask: the magnitude of the optical flow vector, the direction of the optical flow vector, the GMM foreground mask, and the R part, G part, and B part of...
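A minimal Python/OpenCV sketch of this step is given below for illustration. The choice of BackgroundSubtractorMOG2 as the GMM background model, Farneback optical flow, the value of P×Q, and the gray fill value are assumptions made for this sketch; the patent only specifies that the combination feature stacks P×Q matrices for optical-flow magnitude, optical-flow direction, the GMM foreground mask, and the R, G, B channels with the background turned gray.

```python
import cv2
import numpy as np

P, Q = 128, 64  # assumed standard height/width; the patent only calls it PxQ


def pedestrian_mask_feature(prev_patch, curr_patch, gmm):
    """Build a (P, Q, 6) combination feature for one pedestrian box:
    optical-flow magnitude, optical-flow direction, GMM foreground mask,
    and the R, G, B channels with background pixels set to gray."""
    # GMM foreground mask from the background subtractor (nonzero = foreground)
    fg = (gmm.apply(curr_patch) > 0).astype(np.uint8)

    # Gray out background pixels in the RGB block to suppress background interference
    rgb = curr_patch.copy()
    rgb[fg == 0] = 128  # assumed gray value

    # Dense optical flow between consecutive frames (Farneback, assumed here)
    prev_gray = cv2.cvtColor(prev_patch, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_patch, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])

    # Scale every channel to the standard P x Q size
    def to_pq(x):
        return cv2.resize(x.astype(np.float32), (Q, P))

    channels = [to_pq(mag), to_pq(ang), to_pq(fg)] + \
               [to_pq(rgb[..., c]) for c in (2, 1, 0)]  # R, G, B (OpenCV stores BGR)
    return np.stack(channels, axis=-1)                  # shape (P, Q, 6)


# usage sketch: in practice the GMM background model is learned over the
# video frames of one camera and the pedestrian patch is cropped from them
gmm = cv2.createBackgroundSubtractorMOG2()
```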



Abstract

The invention relates to a video analysis technology, in particular to a pedestrian retrieval method that performs multi-feature fusion on the basis of a neural network. Features are computed for the pedestrians detected in a video and stored in a feature database of the set of pedestrians to be searched; features are then computed for the pedestrian to be queried and compared against the feature database to obtain a top-ranked, high-similarity retrieval result. Multiple retrieval features and feature distances are computed and fused using an optimal distance weight W, so that the retrieval results are similar to the queried pedestrian in both the upper body and the lower body, and a single feature distance is used for ranking, which improves retrieval convenience. The method has a wide application range, high accuracy, and is convenient to apply. It solves the problems of low accuracy and low retrieval similarity in surveillance-video-based pedestrian retrieval, and overcomes the problems that multiple detection methods and feature distances cannot be combined well, that the applicable range is narrow, and that application is complex.
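As a rough sketch of the fused ranking described in the abstract, the snippet below combines several per-feature distances into one distance using a weight vector W and sorts the gallery by that single value. The feature names, the Euclidean distance, the normalization, and the weight values are illustrative assumptions; the patent obtains its optimal weight W through its own procedure, which is not reproduced here.

```python
import numpy as np


def fused_distance(query_feats, gallery_feats, W):
    """Combine per-feature distances into a single ranking distance.

    query_feats   : dict of feature name -> 1-D query feature vector
    gallery_feats : dict of feature name -> (N, d) matrix for N gallery pedestrians
    W             : dict of feature name -> scalar fusion weight
    """
    n = next(iter(gallery_feats.values())).shape[0]
    total = np.zeros(n)
    for name, w in W.items():
        # per-feature Euclidean distance, normalized so scales are comparable
        d = np.linalg.norm(gallery_feats[name] - query_feats[name], axis=1)
        d = d / (d.max() + 1e-12)
        total += w * d
    return total  # one fused distance per gallery pedestrian


# usage sketch with hypothetical feature names and weights
rng = np.random.default_rng(0)
gallery = {"cnn": rng.normal(size=(100, 256)), "color_hist": rng.normal(size=(100, 64))}
query = {"cnn": rng.normal(size=256), "color_hist": rng.normal(size=64)}
W = {"cnn": 0.7, "color_hist": 0.3}  # assumed optimal distance weights
ranking = np.argsort(fused_distance(query, gallery, W))  # most similar first
```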

Description

technical field
[0001] The invention relates to a video analysis technology, in particular to a pedestrian retrieval method that performs multi-feature fusion on the basis of a neural network.
Background technique
[0002] At present, there are many methods for image retrieval and pedestrian retrieval, such as the image retrieval method CEDD (CEDD: Color and Edge Directivity Descriptor. A Compact Descriptor for Image Indexing and Retrieval, Savvas A., 2008) and the pedestrian retrieval method WHOS (Person Re-Identification by Iterative Re-Weighted Sparse Ranking, Giuseppe Lisanti, 2015). These methods achieve good retrieval results on some research datasets, such as the pedestrian retrieval dataset ViPeR (https://vision.soe.ucsc.edu/node/178), but for pedestrians in actual surveillance video the retrieval results are not ideal, and the methods need to be fused to form a new retrieval feature.
[0003] From the perspective of retrieval results, some methods, such as WHOS, although ...


Application Information

IPC(8): G06F17/30; G06K9/00; G06K9/62; G06N3/08
CPC: G06F16/784; G06N3/08; G06V40/103; G06V20/46; G06F18/22; G06F18/214
Inventor 吴耀文周学平廖宜良张修吴颖波张勇
Owner JINGZHOU POWER SUPPLY COMPANY STATE GRID HUBEI ELECTRIC POWER