Cross-view gait recognition method based on block horizontal pyramid spatial-temporal feature fusion model and gait reordering

A spatio-temporal feature fusion technology applied in the field of deep learning and pattern recognition. It addresses the fact that re-ranking has not previously been introduced into gait recognition, and achieves an improved recognition rate and robustness, improved matching accuracy, and low computational complexity.

Active Publication Date: 2021-09-24
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

In addition, re-ranking has been shown to deliver superior performance in identity-recognition tasks such as face recognition and pedestrian re-identification, but it has not yet been introduced into gait recognition.



Examples


Embodiment 1

[0061] A cross-view gait recognition method based on a block horizontal pyramid spatial-temporal feature fusion model and gait reordering, comprising:

[0062] (1) Obtain the training sample set and construct triplet combinations. Each triplet comprises an anchor sample, a positive sample, and a negative sample. The anchor sample is taken from a given viewing angle; the positive sample has the same identity as the anchor sample but is taken from a different viewing angle; the negative sample may be taken from any viewing angle but has an identity different from that of the anchor sample, as shown in Figure 1 and in the sampling sketch below.
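The following Python sketch illustrates one way such triplets could be sampled. The data layout (a list of dicts with 'id', 'view' and 'seq' fields) and the sampling loop are illustrative assumptions, not the patent's prescribed procedure.

```python
import random
from collections import defaultdict

def build_triplets(samples, num_triplets=1000, seed=0):
    """Sample (anchor, positive, negative) triplets from a gait training set.

    `samples` is assumed to be a list of dicts with hypothetical keys
    'id' (pedestrian identity), 'view' (camera viewing angle) and
    'seq' (the preprocessed silhouette sequence).
    """
    rng = random.Random(seed)
    by_id = defaultdict(list)
    for s in samples:
        by_id[s['id']].append(s)
    ids = list(by_id)
    if len(ids) < 2:
        raise ValueError("need at least two identities to form negatives")

    triplets, attempts = [], 0
    while len(triplets) < num_triplets and attempts < 100 * num_triplets:
        attempts += 1
        pid = rng.choice(ids)
        anchor = rng.choice(by_id[pid])          # anchor: some viewing angle
        # Positive: same identity, but a different viewing angle.
        positives = [s for s in by_id[pid] if s['view'] != anchor['view']]
        if not positives:
            continue
        positive = rng.choice(positives)
        # Negative: any viewing angle, but a different identity.
        neg_id = rng.choice([i for i in ids if i != pid])
        negative = rng.choice(by_id[neg_id])
        triplets.append((anchor, positive, negative))
    return triplets
```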

[0063] A. Preprocess the gait contour map: correct the contour so that the varying distance between the pedestrian and the camera does not introduce scale interference, then resize the corrected contour map; a minimal preprocessing sketch follows.
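Below is a hedged Python sketch of this preprocessing step. The bounding-box crop, centred padding, and the 64×64 output size are common choices assumed for illustration; the excerpt does not specify the exact correction recipe.

```python
import numpy as np
import cv2  # OpenCV, assumed available for resizing

def preprocess_silhouette(silhouette, out_size=(64, 64)):
    """Correct and resize a single binary gait silhouette.

    Cropping to the pedestrian's bounding box and padding to a centred
    square is one common way to remove the scale difference caused by
    the pedestrian's distance from the camera (an assumption here).
    """
    ys, xs = np.nonzero(silhouette)
    if len(ys) == 0:                      # empty frame, nothing to correct
        return np.zeros(out_size, dtype=np.uint8)
    # Crop to the tight bounding box of the foreground pixels.
    crop = silhouette[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    # Pad the narrower dimension so the body sits in a centred square.
    h, w = crop.shape
    side = max(h, w)
    pad_y, pad_x = (side - h) // 2, (side - w) // 2
    square = np.zeros((side, side), dtype=np.uint8)
    square[pad_y:pad_y + h, pad_x:pad_x + w] = crop
    # Resize to the fixed input size expected by the network.
    return cv2.resize(square, out_size, interpolation=cv2.INTER_NEAREST)
```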

[0064] Given a gait dataset containing N pedestrians a...


Abstract

The invention provides a cross-view gait recognition method based on a block horizontal pyramid spatial-temporal feature fusion model and gait reordering. The method comprises the following steps: a training sample set is obtained and triplet combinations are constructed; after preprocessing, a block horizontal pyramid spatial-temporal feature fusion model is constructed and trained; a registration sample set and a gait sample to be recognized are obtained and, after preprocessing, preliminary gait recognition is carried out: the registration sample set and the gait sample of the identity to be recognized are fed into the trained block horizontal pyramid spatial-temporal feature fusion model to obtain a registration feature library and the gait features, and a candidate set of preliminary candidate identities is obtained by Euclidean distance calculation; a mutual-neighbor feature set is then computed from the candidate set, and the identity of the sample whose feature has the smallest Euclidean distance is output as the identity of the sample to be recognized, giving the final recognition result. The method fuses spatial and temporal features with relatively low computational complexity and without increasing the number of learnable parameters, and achieves a high recognition rate.
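The retrieval and mutual-neighbor step described above can be illustrated with the following Python sketch. The neighborhood size k, the fallback when no mutual neighbors are found, and the exact definition of the mutual-neighbor set are assumptions made for illustration, not the patent's exact gait reordering algorithm.

```python
import numpy as np

def recognize(probe_feat, gallery_feats, gallery_ids, k=10):
    """Preliminary retrieval plus a simple mutual-neighbour re-ranking step.

    `gallery_feats` is an (N, D) array from the registration feature
    library and `probe_feat` a (D,) vector; k is a hypothetical
    neighbourhood size not specified in the abstract.
    """
    # Step 1: Euclidean distances from the probe to every registered feature.
    d_probe = np.linalg.norm(gallery_feats - probe_feat, axis=1)
    candidates = np.argsort(d_probe)[:k]          # preliminary candidate set

    # Step 2: keep only mutual neighbours, i.e. candidates for which the
    # probe is in turn among their k nearest points in the pooled set.
    pool = np.vstack([gallery_feats, probe_feat[None, :]])
    probe_index = len(gallery_feats)
    mutual = []
    for c in candidates:
        d_c = np.linalg.norm(pool - gallery_feats[c], axis=1)
        if probe_index in np.argsort(d_c)[1:k + 1]:   # rank 0 is the sample itself
            mutual.append(c)
    if not mutual:                                    # fall back to the top-1 candidate
        mutual = [candidates[0]]

    # Step 3: output the identity of the nearest mutual-neighbour feature.
    best = min(mutual, key=lambda c: d_probe[c])
    return gallery_ids[best]
```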

Description

Technical field

[0001] The invention relates to a cross-view gait recognition method based on a block horizontal pyramid spatial-temporal feature fusion model and gait reordering, and belongs to the technical field of deep learning and pattern recognition.

Background technique

[0002] Gait recognition is a biometric technology that identifies people by their walking posture: it distinguishes individuals based on the differences in their gait. Compared with existing biometric technologies such as face recognition, fingerprint recognition, vein recognition, and iris recognition, gait recognition works at low resolution, is little affected by the environment, and is easy to collect. In addition, existing technologies such as face recognition, fingerprint recognition, vein recognition, and iris recognition require the contact and cooperation of the...


Application Information

IPC (8): G06K9/00, G06K9/62
CPC: G06F18/253, G06F18/214
Inventor: 贲晛烨, 翟鑫亮, 陈雷, 李玉军, 魏文辉, 宋延新
Owner: SHANDONG UNIV