
Multi-view gait identification method based on adaptive three dimensional human motion statistic model

A statistical human motion modeling technology in the field of computer vision and pattern recognition that addresses the problems of incomplete multi-view training sets and low recognition accuracy, and achieves the effect of handling occlusion.

Status: Inactive | Publication Date: 2016-10-26
武汉盈力科技股份有限公司

AI Technical Summary

Problems solved by technology

[0005] Aiming at the problems that the multi-view training sets of existing gait recognition technologies are difficult to complete and that their recognition accuracy is not high, the present invention proposes a multi-view gait recognition method based on an adaptive three-dimensional human motion statistical model.



Examples


Embodiment Construction

[0021] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0022] Figure 1 shows the algorithm flow chart of the present invention in the training stage.

[0023] Training uses video of a person walking, captured simultaneously by a multi-camera system. The person is extracted from each frame, feature points are then extracted only within the person region, and the feature points are matched across views to generate a set of matching points. Each group of matching points consists of the image points of the same object point in multiple images. Then, based on the principle of least square error, the three-dimensional coordinates of the object point are solved through the collinearity equations. In this way, the coordinates of all object-space points in the matching point set can be calculated, and a point cloud of the human body can be generated for each frame. Assuming that all cameras shoot at the same frame rate, and a complete gait c...
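The least-squares solution of the collinearity equations described above can be sketched as follows, assuming calibrated cameras whose 3×4 projection matrices are already known; under that assumption the problem reduces to standard linear (DLT) triangulation solved via SVD. The function names and data layout are illustrative, not the patent's actual implementation.

```python
import numpy as np

def triangulate_point(proj_mats, image_pts):
    """Least-squares triangulation of one object point from its matched
    image points in several views.

    proj_mats : list of 3x4 camera projection matrices (assumed known
                from calibration -- an assumption, not stated in the text).
    image_pts : list of (u, v) pixel coordinates, one per view.
    Returns the 3D coordinates minimizing the algebraic least-squares error.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, image_pts):
        # Each view contributes two linear equations in the homogeneous
        # object-point coordinates X = (x, y, z, 1).
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # The least-squares solution is the right singular vector belonging
    # to the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

def frame_point_cloud(proj_mats, matched_groups):
    """Triangulate every group of matched image points to build the
    per-frame point cloud of the walking person."""
    return np.array([triangulate_point(proj_mats, grp) for grp in matched_groups])
```

Applied to every matching group of a frame, this yields the per-frame human body point cloud from which the statistical model is built.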



Abstract

The invention discloses a multi-view gait identification method based on an adaptive three-dimensional human motion statistical model. The images of the training set come from multiple camera systems, and a target point cloud is generated through multi-view three-dimensional reconstruction so as to construct a three-dimensional statistical human body model. A virtual camera applies a projection transformation to the three-dimensional statistical human body model to obtain a synthetic binary human-silhouette image at an arbitrary view angle, which is used for further extraction of various gait features. Based on the three-dimensional human body model, a skeleton model is established to provide a reasonable range for the degree of freedom of each joint; in addition, a three-dimensional human motion statistical model is established, which adapts to various walking situations through parameter adjustment. In the training phase, a gait feature database is built with this method. In the identification phase, the same gait features are extracted from the video and compared with the features in the database, and the best identification result is found based on a nearest-sample classifier and a highest-scoring strategy.
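As a rough illustration of two steps named in the abstract, the sketch below first projects the 3D model points through a virtual camera to rasterize a binary silhouette at a chosen view angle, and then identifies a probe by nearest-sample matching with a highest-score (voting) strategy. The camera parameters (K, R, t), image size, and gallery layout are assumptions made for this example; a real implementation would render the model surface (and fill the silhouette) rather than plot isolated points.

```python
import numpy as np

def render_silhouette(model_points, K, R, t, img_size=(128, 128)):
    """Project 3D model points through a virtual camera (intrinsics K,
    rotation R, translation t) and rasterize them into a binary
    silhouette image for the chosen view angle."""
    h, w = img_size
    cam = R @ model_points.T + t.reshape(3, 1)   # camera coordinates
    front = cam[2] > 1e-9                        # keep points in front of the camera
    uvw = K @ cam[:, front]
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]      # perspective division
    sil = np.zeros((h, w), dtype=np.uint8)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    sil[v[ok].astype(int), u[ok].astype(int)] = 1
    return sil

def identify(probe_feats, gallery):
    """Nearest-sample matching with a highest-score strategy: each probe
    feature votes for the subject owning its closest gallery sample, and
    the subject with the most votes is returned.  `gallery` maps a
    subject id to an array of feature vectors (hypothetical layout)."""
    votes = {}
    for f in probe_feats:
        best_id, best_d = None, np.inf
        for sid, feats in gallery.items():
            d = np.linalg.norm(feats - f, axis=1).min()
            if d < best_d:
                best_id, best_d = sid, d
        votes[best_id] = votes.get(best_id, 0) + 1
    return max(votes, key=votes.get)
```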

Description

Technical Field

[0001] The invention relates to computer vision and pattern recognition, and in particular to a gait recognition method based on an adaptive three-dimensional human motion statistical model.

Background Technique

[0002] Gait recognition is a biometric identification technology that identifies individuals through each person's unique walking style. Compared with first-generation biometric technologies such as fingerprint recognition, face recognition, and iris recognition, gait recognition requires no physical contact, has low requirements on image resolution, and works at long range. So far, gait may be the only biological characteristic that can be recognized remotely. Therefore, gait recognition technology has broad commercial application prospects in security monitoring and other fields. [0003] With the development of science and technology and the improvement...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/00G06T7/00G06T7/20
CPCG06T2207/10016G06T2207/30196G06T2207/20081G06V40/25
Inventor 巨辉杨斌曹顺
Owner 武汉盈力科技股份有限公司