
Human action recognition method fusing a deep neural network model and binary hashing

A deep-neural-network technology for human action recognition, applied in character and pattern recognition, instruments, computer components, etc. It addresses problems such as high computational complexity, large parameter counts, and the difficulty of capturing motion dynamics, achieving good recognition accuracy with simple operation.

Inactive Publication Date: 2018-06-01
CHONGQING UNIV OF POSTS & TELECOMM
Cites 12 · Cited by 35

AI Technical Summary

Problems solved by technology

Obviously, a single frame is not enough to capture the dynamics of an action effectively, while a large number of frames requires many parameters, which leads to model overfitting, demands a larger training set, and raises computational complexity.

Method used



Embodiment Construction

[0038] The technical solutions in the embodiments of the present invention will be described clearly and in detail below in conjunction with the accompanying drawings. The described embodiments are only a part of the embodiments of the present invention.

[0039] The technical solutions of the present invention to solve the above technical problems are:

[0040] As shown in Figures 1-2, a human body action recognition method based on a deep network model and binary hashing includes the following steps:

[0041] 1. Extract the depth features of the video

[0042] The samples in the experimental video library are divided into a training set and a test set, and FC-layer features are extracted from all samples. The detailed steps of the extraction method are as follows:

[0043] 1) Split the input video into frames

[0044] In order to extract the local feature information of the video, the input video containing human movements is divided i...



Abstract

The present invention provides a human action recognition method fusing a deep neural network model and binary hashing, belonging to the technical field of pattern recognition. The method comprises the steps of: preprocessing an action recognition database, dividing it into frame sequences, calculating optical-flow maps, employing a pose estimation algorithm to calculate the coordinates of human joint points, and using the resulting coordinates to extract video region frames; employing a pre-trained VGG-16 network model to extract FC (fully connected) features from the RGB streams and optical-flow streams of the videos, selecting key frames from the video frame sequences, and computing the differences between the FC features of the key frames; binarizing these differences; employing a binary hashing method to obtain a uniform feature expression for each video; employing several normalization methods such as L1 and L2 to fuse the uniform feature expressions with the PCNN features into the final feature expression of each video; and finally employing a support vector machine algorithm to train a classifier to recognize the human action videos. The method achieves a high action recognition accuracy.
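The key-frame difference, binarization, and pooling steps in the abstract can be sketched with NumPy. This is a hedged illustration under assumptions: the 4096-d key-frame features stand in for VGG-16 fc7 outputs, and the majority-vote pooling into one fixed-length binary code is one plausible reading of "uniform feature expression", not necessarily the patent's exact hashing scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for FC features of K = 5 selected key frames (4096-d each).
key_frame_feats = rng.normal(size=(5, 4096))

# Differences between consecutive key-frame features capture temporal change.
diffs = np.diff(key_frame_feats, axis=0)            # shape (K-1, 4096)

# Binarize: 1 where the feature value increased between key frames, else 0.
bits = (diffs > 0).astype(np.uint8)                 # shape (K-1, 4096)

# Pool the per-pair bit vectors into one fixed-length binary code per video
# (majority vote across key-frame pairs; one of several reasonable poolings).
video_code = (bits.mean(axis=0) >= 0.5).astype(np.uint8)  # shape (4096,)

# L2 normalization before fusing with other features (e.g. the PCNN features).
fused = video_code.astype(np.float64)
fused /= np.linalg.norm(fused) + 1e-12
print(video_code.shape, fused.shape)
```

The resulting fixed-length code is what would be concatenated with the PCNN features and fed to the SVM classifier in the final step.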

Description

Technical field

[0001] The invention belongs to the technical field of image and video processing, and particularly relates to a human body action recognition method based on a deep neural network model combined with binary hashing.

Background technique

[0002] In recent years, human action recognition has made great progress in the fields of pattern recognition and image processing and analysis, and some human action recognition systems have been put into practical use. Human action recognition algorithms mainly comprise two steps: action representation and action classification. How to encode human action information is a critical step for the subsequent action classification. Ideally, an action representation algorithm must not only be robust to changes in human appearance, scale, complex backgrounds, and action speed, but also retain enough information for the classifier to distinguish action types. However, complex backgrounds and the variability of the human body have...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62
CPC: G06V40/20, G06V20/40, G06F18/214
Inventors: Li Weisheng (李伟生), Feng Chen (冯晨), Xiao Bin (肖斌)
Owner CHONGQING UNIV OF POSTS & TELECOMM