
A static gesture recognition method based on a multi-scale convolution neural network

A technology of convolutional neural networks and gesture recognition, applied in the field of static gesture recognition with multi-scale convolutional neural networks. It addresses problems of existing methods such as strong sensitivity to the external environment, insufficiently fine feature extraction, and poor stability, and achieves effects such as reduced consumption and dimensionality reduction.

Status: Inactive
Publication Date: 2019-01-15
Assignee: CENT SOUTH UNIV
Cites: 3 · Cited by: 54

AI Technical Summary

Problems solved by technology

[0006] The problem to be solved by the present invention is that existing image-processing-based gesture recognition technology has deficiencies: it is strongly affected by the external environment, manual extraction of gesture features is cumbersome, the extracted features are not fine enough, and stability is poor.



Examples


Embodiment 1

[0077] Embodiment 1 first collects and preprocesses static gesture picture data under simple and complex backgrounds, and the data are divided into training data and test data. After the data are obtained, the experimental test environment is built, divided into hardware and software: the hardware environment uses an Nvidia GTX1060 graphics card and an Intel 6th-generation i7 processor, and the software environment uses the Ubuntu 16.04 system and the Caffe framework developed by the Berkeley Vision and Learning Center (BVLC). Next comes the design of the multi-scale convolutional neural network, that is, determining the number of neural network layers, selecting appropriate scale features, and so on. The labeled training data are then put into this network structure for learning. Finally, test data samples are input for testing, and the final static gesture recognition accuracy is obtained; compared with the experimental accuracy obtained by the convolutional neural network framework under ...
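As a rough illustration of the network design step described in Embodiment 1, the following is a minimal sketch using pycaffe's NetSpec API (Caffe is the framework named in the embodiment): several parallel convolutions with different kernel sizes process the same input and their feature maps are concatenated, so features at several scales are learned jointly. The layer counts, kernel sizes, filter numbers, class count, and LMDB path here are illustrative assumptions; the excerpt does not disclose the exact architecture of the patented network.

```python
# Illustrative pycaffe sketch of a multi-scale convolutional block.
# Branch sizes, class count, and the LMDB path are assumptions,
# not taken from the patent text.
import caffe
from caffe import layers as L, params as P


def multi_scale_gesture_net(lmdb_path, batch_size=64, num_classes=10):
    n = caffe.NetSpec()
    # Preprocessed gesture images and labels from an LMDB database (assumed path).
    n.data, n.label = L.Data(batch_size=batch_size, backend=P.Data.LMDB,
                             source=lmdb_path,
                             transform_param=dict(scale=1.0 / 255), ntop=2)
    # Three parallel branches with different receptive fields (the "scales").
    n.conv1x1 = L.Convolution(n.data, kernel_size=1, num_output=32,
                              weight_filler=dict(type='xavier'))
    n.conv3x3 = L.Convolution(n.data, kernel_size=3, pad=1, num_output=32,
                              weight_filler=dict(type='xavier'))
    n.conv5x5 = L.Convolution(n.data, kernel_size=5, pad=2, num_output=32,
                              weight_filler=dict(type='xavier'))
    n.relu1 = L.ReLU(n.conv1x1, in_place=True)
    n.relu3 = L.ReLU(n.conv3x3, in_place=True)
    n.relu5 = L.ReLU(n.conv5x5, in_place=True)
    # Concatenate the multi-scale feature maps along the channel dimension.
    n.concat = L.Concat(n.conv1x1, n.conv3x3, n.conv5x5)
    n.pool = L.Pooling(n.concat, kernel_size=2, stride=2, pool=P.Pooling.MAX)
    # Classifier head: one fully connected layer and a softmax loss.
    n.fc = L.InnerProduct(n.pool, num_output=num_classes,
                          weight_filler=dict(type='xavier'))
    n.loss = L.SoftmaxWithLoss(n.fc, n.label)
    return n.to_proto()


if __name__ == '__main__':
    with open('multi_scale_gesture_train.prototxt', 'w') as f:
        f.write(str(multi_scale_gesture_net('train_lmdb')))
```

Writing the generated prototxt to disk and training it with a standard Caffe solver would then mirror the "put the labeled training data into this network for learning" step; the solver settings are likewise not specified in the excerpt.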



Abstract

A static gesture recognition method based on a multi-scale convolutional neural network is proposed. The invention carries out an optimized design based on the Caffe deep learning framework and uses image processing techniques to recognize static gesture pictures. First, static gesture image data under simple and complex backgrounds are collected and preprocessed, and the data are divided into training data and test data. After the experiment and test environment is set up, the multi-scale convolutional neural network is designed, that is, the number of neural network layers is determined, appropriate scale features are selected, and so on. The training data are put into the network structure for learning, then the test data samples are input for testing, and the recognition accuracy is obtained. The invention can automatically learn gesture features by using convolutional layers, and overcomes the shortcomings of manual feature extraction as well as the shortcomings of common convolutional neural networks, whose feature extraction is not precise and comprehensive enough and whose stability is not good enough; the recognition accuracy is higher while the training time remains comparable. The method has strong flexibility and wide applicability.
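The abstract's first step, collecting and preprocessing gesture images under simple and complex backgrounds and splitting them into training and test data, could look roughly like the sketch below. It assumes OpenCV-readable images stored in one folder per gesture class, a 64x64 grayscale target size, and an 80/20 split; none of these specifics appear in the excerpt.

```python
# Illustrative preprocessing sketch (assumptions: images stored as
# <root>/<class_name>/*.jpg, resized to 64x64 grayscale, 80/20 split).
import os
import random

import cv2
import numpy as np


def load_and_preprocess(root_dir, size=(64, 64), train_ratio=0.8, seed=0):
    samples = []
    class_names = sorted(d for d in os.listdir(root_dir)
                         if os.path.isdir(os.path.join(root_dir, d)))
    for label, name in enumerate(class_names):
        class_dir = os.path.join(root_dir, name)
        for fname in os.listdir(class_dir):
            img = cv2.imread(os.path.join(class_dir, fname), cv2.IMREAD_GRAYSCALE)
            if img is None:
                continue  # skip unreadable files
            img = cv2.resize(img, size).astype(np.float32) / 255.0
            samples.append((img, label))
    # Shuffle and split into training and test data.
    random.Random(seed).shuffle(samples)
    n_train = int(len(samples) * train_ratio)
    return samples[:n_train], samples[n_train:]
```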

Description

Technical field

[0001] The invention belongs to the technical field of image processing using deep learning, and relates to a static gesture recognition method of a multi-scale convolutional neural network.

Background technique

[0002] With the rapid development of computer technology, communication technology, hardware equipment, etc., human-computer interaction has become more and more frequent in daily life. Human non-verbal communication (communication of gestures, body postures and facial expressions) accounts for two-thirds of all human communication. Moreover, gestures have the advantages of being natural, intuitive, and easy to learn, and have become a research hotspot.

[0003] According to the classification of hardware devices, gesture recognition technology can be mainly divided into data glove-based gesture recognition technology and camera-based gesture recognition technology. Gesture recognition technology based on data gloves has the advantages of accurat...


Application Information

IPC(8): G06K9/00, G06N3/08
CPC: G06N3/08, G06N3/084, G06V40/113
Inventor: 谢斌, 宋迪, 喻仲斌
Owner: CENT SOUTH UNIV