
A static gesture real-time recognition method based on YOLOv3

A recognition method and gesture technology, applied in the field of deep learning and gesture recognition, addressing the problem that such an application has not previously been publicly reported.

Active Publication Date: 2019-02-12
HEFEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, there is no public report on the application of YOLOv3 in the field of gesture recognition.


Examples


Embodiment Construction

[0072] Referring to figure 1 and figure 2, the static gesture real-time recognition method based on YOLOv3 in this embodiment comprises the following steps: making a training set, generating a transferred Darknet-53 model, improving the candidate-box parameters, and performing real-time gesture recognition.
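The candidate-box improvement step can be pictured with a minimal sketch. The abstract states that K-Means clustering is applied to the initial candidate-box (anchor) parameters; the 1 - IoU distance on box width/height and the cluster count used here are assumptions taken from common YOLO practice rather than from the patent text, and the box list wh would come from the training-set annotations.

# Minimal sketch (not the patent's exact procedure): cluster the width/height of
# labelled gesture boxes with K-Means to obtain initial candidate-box (anchor) sizes.
import numpy as np

def iou_wh(wh, centroids):
    """IoU between boxes and centroids, comparing width/height only."""
    w, h = wh[:, 0:1], wh[:, 1:2]              # shape (N, 1)
    cw, ch = centroids[:, 0], centroids[:, 1]  # shape (K,)
    inter = np.minimum(w, cw) * np.minimum(h, ch)
    union = w * h + cw * ch - inter
    return inter / union                       # shape (N, K)

def kmeans_anchors(wh, k=9, iters=100, seed=0):
    """Return k anchor (width, height) pairs, sorted by box area."""
    rng = np.random.default_rng(seed)
    centroids = wh[rng.choice(len(wh), k, replace=False)]
    for _ in range(iters):
        assign = np.argmax(iou_wh(wh, centroids), axis=1)   # nearest = highest IoU
        new = np.array([wh[assign == i].mean(axis=0) if np.any(assign == i)
                        else centroids[i] for i in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids[np.argsort(centroids.prod(axis=1))]

# Example with made-up gesture box sizes (pixels, on 640x480 images):
wh = np.array([[55, 80], [60, 90], [120, 150], [40, 70], [200, 220], [90, 130]])
print(kmeans_anchors(wh, k=3))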

[0073] Step 1. Make the training set according to the following process:

[0074] 1.1. Use a Kinect device to capture four types of images for each gesture scene, namely an IR image, a registered RGB image (Registration of RGB), an RGB image and a Depth image; the resolution of the captured images is 640×480. To improve the robustness of the recognition method, the dataset is made with varying image resolution, number of gestures contained in a single picture, light intensity, shooting distance, background, and degree of gesture overlap. According to the number of gestures contained in a single picture (1, 2, 3, 4 or 5), 7 groups of gesture pictures were taken under different conditions, inc...
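Step 1.1 only describes what is captured; as an illustration, a hypothetical on-disk layout pairing the four image types per capture could be handled as below. The folder names ir/, reg_rgb/, rgb/, depth/ and labels/, the shared file stems, and the function build_train_lists are all assumptions for this sketch, not part of the patent.

# Minimal sketch: one YOLO-format label file per capture stem, and one image per
# modality with the same stem; write one image list per modality for training.
from pathlib import Path

ROOT = Path("gesture_dataset")                  # hypothetical dataset root
MODALITIES = ("ir", "reg_rgb", "rgb", "depth")  # the four Kinect image types

def build_train_lists(root=ROOT):
    """Write one image-list file per modality so each can drive a YOLOv3 training run."""
    label_dir = root / "labels"
    if not label_dir.is_dir():
        raise FileNotFoundError(f"expected YOLO-format labels in {label_dir}")
    stems = sorted(p.stem for p in label_dir.glob("*.txt"))
    for mod in MODALITIES:
        listed = [str(root / mod / f"{stem}.png")
                  for stem in stems
                  if (root / mod / f"{stem}.png").exists()]
        (root / f"train_{mod}.txt").write_text("\n".join(listed))
        print(f"{mod}: {len(listed)} of {len(stems)} captures have this image type")

if __name__ == "__main__":
    build_train_lists()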



Abstract

The invention discloses a static gesture real-time recognition method based on YOLOv3, which comprises the following steps: making a training set, generating a transferred Darknet-53 model, improving the initial candidate-box parameters, and carrying out real-time gesture recognition. The method is based on the YOLOv3 convolutional neural network model and uses the four types of image datasets collected by Kinect equipment in place of the usual RGB image datasets; the recognition results of the four types of Kinect test images are fused to effectively improve recognition accuracy. The K-Means clustering algorithm is used to improve the initial candidate-box parameters, which effectively improves the recognition speed, and a transfer learning method is adopted to reduce the training time of the model.
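The abstract says the recognition results of the four types of Kinect test images are fused, but the visible text does not give the fusion rule. The sketch below assumes one simple possibility: detections from the four detectors that agree on the gesture class and overlap strongly are merged by averaging their boxes and confidences. The function names and the (class, confidence, x1, y1, x2, y2) tuple format are illustrative assumptions.

# Minimal sketch of fusing per-modality YOLOv3 detections (assumed fusion rule).
def iou(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0

def fuse(detections_per_modality, iou_thr=0.5):
    """detections_per_modality: list of 4 lists of (cls, conf, x1, y1, x2, y2)."""
    pool = [d for dets in detections_per_modality for d in dets]
    pool.sort(key=lambda d: d[1], reverse=True)          # strongest detections first
    fused, used = [], [False] * len(pool)
    for i, (cls, conf, *box) in enumerate(pool):
        if used[i]:
            continue
        group = [(conf, box)]
        for j in range(i + 1, len(pool)):
            cj, fj, *bj = pool[j]
            if not used[j] and cj == cls and iou(box, bj) >= iou_thr:
                group.append((fj, bj))
                used[j] = True
        n = len(group)
        avg_box = [sum(b[k] for _, b in group) / n for k in range(4)]
        avg_conf = sum(c for c, _ in group) / n
        fused.append((cls, avg_conf, *avg_box))
    return fused

# Example: two of the four modalities detect gesture class 3 in roughly the same place.
dets = [[(3, 0.9, 100, 100, 160, 200)], [(3, 0.8, 105, 98, 162, 205)], [], []]
print(fuse(dets))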

Description

Technical Field
[0001] The invention relates to the field of deep learning and gesture recognition, and in particular to a real-time static gesture recognition method based on YOLOv3.
Background Technique
[0002] In recent years, with the rapid development of artificial intelligence technology, the way of human-computer interaction has also changed greatly. From typing and touch screens to voice, the development of interaction methods has brought convenience and an excellent user experience to people's operations. However, a more efficient and comfortable way of interaction is to let machines directly understand human body language. Gestures are the simplest and most convenient form of body language and have broad application prospects.
[0003] In gesture-based human-computer interaction, a very important process is to recognize gestures. Traditional gesture recognition includes template-based, data-glove-based, and hidden-Markov-model-based methods; among them, the me...

Claims


Application Information

IPC(8): G06K9/00; G06K9/62
CPC: G06V40/113; G06V40/117; G06F18/23213; G06F18/214
Inventor: 张勇, 张强, 徐林嘉, 刘佳慧, 王鑫源
Owner: HEFEI UNIV OF TECH