
Binocular vision gesture recognition method and device based on range finding assistance

A binocular vision gesture recognition technology, applied in the field of computer vision, which addresses the problem of low gesture recognition accuracy and achieves a simple device structure, strong reliability, and improved recognition accuracy.

Inactive Publication Date: 2018-01-09
GUANGZHOU UNIVERSITY

AI Technical Summary

Problems solved by technology

[0004] The object of the present invention is to propose a binocular vision gesture recognition method and device based on ranging assistance, which can effectively improve the accuracy of gesture recognition.



Examples


Embodiment 1

[0038] Referring to Figure 1, this embodiment provides a binocular vision gesture recognition method based on ranging assistance, comprising:

[0039] Step S101, collecting images through the binocular vision unit, and acquiring distance measurement data through the distance measurement unit at the same time;

[0040] Step S102, calculating the depth distance from the binocular vision unit to the hand according to the ranging data;

[0041] Step S103, processing the image collected by the binocular vision unit according to the principle of stereo imaging to generate a depth image;
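As a hedged illustration of the stereo imaging principle invoked in step S103 (this is not the patent's own implementation): for a rectified binocular pair, depth follows from disparity as Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the per-pixel disparity. A minimal numpy sketch, assuming a disparity map has already been computed by stereo matching:

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Convert a disparity map (pixels) to a depth map (metres)
    using the rectified-stereo relation Z = f * B / d."""
    depth = np.full(disparity.shape, np.inf)
    valid = disparity > 0                    # zero disparity -> point at infinity
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# Example: 700 px focal length, 6 cm baseline, 42 px disparity -> 1.0 m
d = np.array([[42.0, 0.0]])
print(disparity_to_depth(d, focal_px=700.0, baseline_m=0.06))
```

The function names and parameter values here are illustrative; in practice the disparity map would come from a stereo-matching step on the rectified image pair.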

[0042] Step S104, according to the depth distance, extracting images whose depth values are within a preset range from the depth image for gesture segmentation, and obtaining a gesture segmentation map;

[0043] Step S105, performing gesture recognition on the gesture segmentation map, and outputting a gesture recognition result.
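The depth-band segmentation of step S104 is the core of the ranging assistance: the measured hand distance defines a preset depth window, and only pixels inside it are kept. A minimal sketch, assuming depth values in metres (the function name and band width are hypothetical, not from the patent):

```python
import numpy as np

def segment_hand(depth_image, hand_distance, band=0.15):
    """Step S104: keep only pixels whose depth lies within a preset
    band around the hand distance measured by the ranging unit."""
    lo, hi = hand_distance - band, hand_distance + band
    mask = (depth_image >= lo) & (depth_image <= hi)
    return np.where(mask, depth_image, 0.0)   # gesture segmentation map

# Example: hand measured at 0.8 m; background pixels at ~2.5 m are suppressed.
depth = np.array([[0.78, 0.82, 2.50],
                  [0.75, 0.90, 2.40]])
print(segment_hand(depth, hand_distance=0.8))
```

The recognized gesture of step S105 would then be classified from this segmentation map; the classifier itself is not specified at this point in the text.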

[0044] The binocular vision gesture recognition method based on distance...

Embodiment 2

[0061] Referring to Figure 3, this embodiment provides a binocular vision gesture recognition device based on ranging assistance, comprising a binocular vision unit 201, a ranging unit 202, and a control device 203;

[0062] The binocular vision unit 201 is used to collect images and send them to the control device 203;

[0063] The ranging unit 202 is used to obtain ranging data while the binocular vision unit 201 collects images;

[0064] The control device 203 is used to calculate the depth distance from the binocular vision unit to the hand according to the ranging data, process the images collected by the binocular vision unit according to the principle of stereo imaging to generate a depth image, extract images with depth values within a preset range from the depth image according to the depth distance for gesture segmentation to obtain a gesture segmentation map, perform gesture recognition on the gesture segmentation map, and output a gesture recognition result.
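The three-unit device of this embodiment amounts to a simple control loop: unit 201 supplies stereo frames, unit 202 supplies a distance reading, and unit 203 combines them. The sketch below is an illustrative outline of those roles only; every class, method, and value is a hypothetical stand-in, not the patent's implementation (stereo matching and gesture classification are stubbed):

```python
class BinocularVisionUnit:           # unit 201: collects stereo image pairs
    def capture(self):
        return ("left_frame", "right_frame")

class RangingUnit:                   # unit 202: measures distance to the hand
    def measure(self):
        return 0.8                   # metres (illustrative fixed reading)

class ControlDevice:                 # unit 203: depth image, segmentation, recognition
    def __init__(self, vision, ranging):
        self.vision, self.ranging = vision, ranging

    def recognize(self):
        frames = self.vision.capture()                 # images and ranging data
        hand_distance = self.ranging.measure()         # are acquired together
        depth_image = self.build_depth_image(frames)   # stereo imaging (stubbed)
        seg_map = self.segment(depth_image, hand_distance)
        return self.classify(seg_map)

    def build_depth_image(self, frames):
        return [[0.79, 2.5]]         # stub: stereo matching would go here

    def segment(self, depth_image, hand_distance, band=0.15):
        # keep only depths within the preset band around the measured distance
        return [[z if abs(z - hand_distance) <= band else 0.0
                 for z in row] for row in depth_image]

    def classify(self, seg_map):
        # stub classifier: any surviving pixel counts as a detected gesture
        return "gesture" if any(z for row in seg_map for z in row) else "none"

device = ControlDevice(BinocularVisionUnit(), RangingUnit())
print(device.recognize())
```

This mirrors the data flow described in paragraphs [0062] through [0064]: both sensor units feed the control device, which performs all computation.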



Abstract

The invention provides a binocular vision gesture recognition method based on range finding assistance. The method comprises the steps of: acquiring an image with a binocular vision unit while obtaining range finding data with a range finding unit; calculating the depth distance between the binocular vision unit and a hand from the range finding data; processing the image acquired by the binocular vision unit according to the stereoscopic imaging principle to generate a depth image; extracting, according to the depth distance, an image whose depth value lies within a preset range from the depth image to obtain a gesture segmentation map; and performing gesture recognition on the gesture segmentation map and outputting a gesture recognition result. Because the range finding unit assists the measurement and the hand region is segmented according to the obtained range finding data, the method improves the accuracy of gesture segmentation and thereby the accuracy of gesture recognition.

Description

Technical Field

[0001] The invention relates to the technical field of computer vision, in particular to a binocular vision gesture recognition method and device based on ranging assistance.

Background Technique

[0002] Gesture recognition has a wide range of applications in human-computer interaction, game entertainment, and in-vehicle systems. Gesture recognition devices perceive gestures through vision, infrared, radar, and the like, with visual perception being the most common. Traditional visual gesture recognition is mainly based on a binocular camera: a depth image is collected by the binocular camera, and gesture feature extraction and gesture classification are then performed.

[0003] However, existing gesture recognition based on binocular cameras has two deficiencies: 1) some binocular camera devices, such as Microsoft's 3D somatosensory device Kinect, can obtain image depth information for gesture ...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62, G06T7/11, G06T7/521
Inventor 綦科
Owner GUANGZHOU UNIVERSITY