Static gesture recognition method based on machine vision attention mechanism

A machine vision and gesture recognition technology, applied in the field of automobile cockpit interaction. It addresses problems such as missed recognition, the limited computing power of end-side devices, and increased hardware cost, achieving a balance between accuracy and performance, improved classification accuracy, and enhanced feature expression ability.

Pending Publication Date: 2022-06-28
CHONGQING CHANGAN AUTOMOBILE CO LTD

AI Technical Summary

Problems solved by technology

However, because end-side devices are far less powerful than servers and their computing power is very limited, the accuracy of gesture recognition on them is still relatively low.
[0004] Chinese patent 201711102008.9 discloses a dynamic gesture recognition method based on computer vision. It proposes an end-to-end approach: after gesture images are collected, the manually annotated ground-truth boxes are clustered, and an improved GoogLeNet is used as the network framework to build and train an end-to-end convolutional neural network that simultaneously predicts the location, size, and category of the target gesture. Placing hand detection and classification in a single network reduces computational overhead and meets the computing-power constraints of end-side devices. However, real application scenarios are often complex, the gestures themselves are complex, and environmental interference is strong (e.g., lighting changes, interference from other people, occlusion), so the recognition rate remains low.
This method also relies on an additional gesture-sensing sensor, which increases cost; consumers are reluctant to accept it, making commercialization difficult.
In addition, classification and recognition are performed simply by computing the distance to a standard template, which cannot cope with the complex in-car scenes encountered in actual use, leading to misrecognition and missed recognition and degrading the user experience.



Embodiment Construction

[0040] The present invention will be further described in detail below with reference to the accompanying drawings and specific embodiments.

[0041] A static gesture recognition method based on a machine vision attention mechanism, whose flow chart is shown in Figure 1, includes the following steps:

[0042] (1) Data collection: an RGB camera is used to collect static gesture images of multiple categories;

[0043] (2) Gesture area detection: detect the gesture area in each static gesture image, crop it to obtain a gesture picture, save the picture, and divide the gesture pictures of all categories into a training set, a validation set, and a test set;
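The split in step (2) can be sketched as a simple shuffled partition of the cropped gesture pictures. The 8:1:1 ratio and file names below are assumptions for illustration; the patent does not state the split proportions.

```python
import random

def split_dataset(paths, train=0.8, val=0.1, seed=0):
    """Shuffle cropped gesture pictures and divide them into
    training, validation, and test sets (ratio is an assumption)."""
    paths = list(paths)
    random.Random(seed).shuffle(paths)  # fixed seed for a reproducible split
    n_train = int(len(paths) * train)
    n_val = int(len(paths) * val)
    return (paths[:n_train],
            paths[n_train:n_train + n_val],
            paths[n_train + n_val:])

# Hypothetical file names standing in for the saved gesture crops.
train_set, val_set, test_set = split_dataset([f"g{i}.png" for i in range(100)])
```

Splitting before any format conversion ensures the validation and test sets never leak into training.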

[0044] (3) Data format conversion: convert the gesture pictures obtained in step (2) from RGB format to YUV format, so that the gesture pictures in the training, validation, and test sets are all in YUV format;

[0045] (4) Build a convolutional neural classification network: use the MobileNetV2 network as the network framework, add an attention mechanism module, and construct the convolutional neural classification network with a category classification loss function;
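The patent says an attention mechanism module is added to the MobileNetV2 backbone but does not name the module. One common channel-attention choice, squeeze-and-excitation (SE), is sketched below in NumPy to show the mechanics; in a real network this would be a trainable layer (e.g., in PyTorch) inserted into the MobileNetV2 blocks.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_attention(feat, w1, w2):
    """Squeeze-and-Excitation channel attention over a C x H x W feature map.
    (An assumed attention variant; the patent does not specify which is used.)"""
    squeeze = feat.mean(axis=(1, 2))                       # global average pool -> (C,)
    excite = sigmoid(w2 @ np.maximum(w1 @ squeeze, 0.0))   # FC -> ReLU -> FC -> sigmoid
    return feat * excite[:, None, None]                    # rescale each channel

rng = np.random.default_rng(0)
C, r = 8, 2                                  # channels and reduction ratio (assumed)
feat = rng.standard_normal((C, 4, 4))
w1 = rng.standard_normal((C // r, C))        # bottleneck weights (random stand-ins)
w2 = rng.standard_normal((C, C // r))
out = se_attention(feat, w1, w2)
```

Because the sigmoid gate lies in (0, 1), each output channel is a damped copy of its input, letting the network emphasize hand-relevant channels and suppress background ones.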



Abstract

The invention discloses a static gesture recognition method based on a machine vision attention mechanism, comprising the following steps: acquiring static gesture images of multiple categories with an RGB (Red, Green, Blue) camera; obtaining gesture pictures, storing them, and dividing them into a training set, a validation set, and a test set; converting the gesture pictures from RGB format into YUV format; taking a MobileNetV2 network as the network framework, adding an attention mechanism module, and constructing a convolutional neural classification network with a category classification loss function; inputting the training set into the convolutional neural classification network for multiple rounds of training, and, after training is completed, loading the trained model into the network, collecting static gesture images in real time directly with the RGB camera, and inputting them into the trained model for recognition. The category with the highest probability score is the recognized gesture category. The method adds no extra hardware, greatly saves computational cost, improves the gesture recognition rate, reduces the misrecognition rate, and improves the user's experience of the cockpit gesture control function.
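The final recognition step described in the abstract, taking the category with the highest probability score, is a softmax over the network's output scores followed by an argmax. A minimal sketch, with hypothetical gesture category names:

```python
import numpy as np

def predict_category(logits, labels):
    """Softmax the classifier's output scores and return the label with
    the highest probability -- the recognized gesture category."""
    z = logits - logits.max()              # subtract max for numerical stability
    probs = np.exp(z) / np.exp(z).sum()
    return labels[int(np.argmax(probs))], float(probs.max())

# Hypothetical category names and raw network scores for illustration.
labels = ["fist", "palm", "ok", "thumbs_up"]
category, score = predict_category(np.array([0.2, 2.5, 1.1, -0.3]), labels)
```

In deployment the probability can also be thresholded so that low-confidence frames are rejected rather than misrecognized.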

Description

technical field

[0001] The invention belongs to the technical field of automobile cockpit interaction, and in particular relates to a static gesture recognition method based on a machine vision attention mechanism.

Background technique

[0002] In recent years, with the development of related sciences such as computer vision and machine learning, human-computer interaction technology has become increasingly mature and has been applied in various fields, especially automotive intelligent cockpit interaction. Traditional cockpit interaction was based on physical buttons, then gradually developed into on-vehicle touch-screen control, and later into voice and gesture interaction control. Among these, gesture recognition is natural and conforms to user control logic; it can offer users a seamless interactive experience and make up for the inconvenience and awkwardness of voice control, making the interaction metho...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06V40/10; G06V10/14; G06V10/26; G06V10/764; G06V10/774; G06V10/776; G06V10/82; G06N3/04; G06N3/08; G06K9/62
CPC: G06N3/08; G06N3/047; G06N3/045; G06F18/217; G06F18/2415; G06F18/214
Inventor 袁聪
Owner CHONGQING CHANGAN AUTOMOBILE CO LTD