A robot manipulator control method based on multi-leapmotion virtual gesture fusion

A technology relating to manipulators and gesture control, applied to manipulators, program-controlled manipulators, manufacturing tools, etc. It addresses problems such as the complexity of the robot control process and achieves the effects of low cost, strong fault tolerance, and high accuracy.

Active Publication Date: 2019-08-30
CENT SOUTH UNIV

AI Technical Summary

Problems solved by technology

[0004] The present invention provides a robot manipulator control method based on the fusion of multiple Leapmotion virtual gestures; its purpose is to overcome the complexity, in the prior art, of controlling a robot with wearable devices.




Embodiment Construction

[0046] The present invention will be further described below in conjunction with the drawings and embodiments.

[0047] A robot manipulator control method based on the fusion of multiple Leapmotion virtual gestures includes the following steps:

[0048] Step 1: Set up the gesture collection device;

[0049] Mount at least two Leapmotion sensors at the centers of the inner upper and lower surfaces of the gesture collection area;
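
As a minimal sketch of how this two-sensor arrangement could feed a common representation, the readings of the lower and upper sensors can be expressed in one shared workspace frame before fusion. The mounting distance, rotations, and example readings below are illustrative assumptions, not values given in the patent.

```python
# Hedged sketch (assumed geometry): map palm positions reported by the
# lower and upper Leapmotion sensors into one shared workspace frame so
# that their measurements can later be fused.
import numpy as np

# Lower sensor: centre of the bottom inner surface, facing up; its frame
# is taken as the workspace frame.
T_lower = np.eye(4)

# Upper sensor: centre of the top inner surface, facing down, i.e. rotated
# 180 degrees about the x-axis and lifted by the (assumed) 0.4 m gap
# between the two surfaces.
T_upper = np.array([
    [1.0,  0.0,  0.0, 0.0],
    [0.0, -1.0,  0.0, 0.0],
    [0.0,  0.0, -1.0, 0.4],
    [0.0,  0.0,  0.0, 1.0],
])

def to_workspace(p_sensor, T_sensor):
    """Transform a 3D point from a sensor frame into the workspace frame."""
    p = np.append(np.asarray(p_sensor, dtype=float), 1.0)
    return (T_sensor @ p)[:3]

# The same palm centre as each sensor would report it (example values):
print(to_workspace([0.01, 0.20, 0.02], T_lower))   # -> [0.01 0.2  0.02]
print(to_workspace([0.01, -0.20, 0.38], T_upper))  # -> [0.01 0.2  0.02]
```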

[0050] Step 2: Based on the gesture collection device, collect Leapmotion sequence images of the gestures that control the manipulator, and recognize the gestures with a gesture recognition model based on the kernel extreme learning machine;

[0051] The gesture recognition model based on the kernel extreme learning machine takes the Leapmotion sequence images of each gesture collected by the gesture collection device in turn as input data, with the category number of the corresponding gesture as output data, to perform machine learning training on the kernel...
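
A compact sketch of such a kernel extreme learning machine (KELM) classifier is given below, assuming the Leapmotion sequence images have already been reduced to fixed-length feature vectors; the feature extraction step, class labels, and hyperparameters are illustrative assumptions rather than the patent's own choices.

```python
# Hedged sketch of a kernel extreme learning machine (KELM) classifier:
# gesture feature vectors in, gesture category numbers out.
import numpy as np

class KELM:
    def __init__(self, C=100.0, gamma=0.1):
        self.C = C          # regularisation coefficient
        self.gamma = gamma  # RBF kernel width

    def _kernel(self, A, B):
        # RBF kernel matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2)
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-self.gamma * d2)

    def fit(self, X, y):
        self.X = np.asarray(X, float)
        self.classes = np.unique(y)
        T = (y[:, None] == self.classes[None, :]).astype(float)  # one-hot targets
        omega = self._kernel(self.X, self.X)
        n = len(self.X)
        # Output weights: beta = (I/C + Omega)^-1 T
        self.beta = np.linalg.solve(np.eye(n) / self.C + omega, T)
        return self

    def predict(self, X):
        scores = self._kernel(np.asarray(X, float), self.X) @ self.beta
        return self.classes[np.argmax(scores, axis=1)]

# Usage with made-up feature vectors and gesture category numbers:
rng = np.random.default_rng(0)
X_train = rng.normal(size=(60, 30))    # 60 gesture samples, 30-D features
y_train = rng.integers(0, 5, size=60)  # 5 gesture categories
model = KELM(C=100.0, gamma=0.05).fit(X_train, y_train)
print(model.predict(X_train[:3]))
```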


Abstract

The invention discloses a robot manipulator control method based on multi-Leapmotion virtual gesture fusion. The method comprises the following steps: 1, a gesture collection device is set up; 2, Leapmotion sequence images of the gestures that control the manipulator are collected with the gesture collection device, and the gestures are recognized by a gesture recognition model based on a kernel extreme learning machine; 3, the target destination of the manipulator operation is obtained through the preset scale factor between the control gestures and the manipulator's actual operating gestures; 4, a manipulator motion scheme is obtained; and 5, the optimal manipulator control scheme is selected. In this scheme, gesture sequence images are collected by multiple Leapmotion sensors and fused with a weighted fusion algorithm, which provides very high fault tolerance; and compared with existing manipulator teach pendants and somatosensory equipment, the multi-Leapmotion gesture recognition device is low in cost, high in controllability, and high in accuracy.
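
As a rough illustration of the two ideas the abstract names in steps 2 and 3, weighted fusion of the sensors' readings and a preset scale factor mapping hand motion to the manipulator's target, here is a minimal numeric sketch; the weights, scale factor, and poses are assumptions for illustration only.

```python
# Hedged sketch: weighted fusion of per-sensor palm estimates, then a
# preset scale factor mapping hand displacement to the manipulator target.
import numpy as np

def weighted_fusion(estimates, weights):
    """Fuse per-sensor estimates of the same quantity with normalised weights."""
    estimates = np.asarray(estimates, float)
    w = np.asarray(weights, float)
    w = w / w.sum()
    return (w[:, None] * estimates).sum(axis=0)

# Palm-centre position (in a common workspace frame) seen by each sensor;
# the weights could reflect per-sensor confidence or tracking quality.
palm_estimates = [[0.010, 0.200, 0.021],   # lower sensor
                  [0.012, 0.198, 0.019]]   # upper sensor
palm = weighted_fusion(palm_estimates, weights=[0.6, 0.4])

# Map the hand displacement to the manipulator workspace with a preset
# scale factor (hand motion in metres -> end-effector motion in metres).
scale_factor = 3.0
palm_start = np.array([0.0, 0.2, 0.0])
ee_start = np.array([0.50, 0.00, 0.30])   # assumed current end-effector position
ee_target = ee_start + scale_factor * (palm - palm_start)
print(ee_target)
```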

Description

Technical field

[0001] The invention belongs to the field of mechanical control, and particularly relates to a robot manipulator control method based on multi-Leapmotion virtual gesture fusion.

Background technique

[0002] With the development of artificial intelligence, human-computer interaction has attracted more and more attention. How to realize human-computer interaction in a simple, quick, and low-cost way is one of the focuses of artificial intelligence research.

[0003] In recent years, many solutions that use wearable devices to realize human-computer interaction have emerged. Wearable devices use sensors to collect human body data and transmit it to a controller in order to control the robot. Take hand wearables as an example: a glove-like device worn by the operator carries a variety of sensors, such as angle sensors, displacement sensors, and force sensors. The sensors collect data from the hand and transmit the data to the controll...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J 9/16, B25J 3/00
Inventors: 刘辉 (Liu Hui), 段超 (Duan Chao), 李燕飞 (Li Yanfei), 黄家豪 (Huang Jiahao)
Owner: CENT SOUTH UNIV