
Robot under-actuated hand autonomous grasping method based on stereoscopic vision

A stereo-vision and under-actuated-hand technology, applied to manipulators, program-controlled manipulators, instruments and the like, which addresses the problem that grasping points cannot be obtained for complex objects.

Active Publication Date: 2016-05-25
HARBIN INST OF TECH
Cites: 4 | Cited by: 47

AI Technical Summary

Problems solved by technology

[0006] The invention aims to solve two problems of current robot grasping methods: they need to obtain a three-dimensional model of the object in advance in order to solve for the grasping point, and they can only recognize simple objects and cannot obtain corresponding grasping points for complex objects.



Examples


Specific Embodiment 1

[0051] The stereo-vision-based autonomous grasping method for a robot under-actuated hand comprises the following steps:

[0052] Step 1. For the object to be grasped and its environment, obtain the RGB-D point cloud of the object and the environment through the Kinect sensor, and filter the point cloud;

[0053] The Kinect sensor is a 3D vision sensor launched by Microsoft in November 2010. It includes a color camera and a depth camera, can directly obtain the color image and depth image of the scene, and from these generates the scene point cloud. However, the point cloud generated by the Kinect contains the point clouds of all objects in the scene; the number of points is huge and the features are complex, so processing it consumes a large amount of machine time and complicates subsequent processing. It is therefore necessary to preprocess the obtained point cloud, extract the point cloud of the object from it, and perform f...
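As a concrete illustration of this preprocessing step, the sketch below (not the patent's own code) loads a scene cloud that is assumed to have already been captured from the Kinect and saved as a PCD file, then crops it to an assumed workspace depth range with PCL's PassThrough filter so that later steps operate only on the object region; the file name and depth limits are illustrative.

```cpp
// Minimal preprocessing sketch (not the patent's exact pipeline): load a scene
// cloud saved from the Kinect and keep only an assumed workspace depth range
// so that later steps only see the object region.
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/filters/passthrough.h>

int main()
{
  pcl::PointCloud<pcl::PointXYZRGB>::Ptr scene(new pcl::PointCloud<pcl::PointXYZRGB>);
  pcl::PointCloud<pcl::PointXYZRGB>::Ptr object(new pcl::PointCloud<pcl::PointXYZRGB>);

  // Assume the Kinect RGB-D frame was already converted to a point cloud and saved.
  if (pcl::io::loadPCDFile<pcl::PointXYZRGB>("scene.pcd", *scene) < 0)
    return -1;

  // Keep only points within an assumed working depth range in front of the camera.
  pcl::PassThrough<pcl::PointXYZRGB> pass;
  pass.setInputCloud(scene);
  pass.setFilterFieldName("z");
  pass.setFilterLimits(0.4, 1.2);   // metres; illustrative workspace limits
  pass.filter(*object);

  pcl::io::savePCDFileBinary("object.pcd", *object);
  return 0;
}
```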

Specific Embodiment 2

[0098] The specific steps of the process of filtering the point cloud described in step 1 of the present embodiment are as follows:

[0099] Step 1.1, use the radius outlier removal filter (RadiusOutlierRemoval filter) to remove outliers;

[0100] A small number of outliers caused by noise can be removed with the radius outlier removal filter provided by the PCL library. The filtering process is as follows: assume point A is the point to be filtered; first, use the Kd-tree search algorithm to count the total number of points inside the ball centered at A with radius r; when that number is less than the threshold n, A is considered an outlier;
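A minimal sketch of step 1.1 using the PCL filter named above; the radius r and the neighbour threshold n below are illustrative values, not the ones used in the patent.

```cpp
// Sketch of step 1.1: radius outlier removal with PCL. The radius r and the
// minimum neighbour count n are illustrative values only.
#include <pcl/point_types.h>
#include <pcl/filters/radius_outlier_removal.h>

pcl::PointCloud<pcl::PointXYZRGB>::Ptr
removeOutliers(const pcl::PointCloud<pcl::PointXYZRGB>::ConstPtr& cloud)
{
  pcl::PointCloud<pcl::PointXYZRGB>::Ptr filtered(new pcl::PointCloud<pcl::PointXYZRGB>);

  pcl::RadiusOutlierRemoval<pcl::PointXYZRGB> ror;
  ror.setInputCloud(cloud);
  ror.setRadiusSearch(0.01);        // r: ball radius around each point (metres)
  ror.setMinNeighborsInRadius(5);   // n: points with fewer neighbours are dropped
  ror.filter(*filtered);
  return filtered;
}
```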

[0101] Step 1.2, use an average filter to make the surface of the object smoother.

[0102] The influence of white noise can be reduced by using the average-value filter. The filtering process is as follows: assume point A is the point to be filtered; first ...
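Because the description of the average filter is truncated here, the sketch below rests on an assumption: each point is replaced by the mean position of its k nearest neighbours, found with PCL's Kd-tree, which is one common way to realise such a mean filter; k is illustrative.

```cpp
// Sketch of step 1.2 under an assumption (the patent text is truncated):
// replace each point by the mean of its k nearest neighbours to smooth
// white noise on the object surface.
#include <vector>
#include <pcl/point_types.h>
#include <pcl/kdtree/kdtree_flann.h>

void meanSmooth(pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud, int k = 8)
{
  pcl::KdTreeFLANN<pcl::PointXYZRGB> tree;
  tree.setInputCloud(cloud);

  pcl::PointCloud<pcl::PointXYZRGB> smoothed = *cloud;
  std::vector<int> idx(k);
  std::vector<float> sqr_dist(k);

  for (std::size_t i = 0; i < cloud->size(); ++i)
  {
    if (tree.nearestKSearch(cloud->points[i], k, idx, sqr_dist) <= 0)
      continue;
    float x = 0.f, y = 0.f, z = 0.f;
    for (int j : idx) {
      x += cloud->points[j].x;
      y += cloud->points[j].y;
      z += cloud->points[j].z;
    }
    smoothed.points[i].x = x / idx.size();
    smoothed.points[i].y = y / idx.size();
    smoothed.points[i].z = z / idx.size();
  }
  *cloud = smoothed;
}
```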

Specific Embodiment 3

[0104] The specific steps of establishing the grasping planning scheme based on the Gaussian process classification described in step 3 of this embodiment are as follows:

[0105] After obtaining the above features, the grasping scheme can be obtained through supervised machine learning. The reasons for using a Gaussian process classifier to obtain the grasping scheme are: 1) the errors between the actual features and the ideal features are generated by noise, so they obey a Gaussian distribution and can therefore be learned by a Gaussian process; 2) compared with support vector machines and neural networks, a Gaussian process classifier is simpler to construct, since only its kernel function and mean function need to be determined, and it uses fewer parameters, which makes parameter optimization easier and convergence faster; 3) a Gaussian process classifier can not only g...
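For orientation only, the sketch below shows a greatly simplified Gaussian-process scoring of a candidate grasp: an RBF kernel over the grasp feature vectors, binary labels treated as regression targets, and the predictive mean squashed to a pseudo-probability. This is a common shortcut, not the patent's classifier (whose kernel, mean function, and training procedure are not reproduced here); all hyperparameters are illustrative.

```cpp
// Greatly simplified Gaussian-process grasp scoring (not the patent's
// implementation): RBF kernel over grasp feature vectors, labels treated as
// regression targets, latent prediction squashed to a graspability score.
#include <Eigen/Dense>
#include <cmath>
#include <vector>

using Eigen::MatrixXd;
using Eigen::VectorXd;

static double rbf(const VectorXd& a, const VectorXd& b, double len)
{
  return std::exp(-(a - b).squaredNorm() / (2.0 * len * len));
}

// X: one grasp feature vector per row; y: +1 graspable / -1 not graspable.
double graspProbability(const std::vector<VectorXd>& X, const VectorXd& y,
                        const VectorXd& query, double len = 1.0, double noise = 1e-2)
{
  const int n = static_cast<int>(X.size());
  MatrixXd K(n, n);
  for (int i = 0; i < n; ++i)
    for (int j = 0; j < n; ++j)
      K(i, j) = rbf(X[i], X[j], len);
  K.diagonal().array() += noise;           // noise variance / jitter

  VectorXd alpha = K.llt().solve(y);       // (K + sigma^2 I)^-1 y

  VectorXd kstar(n);
  for (int i = 0; i < n; ++i)
    kstar(i) = rbf(query, X[i], len);

  double latent = kstar.dot(alpha);        // predictive mean of the latent function
  return 1.0 / (1.0 + std::exp(-latent));  // squash to a graspability score
}
```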



Abstract

The invention discloses a stereoscopic-vision-based autonomous grasping method for a robot under-actuated hand, and relates to robot autonomous grasping. It solves two problems of existing robot grasping methods: a grasping point cannot be calculated unless a three-dimensional model of the object is obtained in advance, and only simple objects can be recognized, so no corresponding grasping point can be obtained for a complex object. The method comprises the steps of obtaining the RGB-D point cloud of the object to be grasped and its environment through a Kinect sensor and filtering the point cloud; extracting from the RGB-D point cloud the normal-vector included-angle feature, coplanarity feature, distance feature, grasping-stability feature, collision-detection feature and the corresponding constraint equations; establishing a grasping planning scheme based on Gaussian process classification; driving the under-actuated hand to grasp according to the grasping scheme, then judging from current detection whether the under-actuated hand has grasped the object, until the object is grasped, and releasing the object after the grasping task is completed. The method is suitable for the field of robot grasping.

Description

Technical field
[0001] The invention relates to a robot autonomous grasping method.
Background technique
[0002] Autonomous grasping has long been considered a fundamental component of robotic intelligence. Current robots basically adopt a master-slave mode of operation, in which the operator controls the robot through a joystick to complete the grasping task. This mode of operation requires professional training for operators and is time-consuming and laborious, so it is particularly important to study an autonomous grasping method. [0003] Plinio Moreno et al. proposed a method that uses support vector machines to learn the graspable points of objects: an SVM is trained on local features of graspable parts in the point cloud, and after training their algorithm can distinguish the graspable and non-graspable areas of an object. To validate the training results, they performed grasping simulations in the OCRA simulator. In order ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J 9/16, G06K 9/00, G06K 9/62, G06T 7/60
CPC: B25J 9/1602, G06V 20/10, G06F 18/241
Inventors: 杜志江, 王伟东
Owner: HARBIN INST OF TECH