
Cascaded convolutional neural network-based quick detection method of irregular-object grasping pose of robot

A convolutional-neural-network-based detection method, applied in the field of robot vision detection and grasping control. It addresses the problem that deep learning has not yet been well applied to robot grasping pose detection, overcomes influences that degrade grasping pose detection, improves real-time performance, and increases grasping detection accuracy.

Inactive Publication Date: 2018-09-07
SOUTHEAST UNIV

Problems solved by technology

[0008] In short, deep learning technology has been initially applied in the field of robotics, but it has not yet been well used to solve the problem of robot grasping pose detection. In particular, reducing the time consumption of grasp localization and improving the accuracy of grasp pose estimation are pressing issues for online robot grasping.

Examples

Embodiment

[0041] The robot and camera configuration adopted in the embodiment is shown in Figure 1: a low-cost top-down color camera (resolution ) and a UR5 robot with completed hand-eye calibration. The computer used for model training and the experiments has an Intel(R) Core(TM) i7 3.40 GHz CPU, an NVIDIA GeForce GTX 1080 Ti graphics card and 16 GB of memory, and runs Ubuntu 16.04.

[0042] Figure 2 gives the definitions of and relations between the coordinate systems. To facilitate the correspondence between the grasping detection results and the robot's grasping pose, the grasp pose detected in the image is represented by a simplified "dot-line" method. The center point of the grasp position is recorded in the image coordinate system as , corresponding to the midpoint of the line connecting the two fingers of the robot end effector; the grasping center line in the image corresponds to the line co...
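As an illustration of how such a dot-line result (an image center point plus a grasp angle) could be mapped to a robot pose, the following Python sketch is not taken from the patent: the camera intrinsics K, the hand-eye transform T_base_cam, the table depth and all numeric values are assumed placeholders. It back-projects the image grasp center onto the table plane and treats the grasp angle as the gripper yaw for a top-down grasp.

import numpy as np

# Minimal sketch, assuming a calibrated top-down camera, a table plane at a known
# distance from the camera, and a 4x4 hand-eye transform T_base_cam (camera frame
# expressed in the robot base frame). All names and values are illustrative.
def image_grasp_to_robot_pose(u, v, theta_deg, K, T_base_cam, table_depth):
    # Back-project the pixel (u, v) onto the table plane in the camera frame.
    p_cam = table_depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))
    # Express the 3D grasp center in the robot base frame.
    p_base = (T_base_cam @ np.append(p_cam, 1.0))[:3]
    # For a top-down grasp, the image angle maps (after accounting for the
    # camera's in-plane rotation) to the gripper's rotation about the vertical axis.
    yaw_rad = np.deg2rad(theta_deg)
    return p_base, yaw_rad

# Example usage with made-up calibration values.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
T_base_cam = np.eye(4)   # placeholder for the hand-eye calibration result
grasp_center, yaw = image_grasp_to_robot_pose(400, 260, 35.0, K, T_base_cam, table_depth=0.8)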

Abstract

The invention relates to a cascaded convolutional neural network-based quick detection method of an irregular-object grasping pose of a robot. Firstly, a cascaded two-stage convolutional neural network model in a coarse-to-fine position-attitude form is constructed: in the first stage, a region-based fully convolutional network (R-FCN) is adopted to realize grasp localization and rough estimation of the grasping angle; in the second stage, accurate calculation of the grasping angle is realized by constructing a new Angle-Net model. Then, current scene images containing the objects to be grasped are collected as original on-site image samples for training, and the two-stage convolutional neural network model is trained by means of a transfer learning mechanism. In online running, each collected monocular color image is input to the cascaded two-stage convolutional neural network model, and finally the end effector of the robot is driven by the obtained grasping position and attitude to perform object grasping control. The method achieves high grasping detection accuracy, effectively increases the detection speed of the irregular-object grasping pose of the robot, and improves the real-time performance of the grasping pose detection algorithm.
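A minimal sketch of the two-stage coarse-to-fine cascade described in the abstract, written in Python under stated assumptions: rfcn_model and angle_net are hypothetical callables standing in for the trained R-FCN and Angle-Net, and the 10-degree width of the coarse angle classes is an assumed value not given in the text quoted here.

COARSE_BIN_DEG = 10.0  # assumed width of the coarse angle classes (illustrative)

def detect_grasp(image, rfcn_model, angle_net):
    # Stage 1: region-based detection gives a grasp box, a coarse angle class
    # and a confidence score; rfcn_model is a placeholder for the trained R-FCN.
    box, coarse_class, score = rfcn_model(image)        # box = (x1, y1, x2, y2)
    u = 0.5 * (box[0] + box[2])                         # grasp center in the image
    v = 0.5 * (box[1] + box[3])
    coarse_deg = coarse_class * COARSE_BIN_DEG

    # Stage 2: crop the detected region and let the angle network refine the
    # estimate within the coarse bin; angle_net is a placeholder for Angle-Net.
    x1, y1, x2, y2 = (int(round(c)) for c in box)
    patch = image[y1:y2, x1:x2]
    fine_offset_deg = angle_net(patch)

    theta_deg = coarse_deg + fine_offset_deg
    return u, v, theta_deg, score

The split keeps the first stage fast (localization plus a small classification head), while the second stage only has to process a small cropped patch to refine the angle; the returned image-space result can then be mapped to the robot end effector pose via the hand-eye calibration described in the embodiment.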

Description

Technical field

[0001] The invention relates to a method for robot autonomous grasping pose detection, specifically a method for rapidly detecting the grasping pose of irregular objects by a robot based on a cascaded convolutional neural network, and belongs to the technical field of robot vision detection and grasping control.

Background technique

[0002] In robot sorting, handling and other grasping tasks, Planar Grasp, including Top-grasp and Side-grasp, is the most commonly used grasping strategy for robots. For unknown irregular objects with arbitrary poses, in scenes with uneven illumination and complex backgrounds, how to use low-cost monocular cameras to achieve fast and reliable robot autonomous grasp pose detection is a great challenge.

[0003] Robot autonomous grasp pose planning methods can be divided into two categories according to the perception information used: one is grasp pose estimation based on the object mode...

Application Information

IPC(8): G06N3/04; G06T7/70
CPC: G06T7/70; G06T2207/10024; G06T2207/20084; G06T2207/20081; G06T2207/30164; G06N3/045
Inventors: 钱堃, 夏晶, 刘环, 张晓博, 马家乐, 康栓紧
Owner: SOUTHEAST UNIV