
Multi-class out-of-order workpiece robot grabbing pose estimation method based on deep learning

A technology combining deep learning and pose estimation, applied to neural learning methods, instruments, and computation, that addresses the problems of long training cycles, low efficiency, and limited ability to express complex functions.

Active Publication Date: 2019-11-08
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

However, traditional reinforcement learning methods face severe limitations in high-dimensional state and action spaces: with limited samples and computing resources, their ability to express complex functions is restricted, and their performance in practical applications is often unsatisfactory.
At the same time, conventional deep reinforcement learning algorithms require large amounts of training data. During training, the robot must repeatedly attempt grasps through trial and error before it can acquire a stable grasping ability.
This training approach has a long cycle and low efficiency, and the physical trial-and-error process poses safety hazards, so it often cannot meet the needs of industrial production applications.



Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0034] The present invention is further described below with reference to the drawings and embodiments.

[0035] The training process of the deep learning network of the specific embodiment of the present invention is as follows:

[0036] The implemented system consists of three independent deep learning networks: a point cloud classification network, a position generation network, and an attitude generation network. All three networks adopt the same structure, consisting of a connected random sampling layer, a perceptual layer, a pooling layer, and a final multi-layer perceptron. The perceptual layer is composed of multiple multi-layer perceptrons connected in parallel, all sharing the same parameters. The random sampling layer receives the input data and samples it randomly, then feeds each randomly sampled group ...
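The structure described above — a random sampling layer, a perceptual layer of parameter-sharing MLPs applied per point, a pooling layer, and a final MLP — resembles a PointNet-style architecture. A minimal NumPy sketch of one such network follows; all layer sizes, weight initializations, and the single-layer depth are illustrative assumptions, not taken from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_mlp(points, w, b):
    # Apply the SAME weights to every point (parameter sharing
    # across the parallel MLPs of the perceptual layer), with ReLU.
    return np.maximum(points @ w + b, 0.0)

def pointnet_like(cloud, n_sample=64, feat_dim=32, out_dim=10):
    # Random sampling layer: pick a fixed-size subset of input points.
    idx = rng.choice(len(cloud), size=n_sample, replace=len(cloud) < n_sample)
    sampled = cloud[idx]                              # (n_sample, 3)

    # Perceptual layer: shared-parameter MLP applied per point.
    w1 = rng.standard_normal((3, feat_dim)) * 0.1
    b1 = np.zeros(feat_dim)
    feats = shared_mlp(sampled, w1, b1)               # (n_sample, feat_dim)

    # Pooling layer: max-pool over points gives an order-invariant
    # global feature, so point ordering does not matter.
    global_feat = feats.max(axis=0)                   # (feat_dim,)

    # Final multi-layer perceptron on the global feature.
    w2 = rng.standard_normal((feat_dim, out_dim)) * 0.1
    b2 = np.zeros(out_dim)
    return global_feat @ w2 + b2                      # (out_dim,)

cloud = rng.standard_normal((500, 3))                 # toy point cloud
logits = pointnet_like(cloud)
print(logits.shape)                                   # (10,)
```

The max-pooling step is what makes the output invariant to the order of the sampled points, which is why the same backbone can serve classification, position, and attitude heads.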



Abstract

The invention discloses a multi-class out-of-order workpiece robot grabbing pose estimation method based on deep learning. The method uses three independent networks: a point cloud classification network, a position generation network, and an attitude generation network. Point cloud information is first input into the point cloud classification network, which classifies it to obtain its category. The category and the point cloud information are then combined into class-labeled point cloud information, which is input into the position generation network and the attitude generation network respectively; these networks process the class-labeled point cloud to predict position information and attitude information, which are combined to obtain the grasping pose for the robot. The method can estimate grasping poses for multiple types of out-of-order workpieces. It is an end-to-end approach based on deep learning, in which grasp programming for a specific workpiece can be achieved quickly with only a few sets of training data, meeting the requirements of industrial production.
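The pipeline in the abstract — classify the point cloud, route the class-labeled cloud to the position and attitude networks, then combine their outputs into a pose — can be sketched as below. The three network functions are trivial stand-ins for the trained networks, and every function name is an illustrative placeholder, not from the patent:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-ins for the three trained networks (illustrative only).
def classification_net(cloud):
    return int(cloud.mean(axis=0).argmax())      # fake class id

def position_net(cloud, cls):
    return cloud.mean(axis=0)                    # fake (x, y, z) position

def attitude_net(cloud, cls):
    return np.array([0.0, 0.0, 0.0, 1.0])        # fake quaternion attitude

def estimate_grasp_pose(cloud):
    # 1. Point cloud classification network: obtain the category.
    cls = classification_net(cloud)
    # 2. Class-labeled cloud goes to the position and attitude
    #    generation networks respectively.
    position = position_net(cloud, cls)
    attitude = attitude_net(cloud, cls)
    # 3. Combine position and attitude into the grasping pose.
    return {"class": cls, "position": position, "attitude": attitude}

pose = estimate_grasp_pose(rng.standard_normal((200, 3)))
print(sorted(pose.keys()))                       # ['attitude', 'class', 'position']
```

Splitting position and attitude into separate networks lets each head specialize, while the shared class label routes both predictions to the correct workpiece type.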

Description

technical field

[0001] The invention relates to robot grasping pose estimation in the field of artificial intelligence, and in particular to a deep-learning-based method for estimating robot grasping poses for multi-category out-of-order workpieces.

Background technique

[0002] As one of the world's top five industrial robot consumers, China accounted for 36.0% of global installations in 2018, with a total of 138,000 industrial robots installed, a year-on-year increase of 59%; its consumption exceeded that of Europe and the United States combined. Intelligent manufacturing is the main direction of Made in China 2025, and there is huge demand for intelligent industrial robots. Handling and loading/unloading applications account for more than two-thirds of robot use, and the added value brought by intelligent upgrading is significant.

[0003] With the development of artificial intelligence, some scholars have begun to study th...


Application Information

IPC (8): G06T7/70; G06T7/00; G06N3/04; G06N3/08; G06K9/00
CPC: G06T7/70; G06T7/0004; G06N3/08; G06T2207/10028; G06T2207/20081; G06T2207/20084; G06T2207/30164; G06V20/64; G06N3/045
Inventor: 傅建中, 王郑拓, 徐月同, 方泽华
Owner ZHEJIANG UNIV