
Robot stirring and grabbing combination method based on deep reinforcement learning

A technology combining reinforcement learning with robotic manipulation, applied in the areas of instruments, manipulators, program-controlled manipulators, etc., which can solve problems such as insufficient grasping space.

Active Publication Date: 2020-12-18
SOUTHEAST UNIV
Cites: 3 · Cited by: 24

Problems solved by technology

[0005] Purpose of the invention: In view of the shortcomings of existing grasping methods and the lack of sufficient grasping space when objects are closely arranged, the present invention proposes a combined robot stirring-and-grasping method based on deep reinforcement learning, using self-supervised learning to enable the robot to acquire stirring and grasping abilities without a target model or a manually labeled dataset.



Examples


Embodiment Construction

[0063] The present invention will be described in further detail below in conjunction with the accompanying drawings and embodiments.

[0064] As shown in Figure 1, the robot stirring-grasping combination method based on deep reinforcement learning provided by the invention comprises the following steps:

[0065] Step 1. Build a robot stirring-grasping execution platform in a real environment, and build a robot stirring-grasping learning platform in a simulation environment that is consistent with the real execution platform;

[0066] Step 2. Model the robot's stirring-grasping process as a Markov process, and construct the state space, action space and reward function;
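The patent text does not disclose the concrete state, action, and reward definitions. As an illustrative sketch only, such a Markov model for combined stirring and grasping might be encoded as follows; the heightmap resolution, the number of rotations, and the reward values are all assumptions, not the patent's actual design:

```python
class StirGraspMDP:
    """Illustrative MDP for a combined stirring/grasping task (assumed values)."""
    # State: a top-down heightmap of the workspace (assumed 224x224, RGB-D).
    state_shape = (224, 224, 4)
    # Action: (primitive, pixel position, discretized end-effector rotation).
    primitives = ("stir", "grasp")
    num_rotations = 16  # assumed: 16 yaw angles over 360 degrees

    def reward(self, primitive: str, grasp_success: bool, scene_changed: bool) -> float:
        # Assumed shaping: full reward for a successful grasp, partial
        # reward when a stir usefully rearranges the clutter.
        if primitive == "grasp" and grasp_success:
            return 1.0
        if primitive == "stir" and scene_changed:
            return 0.5
        return 0.0
```

A reward of this shape gives the agent an incentive to stir only when stirring changes the scene enough to open grasping space.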

[0067] Step 3. According to the Markov process established in step 2, construct a robot stirring-grasping learning framework based on deep reinforcement learning, and build a deep reinforcement learning network according to the state space, action ...
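The patent does not reproduce its network architecture here. A common design for this kind of framework (assumed, not taken from the patent) is a network that outputs one Q-value per (primitive, rotation, pixel), so that choosing an action reduces to an argmax over the Q maps:

```python
import numpy as np

# Assumed discretization; the patent's actual values are not disclosed here.
NUM_PRIMITIVES = 2   # 0 = stir, 1 = grasp
NUM_ROTATIONS = 16   # end-effector yaw angles
H = W = 224          # heightmap resolution

def select_action(q_maps: np.ndarray):
    """Greedy action: the (primitive, rotation, row, col) with maximal Q."""
    assert q_maps.shape == (NUM_PRIMITIVES, NUM_ROTATIONS, H, W)
    flat_idx = int(np.argmax(q_maps))
    return np.unravel_index(flat_idx, q_maps.shape)

# Usage: a random array stands in for the network's forward pass.
rng = np.random.default_rng(0)
q = rng.random((NUM_PRIMITIVES, NUM_ROTATIONS, H, W))
primitive, rotation, row, col = select_action(q)
```

The pixel index gives the action position in the workspace and the rotation index gives the action direction, matching the decision outputs the abstract describes.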



Abstract

The invention discloses a robot stirring and grabbing combination method based on deep reinforcement learning. The method comprises the following steps: building a robot stirring and grabbing execution platform in a real environment and a corresponding learning platform in a simulation environment; modeling the stirring and grabbing process of the robot as a Markov process, and constructing a state space, an action space and a reward function; constructing a robot stirring and grabbing learning framework based on deep reinforcement learning, together with a deep reinforcement learning network; repeatedly attempting stirring and grabbing actions on the simulation platform, collecting experience sample data, and carrying out self-supervised training of the deep reinforcement learning network according to the Markov process; and finally, migrating the trained network model to the real environment and carrying out further training on the robot there. When facing a random grabbing scene and novel objects, the robot can perceive and analyze the scene and decide whether to execute a stirring or grabbing action, as well as the action position and direction.
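The self-supervised trial-and-error loop the abstract describes can be sketched roughly as below. The toy environment, the epsilon-greedy schedule, and the transition tuple layout are placeholders for illustration, not the patent's disclosed algorithm:

```python
import random
from collections import deque

class ToyEnv:
    """Stand-in simulator: 3 action attempts per episode, reward on the last."""
    def __init__(self):
        self.t = 0
    def reset(self):
        self.t = 0
        return self.t
    def sample_action(self):
        return random.choice(["stir", "grasp"])
    def step(self, action):
        self.t += 1
        done = self.t >= 3
        return self.t, (1.0 if done else 0.0), done

replay = deque(maxlen=10_000)  # experience sample data

def run_episode(env, epsilon=0.5):
    state = env.reset()
    done = False
    while not done:
        # epsilon-greedy: explore randomly; otherwise exploit (here a fixed
        # choice stands in for the learned network's greedy action).
        action = env.sample_action() if random.random() < epsilon else "grasp"
        next_state, reward, done = env.step(action)
        # The outcome of the robot's own attempt is the training label,
        # so no manual annotation is needed (self-supervision).
        replay.append((state, action, reward, next_state, done))
        state = next_state

run_episode(ToyEnv())
```

In the patented pipeline, the transitions collected this way in simulation would train the deep reinforcement learning network before the model is migrated to the real robot.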

Description

Technical field

[0001] The invention relates to a robot stirring-grabbing combination method based on deep reinforcement learning, and belongs to the technical field of robot applications.

Background technique

[0002] Grasping is one of the basic operations of robots, and it is the basis and key for robots to complete various tasks. As China enters a period of rapid development in robot technology, robot applications are gradually moving from traditional industrial fields to home services, warehousing and logistics, and other areas. Grabbing simple types of objects in fixed scenes is far from meeting real grasping needs. Faced with unstructured grasping environments and unknown objects, the robot needs stronger adaptability and the ability to actively change the environment, so as to reduce the complexity of the scene and improve the success rate of grasping.

[0003] Deep learning methods can learn and mine feature representations better than manual design ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/73, G06F30/17, G06N3/04, B25J9/16
CPC: G06T7/73, G06F30/17, B25J9/1612, B25J9/1697, G06N3/045
Inventor: 李俊, 贺笑, 房子韩, 侯言旭
Owner: SOUTHEAST UNIV