
Intelligent robot grabbing method based on action demonstration teaching

A robot-learning method in the field of intelligent robots and robot technology. It addresses problems of traditional approaches, such as heavy reliance on human experience, hand-crafted features whose quality directly affects algorithm performance, and difficulty transferring to other tasks, and achieves strong adaptability and robustness.

Pending Publication Date: 2020-11-06
GUANGZHOU INST OF ADVANCED TECH CHINESE ACAD OF SCI
Cites: 5 · Cited by: 6

AI Technical Summary

Problems solved by technology

However, when traditional machine learning methods are used to grasp unknown objects, manually designing features is time-consuming and laborious, and relies heavily on human experience; the quality of the features directly affects the performance of the algorithm. Such grasping systems have poor generalization ability: they are built only for the current task and are difficult to transfer to other tasks.



Examples


Embodiment 1

[0068] Figure 1 gives a schematic flow chart of the intelligent robot grasping method based on action demonstration teaching of the present invention. As shown in Figure 1, the present invention provides an intelligent robot grasping method based on action demonstration teaching, comprising the following steps:

[0069] Step S1: Complete the hardware environment construction of the action demonstration teaching programming system;

[0070] Step S2: The human demonstrates the grasping operation to form a human teaching action video, and the human uses the teaching pendant to control the robot to complete the demonstration grasping action to form a robot teaching action video;

[0071] Step S3: Denoise and expand the data sets of both the human teaching action video and the robot teaching action video;
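The patent does not spell out Step S3's denoising and expansion operations. A minimal sketch of what such a video preprocessing step might look like is given below; the frame shapes, the temporal median filter, and the flip/brightness augmentations are illustrative assumptions, not the patent's actual pipeline.

```python
import numpy as np

def denoise_frames(frames: np.ndarray) -> np.ndarray:
    """Temporal median filter over a (T, H, W, C) clip to suppress frame noise."""
    padded = np.concatenate([frames[:1], frames, frames[-1:]], axis=0)
    stacked = np.stack([padded[:-2], padded[1:-1], padded[2:]], axis=0)
    return np.median(stacked, axis=0).astype(frames.dtype)

def augment_clip(frames: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Expand the data set with a random horizontal flip and brightness jitter."""
    out = frames.copy()
    if rng.random() < 0.5:
        out = out[:, :, ::-1, :]          # horizontal flip
    gain = rng.uniform(0.8, 1.2)          # brightness jitter
    return np.clip(out * gain, 0, 255).astype(frames.dtype)

rng = np.random.default_rng(0)
clip = rng.integers(0, 255, size=(8, 64, 64, 3)).astype(np.float32)
clean = denoise_frames(clip)
augmented = augment_clip(clean, rng)
print(clean.shape, augmented.shape)  # (8, 64, 64, 3) (8, 64, 64, 3)
```

Each augmented copy of a teaching clip counts as a new training sample, which is how a small set of demonstrations is "expanded".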

[0072] Step S4: Use a meta-learning algorithm to automatically learn prior knowledge directly from the teaching actions of humans and robots, so as to realize learning of new tasks.
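Step S4's meta-learning algorithm is not detailed in this summary. A minimal first-order MAML-style sketch on a toy linear-regression "task" illustrates the two-level structure such algorithms share: an inner loop that adapts to one task from a few samples, and an outer loop that updates the meta-parameters. The linear model and all names here are hypothetical stand-ins for the grasping policy, not the patent's implementation.

```python
import numpy as np

def loss_and_grad(w, X, y):
    """Mean squared error of a linear model and its gradient."""
    err = X @ w - y
    return float(err @ err) / len(y), 2.0 * X.T @ err / len(y)

def maml_step(w, tasks, inner_lr=0.05, outer_lr=0.1):
    """One meta-update: adapt per task (inner loop), then move the
    meta-parameters using a first-order outer-gradient approximation."""
    meta_grad = np.zeros_like(w)
    for X_tr, y_tr, X_val, y_val in tasks:
        _, g = loss_and_grad(w, X_tr, y_tr)
        w_task = w - inner_lr * g                 # inner-loop adaptation
        _, g_val = loss_and_grad(w_task, X_val, y_val)
        meta_grad += g_val                        # first-order outer gradient
    return w - outer_lr * meta_grad / len(tasks)

rng = np.random.default_rng(1)

def make_task():
    """Sample one regression task with a train/validation split."""
    true_w = rng.normal(size=3)
    X = rng.normal(size=(20, 3))
    y = X @ true_w
    return X[:10], y[:10], X[10:], y[10:]

w = np.zeros(3)
for _ in range(50):
    w = maml_step(w, [make_task() for _ in range(4)])
```

After meta-training, adapting to a new task needs only the cheap inner-loop step on a handful of samples, which is the sense in which prior knowledge across demonstrations enables learning of new tasks.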



Abstract

The invention discloses an intelligent robot grabbing method based on action demonstration teaching, and relates to the technical field of robot learning. The method comprises the following steps: the hardware environment of an action demonstration teaching programming system is built; a person demonstrates the grabbing operation to form a human teaching action video, and the person uses a teach pendant to control a robot to complete the demonstrated grabbing action to form a robot teaching action video; the human and robot teaching action videos are gathered and subjected to denoising and data-expansion operations; and a meta-learning algorithm is adopted to automatically learn prior knowledge directly from the teaching actions of the human and the robot, so as to realize learning of new tasks. The meta-learning algorithm provided by the method achieves one-shot imitation learning across different background environments, different human demonstrators, and different robots, and uses temporal convolution to learn an adaptive target loss function, so that the network can capture multiple frames of human action image information at the same time. The method has strong adaptability and robustness.
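The abstract's point that temporal convolution lets the network see several frames of human action at once can be illustrated with a minimal 1-D convolution along the time axis of per-frame features. The feature dimensions and kernel below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def temporal_conv(features: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """1-D convolution along time: each output step mixes k consecutive
    frames, giving the network a multi-frame temporal receptive field."""
    T, D = features.shape
    k = kernel.shape[0]
    out = np.empty((T - k + 1, D))
    for t in range(T - k + 1):
        out[t] = kernel @ features[t:t + k]   # weighted sum of k frames
    return out

frames = np.arange(24, dtype=float).reshape(8, 3)  # 8 frames, 3-dim features
kernel = np.array([0.25, 0.5, 0.25])               # temporal window of 3 frames
print(temporal_conv(frames, kernel).shape)  # (6, 3)
```

Stacking such layers widens the temporal window further, which is what allows a loss function computed on these features to adapt to the motion across a whole demonstration rather than a single frame.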

Description

Technical field

[0001] The invention relates to the technical field of robot learning, in particular to an intelligent robot grasping method based on action demonstration teaching.

Background technique

[0002] In recent years, robotic automation systems have developed rapidly and are widely used in industrial and household environments. They play an important role in addressing social aging and in accelerating industrial upgrading and transformation. Robot grasping technology is an important part of a robot automation system and is involved in almost all applications related to robot operation, such as sorting and handling by industrial robots and desktop cleaning by household robots. However, most current mature grasping systems are designed around a structured operating environment and rely on pre-acquired target model information to plan the grasping process; the grasping target is single and the application scenarios are relatively fixed,...

Claims


Application Information

IPC(8): B25J9/16
CPC: B25J9/163; B25J9/1697; B25J9/1664; Y02P90/02
Inventor: 雷渠江, 徐杰, 李秀昊, 桂光超, 潘艺芃, 王卫军, 韩彰秀
Owner GUANGZHOU INST OF ADVANCED TECH CHINESE ACAD OF SCI