
Deformable object grabbing method and device and computer readable storage medium

A technology relating to deformable objects and grasping, applied in the field of computer-readable storage media, addressing problems such as time-varying tactile signals, the difficulty of evaluating grasping quality for deformable objects, and high latency.

Active Publication Date: 2021-05-18
SHENZHEN GRADUATE SCHOOL TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

[0005] The disadvantage of this visual-tactile fusion method is that the tactile information captured by the tactile sensor is a sequential signal whose content changes over time, while the image information captured by traditional cameras such as binocular or depth cameras is a discrete signal whose content varies with spatial location; this mismatch poses a challenge for visual-tactile information fusion. In addition, traditional cameras capture images as fixed frames, which suffer from high redundancy, high latency, and large data volume; directly fusing this highly redundant visual information with the comparatively compact tactile information leaves the tactile information with very limited influence on the overall result. More challenging still, the deformation of a deformable object often happens in an instant, and the deformation images captured by traditional cameras frequently suffer from motion blur and information loss, which severely affects grasp planning.
[0006] In the prior art, therefore, the traditional visual-tactile fusion method does not combine well with traditional cameras and tactile sensors: using standard cameras and tactile sensors, the grasping quality of deformable objects cannot be effectively evaluated.
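
To make the representation mismatch described above concrete, the snippet below (an illustrative sketch, not part of the patent) contrasts an asynchronous event stream with a tactile time series and bins both modalities into a shared time window before fusion; the class names, camera resolution, and binning scheme are all assumptions.

# Illustrative sketch only (not from the patent): frames are dense and
# spatially indexed, while event-camera output and tactile readings are
# sparse, asynchronous time series. Names, resolution, and binning are
# hypothetical assumptions.
from dataclasses import dataclass
import numpy as np

@dataclass
class Event:              # one asynchronous event-camera event
    t: float              # timestamp in seconds
    x: int                # pixel column
    y: int                # pixel row
    polarity: int         # +1 brightness increase, -1 decrease

@dataclass
class TactileSample:      # one tactile-sensor reading
    t: float              # timestamp in seconds
    pressure: np.ndarray  # per-taxel pressure values

def fuse_window(events, tactile, t0, t1, grid=(260, 346)):
    """Accumulate events and tactile samples that fall in [t0, t1) so that
    both modalities share a common temporal index before fusion."""
    event_img = np.zeros(grid, dtype=np.float32)
    for e in events:
        if t0 <= e.t < t1:
            event_img[e.y, e.x] += e.polarity
    touch = [s.pressure for s in tactile if t0 <= s.t < t1]
    touch_feat = np.mean(touch, axis=0) if touch else None
    return event_img, touch_feat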

Method used



Examples


Embodiment Construction

[0033] In order to make the technical problems to be solved, the technical solutions, and the beneficial effects of the embodiments of the present invention clearer, the present invention is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it.

[0034] It should be noted that when an element is referred to as being "fixed to" or "disposed on" another element, it may be directly on the other element or indirectly on it. When an element is referred to as being "connected to" another element, it may be directly or indirectly connected to the other element. In addition, a connection may serve both a fixing function and a circuit-connection function.

[0035] It is to be understood that the terms "length", "width", "top", "bottom", "front"...



Abstract

The invention provides a deformable object grabbing method and device and a computer-readable storage medium. The deformable object grabbing method comprises the following steps: visual information of a to-be-grabbed deformable object is obtained through an event camera, and the object is located and reconstructed in three dimensions; a set of grabbing points is obtained on the surface of the reconstructed object; visual and tactile information is collected at the grabbing moment of each grabbing point in the set through the event camera and a tactile sensor; the collected visual and tactile information is input into a trained grabbing quality evaluation network, which predicts the result category of the grab, the categories being sliding grab, stable grab, and excessive grab; if the predicted category is stable grab, the deformable object is grabbed; if it is sliding grab or excessive grab, pre-grab attempts continue on the remaining grabbing points in the set until a grabbing point whose result category is stable grab is found.
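
Read as pseudocode, the grasp-selection loop in the abstract might look like the following minimal sketch; the camera, sensor, network, gripper, reconstruction, and sampling interfaces are hypothetical placeholders rather than the patent's actual implementation.

# Hypothetical sketch of the grasp-selection loop described in the abstract.
# All interfaces passed in (camera, sensor, network, gripper, reconstruct,
# sample_points) are illustrative placeholders, not APIs from the patent.
from enum import Enum

class GraspResult(Enum):
    SLIDING = 0    # object slips: grasp point or force inadequate
    STABLE = 1     # object held without slipping or over-deforming
    EXCESSIVE = 2  # object deformed too much: grasp force too high

def select_and_grasp(event_camera, tactile_sensor, quality_net, gripper,
                     reconstruct, sample_points):
    # Locate and 3D-reconstruct the deformable object from event-camera data.
    pose, surface = reconstruct(event_camera.capture())

    # Obtain a set of candidate grasp points on the reconstructed surface.
    for point in sample_points(surface):
        # Pre-grab attempt: collect visual and tactile data at the grasp moment.
        gripper.pre_grasp(pose, point)
        visual = event_camera.capture()
        tactile = tactile_sensor.read()

        # The trained network predicts the result category of this grasp.
        result = GraspResult(quality_net.predict(visual, tactile))
        if result is GraspResult.STABLE:
            gripper.close()        # stable grab found: execute the grasp
            return point
        gripper.release()          # sliding or excessive: try the next point
    return None                    # no grabbing point yielded a stable grab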

Description

Technical field

[0001] The present invention relates to the technical field of grasping deformable objects, and in particular to a method, a device, and a computer-readable storage medium for grasping deformable objects.

Background technique

[0002] Stable grasping control of objects is one of the fundamental and actively studied problems in research on the interaction of robotic dexterous hands. Traditional robots move to a fixed point according to a fixed program to perform simple grasping operations. With the development of artificial intelligence technology, and especially the remarkable progress of machine vision and deep learning, using visual information for grasp planning is the current mainstream direction of robot grasping. However, grasp planning is a complex problem: although visual information can be used to estimate the pose and shape of objects fairly accurately, it is difficult to obtain important information such as mass distribution and surface roughness of objects from vi...

Claims


Application Information

IPC(8): B25J9/16; G06T7/70; G06N3/04; G06N3/08
CPC: B25J9/1697; B25J9/161; G06T7/70; G06N3/049; G06N3/08
Inventor: 梁斌; 石琰; 刘厚德; 王学谦
Owner: SHENZHEN GRADUATE SCHOOL TSINGHUA UNIV