
Object grabbing method and device

A technology for object grasping methods and devices, applied in the field of robotics, which solves the problems of low success rate and poor accuracy in object grasping and achieves the effect of improving both the success rate and the accuracy.

Active Publication Date: 2019-04-09
BEIJING ORION STAR TECH CO LTD


Problems solved by technology

[0005] Accordingly, a first object of the present invention is to provide an object grasping method that addresses the low accuracy of the grasping points and grasping methods obtained in the prior art, which results in a low success rate of object grasping.




Detailed Description of the Embodiments

[0071] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary and are intended to explain the present invention and should not be construed as limiting the present invention.

[0072] The object grasping method and device according to the embodiments of the present invention will be described below with reference to the accompanying drawings.

[0073] Figure 1 is a schematic flowchart of an object grasping method provided by an embodiment of the present invention. As shown in Figure 1, the object grasping method includes the following steps:

[0074] S101. Acquire image data of an object to be grasped; the image data includes three-dimensional coordinate information of each point on the sur...
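Step S101 requires image data that carries a 3D coordinate for each surface point. One common way to obtain such data is to back-project a depth image through a pinhole camera model; the sketch below illustrates this, with the intrinsics (`fx`, `fy`, `cx`, `cy`) being hypothetical values, since the patent does not specify how the image data is produced.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) to per-pixel 3D coordinates
    using a pinhole camera model. Returns an (H, W, 3) array."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx  # horizontal offset scaled by depth
    y = (v - cy) * z / fy  # vertical offset scaled by depth
    return np.stack([x, y, z], axis=-1)

# Example: a flat surface 0.5 m in front of the camera
depth = np.full((4, 4), 0.5)
pts = depth_to_points(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(pts.shape)
```

The pixel at the principal point (`cx`, `cy`) maps to a point on the optical axis (x = y = 0), and every point keeps the measured depth as its z coordinate.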



Abstract

The invention provides an object grabbing method and device. The method comprises: acquiring image data of an object to be grabbed, wherein the image data comprises three-dimensional coordinate information of each point on the surface of the object; generating at least one grabbing mode according to the image data, wherein a grabbing mode comprises three-dimensional coordinate information of at least one grabbing point and a grabbing angle; inputting the image data and the grabbing modes into a grabbing evaluation model to obtain an evaluation score for each grabbing mode; and selecting a grabbing mode whose evaluation score satisfies a rule to grab the object. A pose estimation algorithm with low robustness is thereby avoided: by using the trained grabbing evaluation model and performing the grabbing operation with the grabbing mode whose score satisfies the rule, the accuracy of the grabbing mode is improved and the success rate of object grabbing is increased.
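The pipeline described in the abstract (generate candidate grabbing modes from the surface points, score each candidate with an evaluation model, then select the best-scoring one) can be sketched as follows. The candidate generator and the scoring function here are illustrative stand-ins only; the patent's actual evaluation model is a trained network whose details are not given in this excerpt.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_candidates(points, n=8):
    # Sample candidate grabbing modes: each is a 3D grasp point on the
    # surface plus an approach angle (stand-in for the patent's generator).
    idx = rng.choice(len(points), size=n, replace=False)
    angles = rng.uniform(0.0, np.pi, size=n)
    return [(points[i], float(a)) for i, a in zip(idx, angles)]

def score_grasp(points, point, angle):
    # Placeholder for the trained grabbing evaluation model: here we
    # simply favour grasps near the centroid with a near-vertical approach.
    centroid = points.mean(axis=0)
    return -np.linalg.norm(point - centroid) - abs(angle - np.pi / 2)

def select_grasp(points):
    # Score every candidate and keep the highest-scoring grabbing mode,
    # i.e. the one whose evaluation score "satisfies the rule".
    candidates = generate_candidates(points)
    return max(candidates, key=lambda pa: score_grasp(points, *pa))

cloud = rng.normal(size=(100, 3))  # mock surface point cloud
grasp_point, grasp_angle = select_grasp(cloud)
print(grasp_point.shape, 0.0 <= grasp_angle <= np.pi)
```

The key design point the patent claims is that no segmentation, recognition, or pose estimation step is needed: candidates are generated directly from the raw 3D data and ranked by the learned model.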

Description

Technical Field

[0001] The invention relates to the technical field of robots, and in particular to an object grasping method and device.

Background

[0002] At present, the main process by which a robot grasps an object is: obtain image data of the object; segment the object according to the image data, identify the object, and estimate its pose; and determine a grasping point and grasping method so that the robot can grasp the object accordingly.

[0003] However, in the above method, the object must be segmented and identified and its pose estimated before a suitable grasping point and grasping method can be determined. Current pose estimation algorithms are not highly robust, and sensor noise, object occlusion, and similar factors all have a large impact on pose estimation, reducing the accuracy of the acquired grasping points and grasping methods and thus lowering the success rate of object grasping.

Contents of the...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/32
CPC: G06V10/255
Inventor: 龚星
Owner: BEIJING ORION STAR TECH CO LTD