
A visual robotic arm grasping method and device applied to parametric parts

A parametric-part, robotic-arm technology applied in the field of visual grasping, which addresses the problems that object models are difficult to apply across different part families and that building templates in advance involves a large workload.

Active Publication Date: 2021-05-14
广州富唯智能科技有限公司
Cites: 9 · Cited by: 0

AI Technical Summary

Problems solved by technology

Since the grasping methods commonly used in current technology mostly target single-object grasping scenes with a fixed template, all part templates in an entire part family must be built in advance, which involves a large workload.
Moreover, different part families have different feature distributions, so traditional visual recognition and grasping methods are difficult to apply to object models from different part families.


Image

  • A visual robotic arm grasping method and device applied to parametric parts

Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0049] Embodiments of the present invention will be described in detail below. It should be emphasized that the following description is only exemplary and not intended to limit the scope of the invention and its application.

[0050] In an automatic assembly line, various industrial parts must be grasped from a material box by vision and placed on a part pose adjuster in roughly the correct posture. Industrial parts in practice are basically parametric; that is, they share a unified part template, but their specific size values differ. The embodiment of the present invention realizes visual robotic-arm grasping of parametric parts based on deep learning.
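The notion of a parametric part above, one shared template whose instances differ only in their size values, can be illustrated with a minimal sketch. The `BoltTemplate` class and its parameter names are hypothetical examples, not taken from the patent:

```python
from dataclasses import dataclass

# Hypothetical illustration: a parametric part family shares one template
# whose concrete instances differ only in their size parameter values.
@dataclass(frozen=True)
class BoltTemplate:  # one assumed part family, e.g. hex bolts
    shaft_length_mm: float
    shaft_diameter_mm: float
    head_width_mm: float

# Two parts of the same family: identical template, different parameter values.
m6 = BoltTemplate(shaft_length_mm=30.0, shaft_diameter_mm=6.0, head_width_mm=10.0)
m8 = BoltTemplate(shaft_length_mm=40.0, shaft_diameter_mm=8.0, head_width_mm=13.0)
```

Recognizing a part then reduces to recovering its parameter values rather than matching one of many pre-built fixed templates.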

[0051] Figure 1 is a flow chart of a visual robotic-arm grasping method applied to parametric parts according to an embodiment of the present invention. The visual robotic-arm grasping method provided by the embodiment can accomplish the grasping task for parametric parts...


PUM

No PUM

Abstract

A visual robotic-arm grasping method and device applied to parametric parts. The method comprises: S1, obtaining a scene point cloud of a parametric part and removing background information to obtain a target point cloud; S2, inputting the target point cloud into a parameterized point-cloud deep neural network, which maps the target point cloud to a target point in a feature vector space according to the descriptor mapping function generated by the network; S3, for the manifold of the part family, computing the feature point on the manifold surface closest to the target point, thereby obtaining the specific parameter values of the target object and the corresponding target template; S4, obtaining the 6D pose information of the parametric part from the target template through an alignment algorithm; S5, transmitting the 6D pose information of the parametric part to the control system of the robotic arm to grasp the parametric part. The invention is suitable for grasping parts from different part families, saves substantial computation time, and has strong robustness and versatility.
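The S1–S3 steps of the abstract can be sketched as follows. This is a minimal illustration assuming NumPy: the patented parameterized point-cloud deep neural network is replaced by a toy descriptor of global statistics, the part-family manifold by a discretely sampled list of (parameter value, descriptor) pairs, and the table-plane threshold is an assumption; S4/S5 (alignment to a 6D pose and hand-off to the arm controller) are only indicated in comments:

```python
import numpy as np

def remove_background(scene_points, z_table=0.01):
    """S1: crude background removal -- keep points above an assumed table plane."""
    return scene_points[scene_points[:, 2] > z_table]

def descriptor(points):
    """S2: stand-in for the network's descriptor mapping function (here: simple
    global statistics of the point cloud, NOT the patented network)."""
    return np.concatenate([points.mean(axis=0), points.std(axis=0)])

def nearest_on_manifold(target_desc, sampled_manifold):
    """S3: find the feature point on the (discretely sampled) part-family
    manifold closest to the target point; return the recovered parameter."""
    dists = [np.linalg.norm(target_desc - d) for _, d in sampled_manifold]
    best = int(np.argmin(dists))
    return sampled_manifold[best][0]

# --- toy usage: a "part family" of axis-aligned cubes parameterized by scale ---
rng = np.random.default_rng(0)

def cube_cloud(scale, n=500):
    return rng.random((n, 3)) * scale

sampled_manifold = [(s, descriptor(cube_cloud(s))) for s in (0.5, 1.0, 1.5, 2.0)]

scene = np.vstack([
    cube_cloud(1.5) + [0.0, 0.0, 0.05],              # target part on the table
    np.c_[rng.random((200, 2)), np.zeros(200)],      # table-plane background
])
target = remove_background(scene)
recovered_scale = nearest_on_manifold(descriptor(target), sampled_manifold)
# S4: align the recovered template to the target cloud for a 6D pose (omitted).
# S5: send the 6D pose to the robotic-arm control system (omitted).
```

In this toy setup the recovered scale matches the target part's true scale of 1.5; the real method replaces both the descriptor and the discrete lookup with a learned mapping onto a continuous manifold.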

Description

Technical field

[0001] The invention relates to visual grasping, and in particular to a visual robotic-arm grasping method and device applied to parametric parts.

Background technique

[0002] China's demand for robots is increasing day by day, and robots are widely used in the field of part grasping. However, most robots currently deployed lack intelligence, and it is difficult for them to adapt to the multi-variety, small-batch production mode and the unstructured industrial environments found in industry.

[0003] Visual grasping refers to guiding a robot to grasp a target object through a vision system. CN201511005603 discloses a visual robot grasping method based on product information labels, in which an image processing control device obtains the position and size information of the object and finally controls the robot to perform the grasping action. CN201810034599 discloses a robot grasping method based on depth vision, using an edge detect...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Patents (China)
IPC(8): B25J9/16, G06T17/00, G06T7/80, G06T7/70
CPC: B25J9/161, B25J9/1664, G06T17/00, G06T7/70, G06T7/80
Inventor: 曾龙, 林垟钵, 董至恺, 俞佳熠, 赵嘉宇
Owner 广州富唯智能科技有限公司