
Robot Vision Grasping Method

A robot vision and product grasping technology, applied in the field of visual recognition, which can solve problems such as deviations in the grasping operation, deviations in the extracted information, and poor timeliness.

Active Publication Date: 2017-07-14
FREESENSE IMAGE TECH

AI Technical Summary

Problems solved by technology

[0003] However, this robot vision grasping method of the prior art has several disadvantages. Specifically, it requires a large amount of calculation, because the product size and the product position must be computed from the photo information. Secondly, because the calculation process is complex, the timeliness of the operation is poor. Moreover, the information obtained from product photos sometimes deviates, which may cause deviations in the robot arm's grasping operation on the product.



Examples


Embodiment Construction

[0024] Figure 1 is a schematic diagram of a robot vision grasping method according to a preferred embodiment of the present invention, and Figure 2 is a flow chart of the robot vision grasping method according to the preferred embodiment of the present invention.

[0025] As shown in Figure 1 and Figure 2, the robot vision grasping method according to a preferred embodiment of the present invention includes:

[0026] First step S1: arranging a product information label 200 on the product 100 to be grasped, wherein the product information label 200 includes the size information of the product 100 to be grasped and the position information of the product information label 200 on the product 100 to be grasped;

[0027] Preferably, the product information label 200 has a standardized shape and a standardized size.

[0028] Preferably, the product information label 200 is arranged at a corner of a specific surface of the product 100 to be grasped; or, ...
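The patent does not say how the size and position information are encoded in the label; one plausible reading is that the label is a printed 2D code (for example a QR code) whose payload carries the product dimensions and the label's offset on the product. The sketch below illustrates such a hypothetical payload in Python; the field names, units, and JSON encoding are assumptions made for illustration, not part of the disclosure.

```python
# Hypothetical payload for the product information label 200.
# The field names, units, and JSON encoding are illustrative assumptions;
# the patent only states that the label carries the product's size
# information and the label's position on the product.
import json
from dataclasses import dataclass

@dataclass
class ProductLabel:
    product_size_mm: tuple   # (length, width, height) of the product 100
    label_offset_mm: tuple   # (x, y) of the label centre on the labelled surface
    label_surface: str       # which surface of the product carries the label

    def encode(self) -> str:
        """Serialize the label content, e.g. for printing as a 2D code."""
        return json.dumps({
            "size": self.product_size_mm,
            "offset": self.label_offset_mm,
            "surface": self.label_surface,
        })

    @staticmethod
    def decode(payload: str) -> "ProductLabel":
        """Rebuild the label content from the string read out of the label image."""
        d = json.loads(payload)
        return ProductLabel(tuple(d["size"]), tuple(d["offset"]), d["surface"])
```

Because the label itself has a standardized shape and size, the vision system never has to measure the product: detecting the label fixes its pose, and decoding it supplies the product dimensions and offset.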



Abstract

The invention discloses a robot vision grasping method. The robot vision grasping method includes the steps that a product information tag is arranged on a to-be-grasped product, wherein the product information tag comprises size information of the to-be-grasped product and information of the position of the product information tag on the to-be-grasped product; the to-be-grasped product is conveyed by a conveying device; the conveying device is made to stop conveying the to-be-grasped product; a picture of the to-be-grasped product in the static state is obtained through an image obtaining device, wherein the picture comprises an image of the product information tag; the image obtaining device transmits the picture to an image processing control device, so that the image processing control device obtains the information of the position of the product information tag relative to the image obtaining device, the size information of the to-be-grasped product, and the information of the position of the product information tag on the to-be-grasped product according to the image of the product information tag in the picture; and the image processing control device transmits corresponding indication information to a robot to control the grasping action of the robot on the to-be-grasped product in the static state.
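To make the processing step in the abstract concrete, the following is a minimal sketch, assuming an overhead camera, a square label of known physical edge length, and a label payload already decoded into the product size and the label's offset on the product. The constant LABEL_EDGE_MM, the function names, and the numbers in the usage example are hypothetical illustrations; the patent does not disclose a specific algorithm.

```python
# Minimal sketch of how the image processing control device could turn a
# detected label into a grasp target. All names and values are assumptions.

LABEL_EDGE_MM = 50.0   # standardized physical edge length of the label (assumed)

def label_position_mm(pixel_center, pixel_edge_len):
    """Scale the label's pixel coordinates to millimetres using its known size."""
    mm_per_px = LABEL_EDGE_MM / pixel_edge_len
    return pixel_center[0] * mm_per_px, pixel_center[1] * mm_per_px

def grasp_point(label_xy_mm, label_offset_mm):
    """Shift from the detected label centre to the product centre using the
    offset that was decoded from the label's payload."""
    return (label_xy_mm[0] - label_offset_mm[0],
            label_xy_mm[1] - label_offset_mm[1])

# Usage example: the label is detected at pixel (640, 360) with a 100 px edge,
# and its payload says it sits 30 mm right of and 20 mm above the product centre.
label_xy = label_position_mm((640, 360), 100.0)
print(grasp_point(label_xy, (30.0, 20.0)))   # -> (290.0, 160.0)
```

The advantage claimed by the method is visible here: the only quantity measured from the image is the pose of a standardized label, so the computation stays small and the product's own dimensions never have to be estimated from the photo.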

Description

Technical field [0001] The invention relates to the field of visual recognition and automatic control, and in particular to a robot vision grasping method. Background technique [0002] At present, when robots (specifically, robot arms) are used to grasp products on the conveyor belt of an assembly line, camera equipment is generally used to take images of the products, and image processing is then performed on the captured images to obtain product information such as product dimensions. The product information is then sent to the robot's control unit, enabling the robot to control the specific operations of the robot arm based on the product information, such as the product dimensions and the product position shown in the photos. [0003] However, this prior-art approach to robot vision grasping has several disadvantages. Specifically, this robot vision grasping method of the prior art requires a large amount of calculation, and the product size and product position ne...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J13/08
CPC: B25J13/08
Inventor: 文海量
Owner: FREESENSE IMAGE TECH