
Sequential grabbing method and device for stacked articles

A stacked-article sequencing and grabbing technology in the computer field, addressing problems of prior approaches such as low point cloud accuracy, increased collisions inside the tote, and reflective packaging that corrupts depth data

Pending Publication Date: 2020-10-09
BEIJING JINGDONG QIANSHI TECHNOLOGY CO LTD

AI Technical Summary

Problems solved by technology

[0007] (1) Although target segmentation on point clouds can achieve layering, the resolution of point cloud images is generally not high and the accuracy of the point cloud is low; moreover, the packaging materials of many items are reflective, and reflective spots easily cause holes (invalid data) in the point cloud, which leads to low accuracy when segmenting targets directly on the point cloud;
[0008] (2) Target segmentation based on 2D images can achieve high segmentation accuracy, but item heights cannot be distinguished on a 2D image, which easily leads to grabbing items at lower positions first and increases the probability of collision inside the tote.
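The layering idea in point (1) can be illustrated with a minimal sketch (not the patent's actual algorithm): selecting the topmost layer of a point cloud by height while discarding hole points (NaN rows, e.g. from reflective packaging). The function name and tolerance are illustrative assumptions.

```python
import numpy as np

def top_layer_mask(points, layer_tolerance=0.02):
    """Return a boolean mask selecting points on the topmost layer.

    points: (N, 3) array of x, y, z in meters, z pointing up; NaN rows
    mark holes (invalid depth, e.g. from reflective packaging).
    layer_tolerance: points within this distance of the maximum valid
    height are treated as belonging to the top layer.
    """
    valid = ~np.isnan(points).any(axis=1)          # drop hole points first
    z = np.where(valid, points[:, 2], -np.inf)     # holes can never win
    top_z = z.max()                                # highest valid surface
    return z >= top_z - layer_tolerance

# toy cloud: two layers at z = 0.10 m and z = 0.20 m, plus one hole
cloud = np.array([
    [0.0, 0.0, 0.10],
    [0.1, 0.0, 0.10],
    [0.0, 0.1, 0.20],
    [0.1, 0.1, 0.20],
    [np.nan, np.nan, np.nan],   # reflective spot -> invalid data
])
mask = top_layer_mask(cloud)
print(mask)  # only the two z = 0.20 points are selected
```

Because holes are mapped to negative infinity before the comparison, invalid data cannot be mistaken for a grabbable surface, which is exactly the failure mode point (1) describes.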

Method used




Embodiment Construction

[0036] Exemplary embodiments of the present invention are described below with reference to the accompanying drawings. These descriptions include various details of the embodiments to facilitate understanding and should be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the invention. Likewise, descriptions of well-known functions and constructions are omitted in the following for clarity and conciseness.

[0037] Figure 1 is a schematic diagram of an in-tote robot picking scene according to an embodiment of the present invention. As shown in Figure 1, in this embodiment a camera unit is pre-installed above the tote holding the items to be grabbed; it includes a 2D (two-dimensional) camera and a 3D (three-dimensional) camera. T...
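With a 2D and a 3D camera mounted together as in Figure 1, points from the 3D camera can be mapped into the 2D image once the calibration between the two cameras is known. The following is a minimal pinhole-model sketch of that mapping, with hypothetical intrinsic and extrinsic values; it is not the patent's calibration procedure.

```python
import numpy as np

def project_to_2d(points_3d, R, t, K):
    """Map points from the 3D-camera frame into 2D-camera pixels.

    points_3d: (N, 3) points in the 3D camera frame (meters).
    R, t: extrinsic calibration rotating/translating 3D-camera
          coordinates into the 2D-camera frame.
    K: 3x3 intrinsic matrix of the 2D camera.
    Returns (N, 2) pixel coordinates (u, v).
    """
    p_cam = points_3d @ R.T + t          # into the 2D-camera frame
    p_img = p_cam @ K.T                  # apply intrinsics
    return p_img[:, :2] / p_img[:, 2:3]  # perspective divide

# identity extrinsics and a simple intrinsic matrix (hypothetical values)
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)
uv = project_to_2d(np.array([[0.1, 0.0, 1.0]]), R, t, K)
print(uv)  # [[380. 240.]]
```

In practice R, t, and K would come from a one-time camera calibration; the point here is only that a 3D coordinate determines a unique pixel, so a point-cloud segment can be located on the higher-resolution 2D image.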



Abstract

The invention discloses a sequential grabbing method and device for stacked articles, and relates to the technical field of computers. One specific embodiment of the method comprises the following steps: segmenting the point cloud of the uppermost layer of articles from the point cloud data of the stacked articles, and acquiring the three-dimensional coordinates of the segmented point cloud; mapping those three-dimensional coordinates to two-dimensional coordinates on a two-dimensional image of the articles by using the calibration relationship between a two-dimensional camera and a three-dimensional camera; cropping the corresponding area from the two-dimensional image according to the two-dimensional coordinates, and identifying the articles in that area; and acquiring the three-dimensional coordinates of the corresponding point cloud according to the two-dimensional coordinates of the identified articles, then calculating the center position of each identified article and the grabbing posture of the tooling from those three-dimensional coordinates so as to grab the article. With this method, article identification and article localization can be carried out accurately, making grabbing more accurate, convenient, and efficient and reducing the probability of collision during grabbing.
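The final step above — computing a center position and grabbing posture from an article's point cloud — can be sketched as centroid plus a surface normal estimated by PCA (the least-variance direction of the points). This is one common way to derive a suction approach direction, offered here as an illustrative assumption rather than the patent's specific calculation; the data below is synthetic.

```python
import numpy as np

def grasp_center_and_normal(points):
    """Estimate a grasp point and approach direction for one item.

    points: (N, 3) point cloud of the identified item's top surface.
    Returns the centroid (grasp center) and the unit surface normal
    (smallest-variance direction), flipped to point upward (+z) so a
    suction tool can approach along -normal.
    """
    center = points.mean(axis=0)
    cov = np.cov((points - center).T)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues ascending
    normal = eigvecs[:, 0]                   # least-variance axis
    if normal[2] < 0:                        # orient upward
        normal = -normal
    return center, normal

# roughly planar 5 cm patch at z ≈ 0.2 m (synthetic data)
rng = np.random.default_rng(0)
xy = rng.uniform(0, 0.05, size=(100, 2))
z = np.full((100, 1), 0.2) + rng.normal(0, 1e-4, size=(100, 1))
patch = np.hstack([xy, z])
center, normal = grasp_center_and_normal(patch)
print(np.round(normal, 2))  # close to [0, 0, 1]
```

For a flat box top the normal is nearly vertical, so the tooling descends straight down onto the centroid; for tilted packages the same computation yields an angled approach.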

Description

technical field [0001] The invention relates to the technical field of computers, in particular to a method and device for sequentially picking up stacked items. Background technique [0002] The in-tote robot picking task refers to a scene in which, assisted by vision, a robot takes the required number of items out of a tote according to a picking task issued by the system and places them at a designated location. One difficulty in this scene is ensuring that items are grabbed layer by layer from high to low, so that the end picker does not collide with other items in the tote. [0003] In practice, a large number of items use square packaging boxes, such as food and 3C products, and multiple items of the same type are usually densely arranged in a tote. For in-tote picking tasks in such situations, current machine vision technologies mainly include the following two ...

Claims


Application Information

IPC(8): G06T7/10, G06T7/70
CPC: G06T7/10, G06T7/70, G06T2207/10028
Inventors: 刘伟峰, 万保成, 曹凯
Owner: BEIJING JINGDONG QIANSHI TECHNOLOGY CO LTD