
Pushing and grabbing collaborative sorting network based on double viewing angles and sorting method and system thereof

A dual-view network technology, applied to sorting, biological neural network models, instruments, etc. It addresses problems such as poor generalization ability and low grasp success rate, and achieves the effects of a high grasp success rate, improved perception ability, and avoidance of missing object information.

Inactive Publication Date: 2020-09-11
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0004] Aiming at the above defects or improvement needs of the prior art, the present invention provides a dual-view based push-and-grab collaborative sorting network and a sorting method and system thereof, thereby solving the technical problems of low grasp success rate and poor generalization ability in the prior art when faced with cluttered stacking scenes.



Examples


Embodiment 1

[0074] Step 1. For the cluttered stacking scene shown in Figure 2, two binocular cameras are used to collect point cloud images of the object scene to be sorted from two viewing angles. A point cloud image obtained by shooting the scene from a single viewing angle will miss some object information, for example adjacent objects under a single viewing angle (Figure 3(a)) and stacked objects under a single viewing angle (Figure 3(b)). Therefore, the present invention acquires object information from two viewing angles and can obtain more complete object information.
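The excerpt does not spell out how the two single-view clouds are combined. Below is a minimal sketch, assuming each camera's extrinsic matrix is known from calibration; the array names and the fuse_views helper are illustrative, not taken from the patent.

```python
import numpy as np

def to_world(points_cam: np.ndarray, T_cam_to_world: np.ndarray) -> np.ndarray:
    """Transform an (N, 3) camera-frame point cloud into a common world frame."""
    homo = np.hstack([points_cam, np.ones((points_cam.shape[0], 1))])  # (N, 4) homogeneous points
    return (homo @ T_cam_to_world.T)[:, :3]

def fuse_views(cloud_a, T_a, cloud_b, T_b):
    """Concatenate two single-view clouds expressed in the same world frame.

    Points occluded or clipped in one view are typically visible in the other,
    so the fused cloud carries more complete object information.
    """
    return np.vstack([to_world(cloud_a, T_a), to_world(cloud_b, T_b)])
```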

[0075] Step 2. Perform a top-view projection on the point cloud images obtained in Step 1 to obtain top views under the two viewing angles, as shown in Figures 4(a) and 4(b).
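The projection details are not given in this excerpt; one common way to render such a top view is to discretize the workspace in x-y and keep the highest z value per cell (a heightmap). The workspace bounds and resolution below are assumed values for illustration only.

```python
import numpy as np

def top_view_heightmap(cloud: np.ndarray,
                       workspace=((-0.25, 0.25), (-0.25, 0.25)),  # x/y bounds in metres (assumed)
                       resolution: int = 224) -> np.ndarray:
    """Project an (N, 3) point cloud onto a top-view heightmap of shape (res, res).

    Assumes z is measured upward from the table plane, so empty cells stay at 0.
    """
    (x_min, x_max), (y_min, y_max) = workspace
    heightmap = np.zeros((resolution, resolution), dtype=np.float32)

    # Map x/y coordinates to pixel indices and discard points outside the workspace.
    u = ((cloud[:, 0] - x_min) / (x_max - x_min) * (resolution - 1)).astype(int)
    v = ((cloud[:, 1] - y_min) / (y_max - y_min) * (resolution - 1)).astype(int)
    inside = (u >= 0) & (u < resolution) & (v >= 0) & (v < resolution)

    # Keep the maximum height observed in each grid cell.
    np.maximum.at(heightmap, (v[inside], u[inside]), cloud[inside, 2])
    return heightmap
```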

[0076] Step 3. Rotate the top views under the two viewing angles obtained in Step 2, rotating once every 22.5° to obtain 16 rotated images per view, for a total of 32 rotated images under the two viewing angles...
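As a sketch of this rotation step (the library choice is ours, the patent does not specify one), the 16 rotated copies of each top view can be generated as follows.

```python
import numpy as np
from scipy.ndimage import rotate

def rotated_stack(top_view: np.ndarray, num_rotations: int = 16):
    """Rotate a top view every 360/num_rotations degrees (22.5 degrees for 16 rotations)."""
    step = 360.0 / num_rotations
    return [rotate(top_view, angle=k * step, reshape=False, order=1, mode='nearest')
            for k in range(num_rotations)]

# Two viewing angles x 16 rotations = 32 rotated images fed to the networks, e.g.:
# rotations = rotated_stack(view_a_top) + rotated_stack(view_b_top)
```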



Abstract

The invention discloses a pushing and grabbing collaborative sorting network based on dual viewing angles, together with a sorting method and system thereof. The trained pushing and grabbing collaborative sorting network comprises a pushing fully convolutional network and a grabbing fully convolutional network, and the network is applied to robotic pushing and grabbing collaborative sorting. The sorting method comprises the following steps: acquiring point cloud images of the object scene to be sorted from two viewing angles; rotating the top views of the point cloud images; inputting the rotated images into the pushing fully convolutional network and the grabbing fully convolutional network to obtain two Q-value heat maps output by the networks, and selecting the heat map with the larger Q value as the final heat map; and, according to the pixel corresponding to the maximum Q value in that heat map and the rotation angle of the rotated image corresponding to it, controlling the robot to execute the sorting action of the network corresponding to the heat map, thereby completing the sorting. By combining dual viewing angles with deep Q-learning, the sorting method achieves a high grasp success rate and strong generalization ability when facing disordered stacking scenes.
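To make the selection step concrete, here is a minimal sketch (our own illustration, not the patent's code) of choosing the action from the two networks' Q-value heat maps: take the global maximum over the push maps and the grasp maps, and read off the action type, pixel location, and rotation angle.

```python
import numpy as np

def select_action(push_q: np.ndarray, grasp_q: np.ndarray):
    """Choose push or grasp from stacks of Q-value heat maps.

    push_q, grasp_q: arrays of shape (num_rotations, H, W), one heat map per
    rotated input image; both stacks are assumed to use the same rotation set.
    Returns the action type, the (row, col) pixel of the maximum Q value, the
    rotation angle of the corresponding input image, and the Q value itself.
    """
    best = {}
    for name, q in (("push", push_q), ("grasp", grasp_q)):
        idx = np.unravel_index(np.argmax(q), q.shape)   # (rotation, row, col)
        best[name] = (q[idx], idx)

    action = max(best, key=lambda k: best[k][0])        # the larger peak Q value wins
    q_value, (rot_idx, row, col) = best[action]
    angle = rot_idx * 360.0 / push_q.shape[0]           # e.g. steps of 22.5 degrees
    return action, (row, col), angle, q_value
```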

Description

Technical field

[0001] The invention belongs to the technical field of robot applications, and more specifically relates to a push-and-grab collaborative sorting network based on dual perspectives and a sorting method and system thereof.

Background technique

[0002] Object grasping is the main means of robotic sorting, and the robot's grasping ability directly determines the efficiency of sorting. Robust and efficient sorting of objects is a current research hotspot in robotics. Most current object grasping methods are aimed at non-stacked object scenes: cameras capture scene pictures, and traditional image processing methods are combined with machine learning methods to segment and recognize objects. However, as industrial application scenarios become more and more complex, there are often situations that are not conducive to sorting, such as multiple target objects, arbitrary object poses, mutual contact and occlusion between mu...


Application Information

Patent Type & Authority: Application (China)
IPC(8): B07C5/36; G06T3/60; G06N3/04
CPC: B07C5/361; B07C5/362; G06T3/60; B07C2501/0063; G06N3/045
Inventors: 彭刚, 廖金虎
Owner: HUAZHONG UNIV OF SCI & TECH