
Article classification and recovery method based on multi-modal active perception

A multi-modal item-handling technology, applied in the field of manufacturing tools, chucks, manipulators, etc., which can solve problems such as grasp poses unsuitable for tactile data measurement, failure of tactile material identification, and the inability to output the direction in which the manipulator grasps items.

Active Publication Date: 2020-08-28
北京具身智能科技有限公司

AI Technical Summary

Problems solved by technology

[0003] Researchers from MIT and Princeton presented a paper titled Robotic Pick-and-Place of Novel Objects in Clutter with Multi-Affordance Grasping and Cross-Domain Image Matching at ICRA (International Conference on Robotics and Automation) 2018, which achieves robotic pick-and-place of unknown items in cluttered, piled scenes by generating grasping-point heat maps. The technique uses deep learning to train an affordance network that takes the color-depth image of the picking scene as input and outputs a pixel-level AffordanceMap (grasping-point heat map). This avoids complicated item segmentation and recognition and directly yields candidate picking positions. However, the network outputs only pixel-level grasping points, not the direction in which the manipulator should grasp the item, so the resulting grasp pose may be unsuitable for measuring tactile data, causing tactile material identification to fail; existing methods cannot effectively solve this problem.
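For orientation, the sketch below shows the general shape of such an affordance network: a small fully convolutional encoder-decoder mapping a 4-channel color-depth (RGB-D) image to a per-pixel grasp-confidence heat map. The architecture, layer sizes, and names are illustrative assumptions, not the MIT/Princeton implementation; the final argmax also makes the limitation above concrete, since the best pixel carries no grasp direction.

```python
# Illustrative affordance-style network (PyTorch): RGB-D in, heat map out.
# Layer sizes and names are assumptions, not the paper's implementation.
import torch
import torch.nn as nn

class AffordanceNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=3, stride=2, padding=1),  # 4 = RGB + depth
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 1, kernel_size=4, stride=2, padding=1),
            nn.Sigmoid(),  # per-pixel grasp confidence in [0, 1]
        )

    def forward(self, rgbd):                      # rgbd: (B, 4, H, W)
        return self.decoder(self.encoder(rgbd))   # (B, 1, H, W) AffordanceMap

net = AffordanceNet()
heatmap = net(torch.randn(1, 4, 128, 128))
# The candidate grasp point is the heat map's argmax -- a pixel location
# only, with no gripper orientation, which is the gap the patent targets.
best = heatmap.flatten().argmax().item()
print(divmod(best, 128))  # (row, col) of the strongest grasp candidate
```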

Embodiment Construction

[0052] The flow chart of the article classification and recycling method based on multi-modal active perception proposed by the present invention is shown in Figure 1; the specific steps are as follows:

[0053] (1) Build the actual robotic item sorting and recycling operating system shown in Figure 2:

[0054] The system includes: a robotic arm 1 (in this embodiment, a Universal Robots UR5), a manipulator 2 with a suction cup (for example, the CobotCohand212 model), a color-depth camera 3 (in this embodiment, a Kinect V2 camera), a tactile sensor 6 (in this embodiment, a 5×5 piezoresistive flexible tactile sensor array, which can be a conventional model), an operating table 5 on which items 4 are placed, and an item recovery container 7. The color-depth camera 3, the tactile sensor 6, the manipulator 2, and the robotic arm 1 are connected to a controller; in this embodiment, the controller is a notebook computer.
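As a compact reference, the component list above can be captured in a configuration record; the class and field names below are hypothetical, since the patent specifies hardware rather than software.

```python
# Hypothetical configuration record for the sorting cell in paragraph
# [0054]; numbers in the comments follow the component labels of Figure 2.
from dataclasses import dataclass

@dataclass
class SortingCell:
    robot_arm: str       # 1: Universal Robots UR5
    manipulator: str     # 2: gripper with suction cup (CobotCohand212)
    rgbd_camera: str     # 3: color-depth camera (Kinect V2)
    tactile_sensor: str  # 6: 5x5 piezoresistive flexible array
    controller: str      # host driving all devices (a notebook computer)

cell = SortingCell(
    robot_arm="UR5",
    manipulator="CobotCohand212",
    rgbd_camera="Kinect V2",
    tactile_sensor="5x5 piezoresistive array",
    controller="notebook computer",
)
print(cell)
```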

[0055]...

Abstract

The invention relates to an article classification and recovery method based on multi-modal active perception, and belongs to the technical field of robot applications. The method comprises the following steps: first, a target detection network model for the target article is built; then a grasping pose for the target article is obtained, and the robotic arm system is guided by that pose to actively grasp the target article in a pinching mode. The fingertips of the manipulator are equipped with tactile sensors, so tactile signals from the surface of the target article can be acquired in real time while it is grasped. Feature extraction is then performed on the acquired tactile information, the feature information is input into a tactile classifier to identify the material of the article, and classification and recovery of the target article are completed. The method exploits visual and tactile multi-modal information: the visual detection result guides the robot to actively grasp the target article in the most suitable pose and to collect tactile information, article material identification is achieved, and article classification and recovery are completed. Various recyclable articles made of different materials can be identified automatically, giving the method high universality and practical significance.
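As a rough illustration of the tactile stage described above, the sketch below extracts simple statistical features from a 5×5 fingertip tactile array sampled over one grasp and feeds them to an off-the-shelf classifier. The feature set, the SVM choice, and the toy training data are assumptions for illustration; the abstract does not commit to a particular feature extractor or classifier.

```python
# Sketch of tactile feature extraction + material classification.
# Features and the SVM are illustrative assumptions, not the patent's method.
import numpy as np
from sklearn.svm import SVC

def tactile_features(frames):
    """frames: (T, 5, 5) pressure readings from one grasp."""
    flat = frames.reshape(len(frames), -1)
    return np.array([
        flat.mean(),                       # average contact pressure
        flat.std(),                        # pressure variability (texture cue)
        flat.max(),                        # peak pressure (stiffness cue)
        np.diff(flat.mean(axis=1)).std(),  # temporal fluctuation across frames
    ])

# Toy training set: compliant items give low, noisy pressure; rigid items
# give high, stable pressure. Labels are hypothetical material classes.
rng = np.random.default_rng(0)
soft  = [tactile_features(0.2 * rng.random((50, 5, 5))) for _ in range(20)]
rigid = [tactile_features(0.8 + 0.05 * rng.random((50, 5, 5))) for _ in range(20)]
X = np.vstack([soft, rigid])
y = ["plastic"] * 20 + ["metal"] * 20

clf = SVC(kernel="rbf").fit(X, y)
probe = tactile_features(0.8 + 0.05 * rng.random((50, 5, 5)))
print(clf.predict([probe]))  # expected: ['metal']
```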

Description

Technical Field

[0001] The invention relates to an article classification and recycling method based on multi-modal active perception, and belongs to the technical field of robot applications.

Background Technique

[0002] With continuing global population growth and urbanization, achieving the goals of sustainable development and resource recovery requires using robots to sort and recycle items automatically and efficiently. A common recycling practice is to collect recyclables in mixed form, regardless of the material of each item. This practice is convenient for residents and reduces collection costs, but the mixed items must subsequently be sorted and recycled according to their materials. In robot applications, the visual modality can be used to detect the target item with a visual deep learning algorithm, then generate a grasping pose for the target item and guide the manipula...
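The vision-guided workflow this background describes (detect the target, derive a grasp pose, guide the manipulator, then sense and classify by touch) can be summarized in a short sketch. Every function below (detect_target, grasp_pose_from_box, read_tactile, classify_material) is a hypothetical stand-in, not the patent's implementation.

```python
# End-to-end sketch of the vision-then-touch pipeline; all functions are
# stubbed stand-ins for the modules the patent describes.
import numpy as np

def detect_target(rgbd_image):
    """Target detection network: returns a bounding box (stub)."""
    return (40, 60, 80, 100)  # x1, y1, x2, y2

def grasp_pose_from_box(box):
    """Derive a pinch-grasp pose (position + approach direction) from the box."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2, (y1 + y2) / 2, "top-down")

def read_tactile():
    """Fingertip tactile array sampled during the grasp (stub)."""
    return np.random.rand(25)

def classify_material(features):
    """Tactile classifier mapping features to a material label (stub)."""
    return "plastic" if features.mean() > 0.5 else "metal"

rgbd = np.zeros((480, 640, 4))       # color-depth image of the scene
pose = grasp_pose_from_box(detect_target(rgbd))
signal = read_tactile()              # recorded while grasping
features = np.array([signal.mean(), signal.std(), signal.max()])
print("grasp at", pose, "->", classify_material(features))
```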

Claims

Application Information

Patent Type & Authority: Application (China)
IPC (8): B25J 13/08; B25J 19/02; B25J 15/06
CPC: B25J 13/084; B25J 19/023; B25J 15/0616
Inventors: 郭迪, 刘华平, 袁小虎, 尹建芹
Owner: 北京具身智能科技有限公司