
Mechanical arm active grasping device and method based on multi-modal fusion

A technology relating to grasping devices and manipulators, applied to manipulators, program-controlled manipulators, manufacturing tools, etc. It addresses problems such as the lack of an automatic real-time interaction and learning process, difficulty in grasping, and sensor disturbance, and achieves improved positioning and active grasping ability, avoidance of strong-light interference, and a higher grasping success rate.

Active Publication Date: 2019-01-04
SHANGHAI JIAO TONG UNIV

AI Technical Summary

Problems solved by technology

Traditional space activities rely on preset equipment instructions, direct operation by space-station staff, or remote operation by ground staff. They lack an automatic, real-time process of interaction with and learning from the environment, which makes complex tasks such as grasping moving objects in a microgravity environment difficult to achieve.

Existing research on automatically grasping moving objects in microgravity focuses mainly on tactile perception combined with passive compliance mechanisms, which absorb the impact of the moving object during grasping to improve success rate and reliability. Few studies fuse tactile, visual, and other multi-modal information to realize active grasping by a manipulator, yet the correlation and complementarity between multi-modal sensor information is of great significance for improving grasping efficiency and robustness.




Embodiment Construction

[0034] The present invention is described in detail below in conjunction with specific embodiments. The following examples help those skilled in the art to further understand the present invention, but do not limit it in any form. It should be noted that those skilled in the art can make several changes and improvements without departing from the concept of the present invention; these all belong to the protection scope of the present invention.

[0035] The invention addresses the difficulty a binocular vision system has in accurately acquiring information about a moving object to be grasped under harsh space conditions such as strong light and electromagnetic fields. A laser radar is introduced to monitor surrounding objects in the microgravity environment in real time, and a recurrent neural network with long short-term memory (the RNN-LSTM algorithm) fuses the radar images with the visual images, ...
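The patent text names the RNN-LSTM fusion but does not disclose its architecture, so the following is a minimal sketch only: it assumes per-frame lidar and camera feature vectors fused by simple concatenation and passed through a single NumPy LSTM cell. The `FusionLSTMCell` class, all dimensions, and the concatenation-based fusion scheme are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class FusionLSTMCell:
    """One LSTM step over concatenated lidar + camera features.
    Illustrative assumption: late fusion by concatenation; the patent
    does not specify how the two modalities enter the network."""
    def __init__(self, lidar_dim, cam_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        in_dim = lidar_dim + cam_dim + hidden_dim
        # One weight matrix and bias per gate: input, forget, cell, output.
        self.W = {g: rng.normal(0, 0.1, (hidden_dim, in_dim)) for g in "ifco"}
        self.b = {g: np.zeros(hidden_dim) for g in "ifco"}

    def step(self, lidar_feat, cam_feat, h, c):
        x = np.concatenate([lidar_feat, cam_feat, h])
        i = sigmoid(self.W["i"] @ x + self.b["i"])  # input gate
        f = sigmoid(self.W["f"] @ x + self.b["f"])  # forget gate
        g = np.tanh(self.W["c"] @ x + self.b["c"])  # candidate cell state
        o = sigmoid(self.W["o"] @ x + self.b["o"])  # output gate
        c = f * c + i * g
        h = o * np.tanh(c)
        return h, c

# Run a short sequence of fused sensor frames.
cell = FusionLSTMCell(lidar_dim=4, cam_dim=6, hidden_dim=8)
h, c = np.zeros(8), np.zeros(8)
for t in range(5):
    lidar_feat = np.random.default_rng(t).normal(size=4)
    cam_feat = np.random.default_rng(t + 100).normal(size=6)
    h, c = cell.step(lidar_feat, cam_feat, h, c)
print(h.shape)  # (8,)
```

In a trained system the hidden state `h` would feed a regression head producing the grasp target; here it simply demonstrates that the two modalities share one recurrent state across frames.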



Abstract

The invention provides a mechanical arm active grasping device and method based on multi-modal fusion. The device comprises a base (1), a mechanical arm (2), a laser radar (3), a binocular vision system (4) and a mechanical hand (5). The laser radar (3) and one end of the mechanical arm (2) are each fixedly mounted on the base (1), while the binocular vision system (4) and the mechanical hand (5) are fixedly mounted at the other end of the mechanical arm. The grasping method comprises the following steps: step 1, sensing the object to be grasped to obtain sensing information; step 2, localizing the object to be grasped according to the sensing information to obtain positioning information; and step 3, grasping the object according to the positioning information. The device and method fully consider the complex environment of space operation, effectively improve the capability of grasping moving objects, and have broad application prospects.
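The three claimed steps can be sketched as a plain sense-localize-grasp pipeline. The `SensedObject` container, the fixed sensor readings, and the average-based fusion below are illustrative stand-ins, not the patent's actual multi-modal fusion.

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical data container; field names are illustrative, not from the patent.
@dataclass
class SensedObject:
    camera_xyz: Tuple[float, float, float]  # estimate from binocular vision
    lidar_xyz: Tuple[float, float, float]   # estimate from laser radar

def sense() -> SensedObject:
    """Step 1: perceive the object (stub returning fixed readings)."""
    return SensedObject(camera_xyz=(0.52, 0.10, 0.33),
                        lidar_xyz=(0.50, 0.12, 0.31))

def localize(obs: SensedObject) -> Tuple[float, float, float]:
    """Step 2: fuse the two estimates into one target position.
    A per-axis average stands in for the patent's learned fusion."""
    return tuple((a + b) / 2 for a, b in zip(obs.camera_xyz, obs.lidar_xyz))

def grasp(target: Tuple[float, float, float]) -> bool:
    """Step 3: command the arm to the fused target (stub)."""
    print(f"moving gripper to {target}")
    return True

ok = grasp(localize(sense()))
```

The value of the structure is that each step consumes only the previous step's output, so the fusion in step 2 can be replaced by the RNN-LSTM model without touching the sensing or grasping stages.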

Description

Technical Field

[0001] The present invention relates to the technical field of positioning and grasping for space robots, and in particular to an active grasping device and method for manipulators based on multi-modal fusion, especially a robot localization and active grasping technology for microgravity environments that integrates CMOS-camera binocular vision, laser radar and tactile perception.

Background

[0002] At present, the aerospace field of the world's major countries is developing rapidly, and more and more life-science experiments and space operations are being carried out to explore space. Traditional space activities rely on preset equipment instructions, direct operation by space-station staff, or remote operation by ground staff; they lack an automatic, real-time process of interaction with and learning from the environment, making complex tasks such as grasping moving objects in a microgravity environment difficult to achieve. The existing research on autom...


Application Information

IPC(8): B25J9/16
CPC: B25J9/1679; B25J9/1694; B25J9/1697
Inventors: 王伟明, 马进, 薛腾, 韩鸣朔, 刘文海, 潘震宇, 邵全全
Owner SHANGHAI JIAO TONG UNIV