
Grabbed object recognition method based on tactile vibration signal and visual image fusion

A visual-image and vibration-signal technology, applied to neural learning methods, character and pattern recognition, biological neural network models, etc. It addresses problems such as ignored correlations between modalities, inconspicuous features, and loss of key information, and achieves good recognition accuracy, improved cognitive ability, and sound recognition of object types and physical attributes.

Active Publication Date: 2021-02-23
QILU UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] 1. Information from different modalities is passed through different neural networks to extract features, finally yielding independent features for each modality. Because feature extraction is performed independently, correlations between the modal data are ignored. For example, there is a relationship between a rough surface in the image and the micro-vibrations felt on touch; when the earlier methods extract visual and tactile features independently, surface-roughness information or small vibrations may not be prominent enough to appear in either feature vector, leading to the loss of key information.
[0006] 2. Using a different network for each modality makes the parameter count of the whole method very large.
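To illustrate the parameter-count issue, a rough sketch (hypothetical layer sizes, not the patent's actual architecture) compares two independent three-channel convolutional networks with a single shared network taking a fused six-channel input:

```python
def conv_params(c_in, c_out, k=3):
    """Parameter count of one k x k convolution layer (weights + bias)."""
    return c_out * (c_in * k * k + 1)

# Two independent 3-channel networks (one for vision, one for touch),
# each with two convolution layers (hypothetical sizes):
separate = 2 * (conv_params(3, 32) + conv_params(32, 64))

# One shared network over a fused 6-channel input: only the first
# layer grows; every deeper layer is stored once instead of twice.
shared = conv_params(6, 32) + conv_params(32, 64)

print(separate, shared)  # 38784 20256
```

Even in this two-layer toy, sharing the deeper layers roughly halves the parameter count; the gap widens as the network gets deeper, which is the saving the fused six-channel design exploits.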

Method used




Embodiment Construction

[0033] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the drawings in the embodiments of the present invention. Apparently, the described embodiments are only some of the embodiments of the present invention, not all of them. The components of the embodiments of the invention generally described and illustrated in the figures herein may be arranged and designed in a variety of different configurations.

[0034] Accordingly, the following detailed description of the embodiments of the invention provided in the accompanying drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the invention. Based on the embodiments of the present invention, all other embodiments obtained by those skilled in the art without making creative efforts belong to the protection scope of the present invention.

[0035] It should be noted that like nume...



Abstract

The invention relates to the field of robot grabbing cognition, and in particular to a grabbed object recognition method based on the fusion of tactile vibration signals and visual images. The method comprises the following steps: first, a visual color image of the grabbed object is obtained through a camera, and the tactile vibration signal recorded while grabbing the object is converted, according to its numerical values, into a tactile color image; the visual color image and the tactile color image are then resized to a unified size, and the RGB three channels of each are combined to form a six-channel input; finally, the six-channel image is fed into a convolutional neural network for object type recognition. The method recognizes both the type and the physical properties of an object well and improves the cognitive ability of the robot.
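The channel-fusion step described in the abstract can be sketched as follows. This is a minimal NumPy illustration; the 224x224 target size and the nearest-neighbour resizing are assumptions for the sketch, not details taken from the patent:

```python
import numpy as np

def fuse_visual_tactile(visual_rgb, tactile_rgb, size=(224, 224)):
    """Resize a visual RGB image and a tactile 'color image' (the
    vibration signal rendered as RGB) to a unified size, then stack
    their channels into one six-channel array."""
    def resize(img, hw):
        # Nearest-neighbour resampling: index source rows/cols
        # proportionally to the target grid.
        h, w = hw
        ys = (np.arange(h) * img.shape[0] / h).astype(int)
        xs = (np.arange(w) * img.shape[1] / w).astype(int)
        return img[ys][:, xs]

    v = resize(visual_rgb, size)
    t = resize(tactile_rgb, size)
    # Concatenate along the channel axis: 3 visual + 3 tactile = 6.
    return np.concatenate([v, t], axis=-1)

visual = np.random.rand(480, 640, 3)   # camera image
tactile = np.random.rand(100, 200, 3)  # vibration signal as RGB
fused = fuse_visual_tactile(visual, tactile)
print(fused.shape)  # (224, 224, 6)
```

The resulting (H, W, 6) array is what the first convolution layer of the recognition network would consume, with its input-channel count set to 6 instead of the usual 3.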

Description

Technical field

[0001] The invention relates to the technical fields of robot grasping cognition and multi-modal data fusion, and in particular to a grabbed object recognition method based on the fusion of tactile vibration signals and visual images.

Background technique

[0002] At present, the vast majority of robots complete their cognition of the outside world through image information captured by a camera, and more than 70% of the information humans use when interacting with the outside world also comes from vision. But the sense of touch also plays an important role in human grasping behavior; especially in distinguishing real objects from fakes, touch holds an advantage that vision cannot replace. For example, it is difficult to distinguish simulated plastic fruit from real fruit at the visual level, but easy to reach a conclusion through touch. However, current tactile technology is not yet mature enough. It can o...

Claims


Application Information

IPC(8): B25J11/00; B25J9/16; B25J13/08; B25J15/08; B25J18/00; B25J19/02; G06K9/00; G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: B25J11/00; B25J9/1697; B25J13/087; B25J15/08; B25J18/00; B25J19/023; G06N3/08; G06V20/10; G06V10/56; G06N3/045; G06F18/24
Inventors: 张鹏, 周茂辉, 单东日, 王晓芳, 于国奇
Owner QILU UNIV OF TECH