
Multi-modal data processing method applied to robot interaction

A multi-modal data processing technology in the field of robotics. It solves the problem of low data processing accuracy caused by relying on visual data alone; by fusing visual and tactile data it achieves richer data perception, ensures the rationality of robot actions, and improves processing accuracy.

Pending Publication Date: 2022-01-07
GUANGDONG ARTIFICIAL INTELLIGENCE & DIGITAL ECONOMY LAB (GUANGZHOU) +1

AI Technical Summary

Problems solved by technology

[0005] The present invention provides a multi-modal data processing method, device, equipment and storage medium applied to robot interaction, to solve the technical problem that existing robots rely only on visual data, resulting in low data processing accuracy. By performing multi-modal fusion of visual data and tactile data, the method improves the accuracy of data processing and promotes the intelligentization of robots.
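The fusion described above can be illustrated with a minimal late-fusion sketch: visual and tactile feature vectors are concatenated and mapped to instruction scores. All dimensions, weights, and names here are hypothetical illustrations, not the patent's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature dimensions and number of instruction classes
# (invented for illustration; not specified by the patent).
VIS_DIM, TAC_DIM, N_INSTR = 8, 4, 3

# Stand-in fusion weights; in the patent these would be learned by training
# the multi-modal data fusion model on visual and tactile sample data.
W = rng.normal(size=(VIS_DIM + TAC_DIM, N_INSTR))
b = np.zeros(N_INSTR)

def softmax(z: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over instruction scores."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def fuse(visual_feat: np.ndarray, tactile_feat: np.ndarray) -> np.ndarray:
    """Concatenate the two modalities and score each instruction class."""
    fused = np.concatenate([visual_feat, tactile_feat])
    return softmax(fused @ W + b)

probs = fuse(rng.normal(size=VIS_DIM), rng.normal(size=TAC_DIM))
instruction = int(np.argmax(probs))  # index of the predicted instruction
```

Concatenation is only the simplest fusion choice; attention-based or gated fusion would slot into `fuse` the same way.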




Embodiment Construction

[0042] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative efforts fall within the protection scope of the present invention.

[0043] In the description of the present application, the terms "first", "second", "third" and so on are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of indicated technical features. Thus, a feature defined as "first", "second", "third", etc. may expressly or implicitly include one or more of that feature. In the description of the present application, unless otherwi...



Abstract

The invention discloses a multi-modal data processing method applied to robot interaction. The method comprises the steps of: obtaining target visual information data and target tactile information data; fusing the target visual information data and the target tactile information data on the basis of a multi-modal data fusion model to obtain fused instruction information data, the multi-modal data fusion model being trained on visual information sample data and tactile information sample data that reflect robot action instructions; and recognizing the instruction information data and outputting it to the action component of the robot associated with it. By performing multi-modal fusion of visual data and tactile data, the method improves data processing accuracy and promotes the intelligentization of the robot.
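The final step of the abstract, routing recognized instruction data to its associated action component, can be sketched as a simple dispatch table. The instruction names and action components below are invented for illustration; the patent does not enumerate them.

```python
# Hypothetical action components (placeholders for real actuator drivers).
def grip() -> str:
    return "gripper closed"

def release() -> str:
    return "gripper opened"

def halt() -> str:
    return "all motion halted"

# Hypothetical mapping from recognized instruction data to the robot
# action component associated with it.
ACTION_COMPONENTS = {
    "grip": grip,
    "release": release,
    "halt": halt,
}

def dispatch(instruction_data: str) -> str:
    """Recognize the instruction and forward it to its action component."""
    try:
        return ACTION_COMPONENTS[instruction_data]()
    except KeyError:
        raise ValueError(f"unrecognized instruction: {instruction_data!r}")

result = dispatch("grip")
```

In a real system each entry would invoke a motor or gripper controller rather than return a string; the lookup-and-invoke structure is the point.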

Description

Technical Field

[0001] The present invention relates to the technical field of robots, and in particular to a multi-modal data processing method, device, equipment and storage medium applied to robot interaction.

Background Technique

[0002] With the continuous development of artificial intelligence technology, smart home robots, relying on flexible mechanical claws and high-sensitivity sensors, can not only recognize three-dimensional objects but also perform various complex actions, and are gradually favored by consumers.

[0003] In order to realize the information interaction between "people and things" and thereby achieve intelligent control of home robots, how to process the received data is particularly important. The existing technology mainly relies on image recognition, neural networks and other related technologies: by inputting the image signal collected by the camera into a trained neural network model, the data that can reflect the needs of the target...


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16; G06K9/62; G06N3/04; G06N3/08
CPC: B25J9/1694; B25J9/1697; B25J9/1602; B25J9/1664; G06N3/08; G06N3/045; G06F18/253
Inventors: 石光明, 张凡, 李旭阳, 谢雪梅
Owner GUANGDONG ARTIFICIAL INTELLIGENCE & DIGITAL ECONOMY LAB (GUANGZHOU)