
Method and system for data processing for robot action expression learning

A data processing method and device technology, applied in the field of intelligent robots, which addresses the problem that no robot capable of freely imitating human action expressions has yet been developed, and achieves the effects of enriching communication forms and improving intelligence.

Active Publication Date: 2016-08-03
BEIJING GUANGNIAN WUXIAN SCI & TECH
10 Cites · 21 Cited by


Problems solved by technology

[0005] However, as is well known, action-based communication requires robots to understand and imitate a wide range of human actions, which demands technical support from robot hardware and is highly interdisciplinary and extremely challenging.
Therefore, in fact, no robot that can freely imitate human action expressions has been developed in the prior art.

Method used



Examples


Embodiment 1

[0063] Figure 2 is an overall flow chart of motion imitation according to an embodiment of the present invention. The method begins at step S201, in which the robot captures dynamic images in real time. For example, the target motion can be captured and recorded using mechanical, acoustic, electromagnetic, optical, inertial-navigation, or other motion capture technologies.

[0064] Image processing, pattern recognition, and other techniques are combined to judge whether a human body is present in the captured image. In one embodiment, a human body detection algorithm based on HOG features may be used to detect the human body, after which the image is normalized so that the person is roughly centered in the frame.
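The patent does not give the HOG implementation, so as a minimal illustrative sketch, the gradient-orientation-histogram features that such a detector consumes can be computed as follows (the function name, cell size, and bin count are assumptions, not from the patent):

```python
import numpy as np

def hog_features(patch, n_bins=9, cell=8):
    """Histogram-of-oriented-gradients features for one grayscale patch.

    A classifier (e.g. a linear SVM) trained on such features is what a
    HOG-based human body detector scans across the image.
    """
    patch = np.asarray(patch, dtype=float)
    # Central-difference gradients (borders left at zero for simplicity).
    gx = np.zeros_like(patch)
    gy = np.zeros_like(patch)
    gx[:, 1:-1] = patch[:, 2:] - patch[:, :-2]
    gy[1:-1, :] = patch[2:, :] - patch[:-2, :]
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientation

    h, w = patch.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            # Magnitude-weighted orientation histogram per cell.
            hist, _ = np.histogram(a, bins=n_bins, range=(0.0, 180.0), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))  # L2 normalize
    return np.concatenate(feats)
```

A 16×16 patch with an 8-pixel cell yields 2×2 cells × 9 bins = 36 features. In practice a library detector (such as OpenCV's `HOGDescriptor` with its default people detector) would be used rather than hand-rolled features.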

[0065] If no human body is present, the system continues to capture images. If a human body is present, then in step S202 the human body posture is analyzed, for example by using a human body posture detector.
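The patent leaves the posture detector unspecified; as one hedged illustration, a rule-based classifier over 2D joint keypoints (assumed to come from an upstream pose estimator; joint names and thresholds here are invented for the sketch) might look like:

```python
def classify_posture(keypoints):
    """Classify a coarse posture from 2D joints.

    keypoints: dict mapping joint name -> (x, y) in image coordinates,
    where y grows downward. Joint names and rules are illustrative only.
    """
    head_y = keypoints["head"][1]
    hip_y = keypoints["hip"][1]
    knee_y = keypoints["knee"][1]
    # A wrist above the head suggests a raised arm.
    if min(keypoints["left_wrist"][1], keypoints["right_wrist"][1]) < head_y:
        return "arm_raised"
    # Hip near knee height (relative to torso length) suggests crouching.
    if abs(hip_y - knee_y) < 0.1 * abs(knee_y - head_y):
        return "crouching"
    return "standing"
```

Real systems would use a learned pose estimator rather than fixed rules, but the interface (keypoints in, posture label out) is the same shape as the step described above.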

[...

Embodiment 2

[0079] As shown in Figure 5, yet another specific embodiment of the inventive concept is illustrated. The method starts at step S101, in which a series of actions performed by the target over a period of time is captured and recorded. This step is again performed, for example, by optical sensing components such as the robot's camera. The captured images are then preprocessed as needed; for example, the human body can be accurately extracted from a complex background to obtain a foreground image of the human body. In the present invention, depth information is obtained from stereo vision rather than monocular vision, and the three-dimensional human body posture is recovered from the image. This ensures the accuracy of the captured motion.
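The stereo step rests on standard triangulation: once a disparity map is computed from the two views, depth follows from Z = f·B/d (focal length in pixels times baseline, over disparity). A minimal sketch, assuming a rectified stereo pair and a precomputed disparity map (the function name and parameters are illustrative):

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Per-pixel depth Z = f * B / d from a rectified stereo disparity map.

    disparity : array of disparities in pixels
    focal_px  : camera focal length in pixels
    baseline_m: distance between the two cameras in meters
    """
    d = np.asarray(disparity, dtype=float)
    with np.errstate(divide="ignore"):
        z = focal_px * baseline_m / d
    z[d <= 0] = np.inf  # zero/negative disparity = no stereo match
    return z
```

For example, with a 700 px focal length and a 10 cm baseline, a 100 px disparity triangulates to 0.7 m. This per-pixel depth is what lets the 3D body posture be recovered, which monocular capture alone cannot provide without additional assumptions.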

[0080] In addition, accurate extraction of human motion key frames is also required. When the robot captures the frames of the human action sequence through i...
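The paragraph is truncated, but key-frame extraction is commonly done by keeping only frames that differ enough from the last retained frame. A hedged sketch of that heuristic (threshold and function name are assumptions, not the patent's method):

```python
import numpy as np

def extract_key_frames(frames, threshold=10.0):
    """Return indices of motion key frames from a sequence.

    A frame is kept when its mean absolute pixel difference from the
    most recently kept frame exceeds `threshold` (simple motion gate).
    """
    keys = [0]  # the first frame is always a key frame
    for i in range(1, len(frames)):
        diff = np.mean(np.abs(np.asarray(frames[i], dtype=float)
                              - np.asarray(frames[keys[-1]], dtype=float)))
        if diff > threshold:
            keys.append(i)
    return keys
```

This reduces a dense capture stream to the handful of poses the robot actually needs to store and later imitate.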

Embodiment 3

[0108] The present invention also provides a data processing device for robot action expression learning, which includes:

[0109] a motion capture module, which captures and records a series of actions performed by the target over a period of time;

[0110] an associated-information identification and recording module, which synchronously identifies and records the information sets associated with the captured series of actions, the information sets being composed of information elements;

[0111] a sorting module, which sorts the recorded actions and their associated information sets and stores them in the robot's memory bank according to their correspondence;

[0112] an action imitation module, which, when the robot receives an action output instruction, calls the information set stored in the memory bank that matches the content to be expressed and makes an action corresponding to the informat...
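The four modules above can be sketched as plain classes to show how the data flows between them; all class and method names here are illustrative, not from the patent:

```python
class MemoryBank:
    """Stores (action, information set) pairs by their correspondence."""

    def __init__(self):
        self.entries = []  # list of (action, info_set) pairs

    def store(self, action, info_set):
        self.entries.append((action, info_set))

    def match(self, content):
        # Find the stored info set that matches the content to be expressed.
        for action, info_set in self.entries:
            if content in info_set:
                return action
        return None


class Robot:
    def __init__(self):
        self.memory = MemoryBank()

    def learn(self, actions, info_sets):
        # Sorting module: store each action with its associated info set.
        for action, info_set in zip(actions, info_sets):
            self.memory.store(action, info_set)

    def express(self, content):
        # Action imitation module: on an output instruction, call the
        # matching info set and output the corresponding action.
        action = self.memory.match(content)
        return action if action is not None else "idle"
```

For example, after learning that a waving action is associated with the information elements {"hello", "goodbye"}, an instruction to express "hello" retrieves and outputs the wave.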



Abstract

The invention provides a data processing method for robot action expression learning. The method comprises the following steps: a series of actions performed by a target over a period of time is captured and recorded; information sets correlated with the series of actions are recognized and recorded synchronously, the information sets being composed of information elements; the recorded actions and their correlated information sets are sorted and stored in a memory bank of the robot according to their correspondence; when the robot receives an action output instruction, an information set stored in the memory bank that matches the content to be expressed is called to perform the action corresponding to that information set, thereby imitating a human action expression. Because action expressions are correlated with other information related to language expression, after imitation training the robot can produce diverse output; its communication forms become richer and more humane, and its degree of intelligence is greatly enhanced.

Description

Technical Field

[0001] The invention relates to the field of intelligent robots, and in particular to a data processing method and system for robot action expression learning.

Background Technique

[0002] Human-computer interaction refers to the interaction and communication between humans and machines. Its ultimate goal is to make robots empathetic, able to understand and imitate human language and behavior, so that humans can interact with robots more effectively and naturally. Because human-to-human interaction depends largely on voice and vision, the development of human-computer interaction has largely been the development of voice interaction and visual interaction.

[0003] There are many ways for humans and machines to communicate. The most ideal method combines voice and action, because it most closely resembles human-to-human communication. This communication method can enhance the user experience and increase the flexibility and effectiveness of com...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/00
CPC: G06N3/008
Inventor: 郭家
Owner: BEIJING GUANGNIAN WUXIAN SCI & TECH