
Action simulation interaction method and device for intelligent equipment and intelligent equipment

An action simulation interaction method and device for an intelligent device, and an intelligent device, using motion simulation technology, applied in user/computer interaction input/output, mechanical mode conversion, and character and pattern recognition. It addresses problems such as the lack of a motion simulation process, low information dissemination efficiency, and the limited amount of information conveyed, achieving a more vivid form of interaction, more accurate information exchange, and improved accuracy.

Active Publication Date: 2019-06-21
北京如布科技有限公司
Cites: 5 · Cited by: 0

AI Technical Summary

Problems solved by technology

Traditional interaction leads to low information dissemination efficiency, lacks vividness, provides no vivid action simulation process, and limits the amount of information that can be expressed.



Examples


Embodiment 1

[0120] The principle and steps of the action simulation interaction method for smart devices of the present invention will be described below with reference to Embodiment 1.

[0121] Step 1: Analyze and extract the selected action samples to obtain corresponding action instructions

[0122] First, action videos of various animals are obtained, either by searching the Internet or by capturing them in real time with the intelligent robot's camera. The action videos of various animals may include, for example, action videos of cats and action videos of dogs. More fine-grained action samples can also be defined for these videos, such as a cat sleeping, a cat walking, or a dog barking. These action videos serve as the action samples to be imitated.

[0123] Then, feature extraction is performed on the acquired action samples frame by frame, and recognition is performed based on the extracted features to obtain action instructions corresponding to the action samples. Then, split...
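As a rough illustration of this frame-by-frame extraction and recognition step, the Python sketch below computes a simple per-frame descriptor with OpenCV and passes it to an assumed pre-trained classifier. The library, the histogram feature, and the `classifier` object are illustrative assumptions, not the extraction method specified by the patent.

```python
# Rough, hypothetical sketch of per-frame feature extraction and recognition
# for an action sample video. OpenCV and the histogram descriptor are
# illustrative choices only; `classifier` is an assumed pre-trained model.
import cv2
import numpy as np


def extract_frame_features(video_path: str) -> list[np.ndarray]:
    """Return one coarse feature vector per frame of the action sample."""
    features = []
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    while ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # 32-bin grayscale histogram as a stand-in frame descriptor.
        hist = cv2.calcHist([gray], [0], None, [32], [0, 256]).flatten()
        features.append(hist / (hist.sum() + 1e-8))
        ok, frame = cap.read()
    cap.release()
    return features


def recognize_action_instructions(features, classifier) -> list[str]:
    # `classifier.predict` stands in for the recognition step that maps the
    # extracted features to an action instruction for each frame.
    return [classifier.predict(f) for f in features]
```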

Embodiment 2

[0187] The principle and steps of the action simulation interaction method for smart devices of the present invention will be further described with reference to Embodiment 2 below.

[0188] In this embodiment, steps 1 and 2 are the same as in embodiment 1.

[0189] In step 3, the voice command issued by the user is "Imitate a dog barking", and the voice command is converted into the target text "Imitate a dog barking". Through semantic analysis, the verb phrase ("imitate") and the object ("a dog barking") in the target text are obtained. The target event corresponding to the target text is therefore "imitating a dog barking".
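As a minimal sketch of this semantic-analysis step, the snippet below splits the target text into a verb phrase and an object and forms the target event. The verb list and the string handling are illustrative assumptions, not the patent's semantic analyzer.

```python
# Minimal rule-based sketch of the step-3 semantic analysis: split the target
# text into a verb phrase and an object, then form the target event.
# The verb list and the string handling are illustrative assumptions.

IMITATION_VERBS = ("imitate", "mimic", "copy")


def parse_target_event(target_text: str) -> str | None:
    text = target_text.strip().lower()
    for verb in IMITATION_VERBS:
        if text.startswith(verb):
            obj = text[len(verb):].strip()   # object, e.g. "a dog barking"
            return f"imitating {obj}"        # target event
    return None


print(parse_target_event("Imitate a dog barking"))  # -> imitating a dog barking
```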

[0190] In step 4, since the event matching model established in step 2 cannot identify "dogs", the probability for all animals is 0:

[0191] p(dog barking|dog)=0, p(dog barking|tiger)=0, p(dog barking|cat)=0

[0192] In this case, the smart device asks the user: "What kind of animal is a dog?", and the user replies: "Dog"...
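A hedged sketch of the step-4 behaviour described above: when the event matching model assigns zero probability to every known action sample, the device falls back to asking the user a clarifying question instead of selecting an action. The model contents and the wording of the question are illustrative assumptions.

```python
# Hedged sketch of the step-4 fallback: if every known sample gets zero
# probability for the target event, ask a clarifying question instead of
# acting. The model contents and the question wording are assumptions.

event_model = {
    # p(target event | known animal action sample)
    "imitating a dog barking": {"dog": 0.0, "tiger": 0.0, "cat": 0.0},
}


def match_or_ask(target_event: str) -> str:
    scores = event_model.get(target_event, {})
    if scores and max(scores.values()) > 0.0:
        return max(scores, key=scores.get)   # best-matching known action
    # No known action sample matches: fall back to a clarification dialogue.
    return "ASK_USER: What kind of animal is that?"


print(match_or_ask("imitating a dog barking"))  # -> the clarification prompt
```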


PUM

No PUM data available.

Abstract

An action simulation interaction method and device for an intelligent device, and an intelligent device. The method includes: analyzing and extracting selected action samples to obtain corresponding action instructions, where the action instructions include sub-action instructions and action scheduling instructions; training on the action instructions and establishing an event matching model; converting a received speech signal into target text, performing semantic analysis on the target text, and determining the target event corresponding to the target text; obtaining the corresponding target action from the target event based on the event matching model; and calling the action scheduling instruction and the sub-action instructions corresponding to the target action, so as to drive the corresponding hardware components of the intelligent device according to the action scheduling instruction and the sub-action instructions.
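Read as a data flow, the abstract describes a pipeline from speech to hardware actuation. The following Python sketch lays out that flow under assumed names; `recognizer`, `parser`, `event_matcher`, `action_library`, and `hardware` are hypothetical components standing in for modules the abstract leaves unspecified.

```python
# Hypothetical end-to-end sketch of the claimed pipeline. Every class and
# method name here is an illustrative assumption, not the patent's API.
from dataclasses import dataclass


@dataclass
class TargetAction:
    schedule_instruction: str      # action scheduling instruction
    sub_instructions: list[str]    # ordered sub-action instructions


def simulate_action(audio, recognizer, parser, event_matcher, action_library, hardware):
    # 1. Convert the received speech signal into target text.
    target_text = recognizer.transcribe(audio)

    # 2. Semantic analysis: determine the target event for the target text.
    target_event = parser.parse_event(target_text)

    # 3. Event matching model: map the target event to a target action.
    action_name = event_matcher.match(target_event)

    # 4. Call the scheduling and sub-action instructions for that action and
    #    drive the corresponding hardware components accordingly.
    action: TargetAction = action_library[action_name]
    hardware.schedule(action.schedule_instruction)
    for sub in action.sub_instructions:
        hardware.execute(sub)
```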

Description

Technical Field

[0001] The present disclosure relates to the field of smart devices, and in particular to an action simulation interaction method and device for smart devices, and to a smart device including the action simulation interaction device.

Background Technique

[0002] The interaction between traditional smart devices and users is limited to text, sound and images, and general smart devices are limited to accepting input in the form of text and producing output in the form of sound, text or images. For example, to query content such as animal encyclopedias and cartoon images on a traditional smart device, the user needs to input text for the query and obtains the output results in the form of sound, text and images.

[0003] Most newer smart devices have network modules, and the accepted input forms also extend to sound. After receiving the user's voice question, the smart device can upload the user's voice question data through the network module, and after performing voice ...


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06F3/01G06K9/00
CPCG06F3/011G06V40/20
Inventor 吴芷莹叶菲梓郭祥
Owner 北京如布科技有限公司