
Man-machine interaction method and device based on emotion system, and man-machine interaction system

An emotion-based human-computer interaction technology applied in the field of intelligent services. It solves the problem that existing systems cannot simultaneously recognize the emotional features in the voice, expressions and body movements input by the user, achieving a smoother human-computer interaction process, a higher recognition success rate and a larger amount of usable emotion data.

Publication Date: 2016-07-06 (status: Inactive)
Applicant: BEIJING GUANGNIAN WUXIAN SCI & TECH

AI Technical Summary

Problems solved by technology

[0006] One of the purposes of the present invention is to overcome the technical defect that existing man-machine question-answering systems cannot simultaneously recognize the emotional features in the voice, expressions and body movements input by the user.


Examples


Embodiment 1

[0058] This embodiment provides a human-computer interaction method, the steps of which are shown in Figure 1. The emotion recognition method of this embodiment is described in detail below with reference to Figure 1.

[0059] Firstly, in step S101, speech emotion parameters, expression emotion parameters and body emotion parameters are collected.

[0060] Subsequently, step S102 is performed to calculate the pending speech emotion from the speech emotion parameters and to select, from the preset speech emotions, the one closest to the pending speech emotion as the speech emotion component; step S103 is performed to calculate the pending expression emotion from the expression emotion parameters and to select, from the preset expression emotions, the one closest to the pending expression emotion as the expression emotion component; step S104 is performed to calculate the pending body e...
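The following sketch illustrates one way steps S101 to S104 could work, assuming each modality's emotion is described by a small numeric feature vector and that "closest" means smallest Euclidean distance. The preset emotion labels, feature values and distance metric are illustrative assumptions, not details taken from the patent.

```python
import math

# Hypothetical preset emotions, each described by a 2-dimensional feature
# vector (e.g. valence/arousal); the labels and numbers are placeholders.
PRESET_EMOTIONS = {
    "happy":   [0.9, 0.7],
    "neutral": [0.5, 0.5],
    "sad":     [0.2, 0.3],
}

def nearest_preset(pending, presets=PRESET_EMOTIONS):
    """Return the preset emotion label closest to the pending emotion vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(presets, key=lambda label: dist(pending, presets[label]))

# S101: collect raw parameters per modality (placeholder values; in the
# method these would come from audio, facial and body sensing).
speech_params     = [0.85, 0.65]
expression_params = [0.30, 0.40]
body_params       = [0.55, 0.50]

# S102-S104: treat each parameter vector as the pending emotion and snap it
# to the closest preset emotion, giving the per-modality emotion components.
speech_component     = nearest_preset(speech_params)
expression_component = nearest_preset(expression_params)
body_component       = nearest_preset(body_params)
print(speech_component, expression_component, body_component)  # happy sad neutral
```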

Embodiment 2

[0109] This embodiment provides a human-computer interaction device 600, the structure of which is shown in Figure 6. The device includes a parameter acquisition unit 610, a speech emotion recognition unit 620, an expression emotion recognition unit 630, a body emotion recognition unit 640, a fusion unit 650 and a feedback unit 660.

[0110] The parameter acquisition unit 610 is used to collect speech emotion parameters, expression emotion parameters and body emotion parameters.

[0111] The speech emotion recognition unit 620 is configured to calculate and obtain the pending speech emotion according to the speech emotion parameter, and select the one closest to the pending speech emotion from the preset speech emotions as the speech emotion component.

[0112] The expression emotion recognition unit 630 is used to calculate the pending expression emotion according to the expression emotion parameters, and to select the one closest to the ...
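The sketch below shows one way the six units of device 600 could be composed, assuming each unit is a plain callable; the class name, method names and constructor-injection style are illustrative assumptions rather than details given in this embodiment.

```python
# Illustrative composition of device 600; each unit is injected as a callable.
class HumanComputerInteractionDevice:
    def __init__(self, collect, speech_rec, expr_rec, body_rec, fuse, feedback):
        self.collect = collect        # parameter acquisition unit 610
        self.speech_rec = speech_rec  # speech emotion recognition unit 620
        self.expr_rec = expr_rec      # expression emotion recognition unit 630
        self.body_rec = body_rec      # body emotion recognition unit 640
        self.fuse = fuse              # fusion unit 650
        self.feedback = feedback      # feedback unit 660

    def interact(self):
        # 610: collect the three groups of emotion parameters.
        speech_p, expr_p, body_p = self.collect()
        # 620-640: obtain the three per-modality emotion components.
        components = (self.speech_rec(speech_p),
                      self.expr_rec(expr_p),
                      self.body_rec(body_p))
        # 650: fuse the components into an emotion recognition result.
        result = self.fuse(components)
        # 660: produce multimodal feedback information for that result.
        return self.feedback(result)
```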

Embodiment 3

[0117] This embodiment provides a human-computer interaction system. As shown in Figure 7, the system includes a speech sensing device 710, a visual sensing device 720, a human-computer interaction device 600, an output driving device 730, an expression unit 740, a speech unit 750 and an action unit 760.

[0118] The voice sensing device 710 is an audio sensor, such as a microphone, used to collect voice information and input it to the human-computer interaction device 600. The visual sensing device 720 is, for example, a camera, used to collect expression information and body information and input them to the human-computer interaction device 600.

[0119] The structure of the human-computer interaction device 600 is as described in the second embodiment, and will not be repeated here. The output driving device 730 drives the expression unit 740, the speech unit 750 and/or the action unit 760 to perform actions according to the multimodal feedb...
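A minimal sketch of the Figure 7 data flow, under the assumption that each device exposes a single method; all class and method names here (capture_audio, capture_frames, process, render) are hypothetical and only illustrate how the sensing devices, device 600 and the output driving device could be connected.

```python
# Illustrative wiring of the Figure 7 system; method names are hypothetical.
class InteractionSystem:
    def __init__(self, mic, camera, hci_device, output_drive):
        self.mic = mic                    # voice sensing device 710 (microphone)
        self.camera = camera              # visual sensing device 720 (camera)
        self.hci_device = hci_device      # human-computer interaction device 600
        self.output_drive = output_drive  # output driving device 730

    def run_once(self):
        audio = self.mic.capture_audio()                 # voice information
        expression, body = self.camera.capture_frames()  # expression/body information
        feedback = self.hci_device.process(audio, expression, body)
        # The output driving device decides which of the expression unit 740,
        # speech unit 750 and/or action unit 760 act on the multimodal feedback.
        self.output_drive.render(feedback)
```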



Abstract

The invention discloses a man-machine interaction method and device based on an emotion system, and a man-machine interaction system. The method comprises following steps of collecting voice emotion parameters, expression emotion parameters and body emotion parameters; calculating to obtain a to-be-determined voice emotion according to the voice emotion parameters; selecting a voice emotion most proximate to the to-be-determined voice emotion from preset voice emotions as a voice emotion component; calculating to obtain a to-be-determined expression emotion according to the expression emotion parameters; selecting an expression emotion most proximate to the to-be-determined expression emotion from preset expression emotions as an expression emotion component; calculating to obtain a to-be-determined body emotion according to the body emotion parameters; selecting a body emotion most proximate to the to-be-determined body emotion from preset body emotions as a body emotion component; fusing the voice emotion component, the expression emotion component and the body emotion component, thus determining an emotion identification result; and outputting multi-mode feedback information specific to the emotion identification result. According to the method, the device and the system, the man-machine interaction process is more smooth and natural.
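The abstract does not state how the three emotion components are combined into a single recognition result; the sketch below assumes a simple weighted vote among the voice, expression and body components, purely to make the fusion step concrete. The weights and the voting rule are assumptions.

```python
from collections import Counter

def fuse_components(voice, expression, body, weights=(1.0, 1.0, 1.0)):
    """Return the emotion label with the largest weighted vote among the
    voice, expression and body emotion components (assumed fusion rule)."""
    votes = Counter()
    for label, weight in zip((voice, expression, body), weights):
        votes[label] += weight
    return votes.most_common(1)[0][0]

print(fuse_components("happy", "happy", "neutral"))  # -> happy
```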

Description

Technical Field

[0001] The present invention relates to the technical field of intelligent services, and in particular to a human-computer interaction method, device and interaction system based on an emotion system.

Background Technique

[0002] The intelligent question-answering robot lies at the intersection of artificial intelligence and natural language processing. It can communicate with users through natural language and display its emotions through expressions and actions. Emotion is a person's experience of, and attitude towards, whether objective things meet his or her needs, and it is important information transmitted in man-machine dialogue. In the process of human-computer interaction, emotion recognition is an important part of the interactive system; the emotional state affects both the way information is expressed and the effect of its transmission.

[0003] The modalities of emotional expression include speech, text, facial expressions and body...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01
CPC: G06F3/011
Inventor: 刘佳亮
Owner: BEIJING GUANGNIAN WUXIAN SCI & TECH