Multi-modal emotion interaction method, intelligent equipment, system, electronic equipment and medium

A multi-modal interaction method applied in the field of human-computer emotional interaction, which addresses the problems of inconvenient user upgrade and maintenance, limited nursing functions, and the inability of existing systems to correctly understand the user's emotional state.

Active Publication Date: 2021-02-19
NINGBO UNIV

AI Technical Summary

Problems solved by technology

Although these care robots can provide some care for the daily life of the elderly, they are expensive, the nursing functions they provide are very limited, and they are not convenient for users (such as the elderly) to upgrade and maintain.
In addition, these care robots often lack multi-modal emotional interaction capabilities, making it difficult to establish natural emotional communication with users, and they cannot correctly understand the user's real emotional state.



Examples


Embodiment Construction

[0077] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0078] As shown in Figure 1, this embodiment provides a multi-modal emotional interaction method to realize emotional interaction with users. Specifically, the multi-modal emotional interaction method of this embodiment includes the following steps 1-4:

[0079] Step 1: determine the multi-modal input parameter set that characterizes the user's emotion. The multi-modal input parameter set includes at least facial expression information, voice dialogue text information and body movement information that characterize the user's emotion. In this embodiment, the facial expression information characterizing the user's emotion includes the six facial expressions of joy, fear, surprise, sadness, disgust and anger; the voice dialogue text information representing the user's emotion can be speech containing emotional vocabulary or speech con...

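The paragraph above lists the channels that make up the multi-modal input parameter set in step 1. Below is a minimal sketch of one way such a parameter set could be represented; all class and field names are illustrative assumptions and do not come from the patent.

# Minimal sketch (not from the patent text) of a multi-modal input parameter set.
# Class and field names are illustrative assumptions.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class FacialExpression(Enum):
    """The six facial expression categories named in the embodiment."""
    JOY = "joy"
    FEAR = "fear"
    SURPRISE = "surprise"
    SADNESS = "sadness"
    DISGUST = "disgust"
    ANGER = "anger"


@dataclass
class MultiModalInput:
    """One observation of the user's emotional cues across channels."""
    facial_expression: Optional[FacialExpression] = None     # vision channel
    dialogue_text: str = ""                                   # speech transcribed to text
    body_movements: List[str] = field(default_factory=list)   # e.g. detected gestures


# Example: a user smiling while saying something positive and waving.
sample = MultiModalInput(
    facial_expression=FacialExpression.JOY,
    dialogue_text="I'm so happy you came to see me today!",
    body_movements=["wave"],
)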


Abstract

The invention relates to a multi-modal emotion interaction method, intelligent equipment, a system, electronic equipment and a medium. The multi-modal emotion interaction method comprises the following steps: determining a multi-modal input parameter set representing the emotion of a user, determining the current emotion state of the user according to the multi-modal input parameter set, determining the behavior to be fed back to the user according to the current emotion state of the user and a preset emotion response library, and feeding back the determined behavior to the user through a hardware device or a virtual agent. In this method, multiple modal parameters representing the user's emotion are collected through multiple channels, and the user's current emotion is then sensed comprehensively and accurately by fusing the collected modal parameters; this avoids defects such as the misjudgment that arises when a traditional emotion interaction method identifies the user's emotion from only a single channel such as vision.
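The abstract above summarizes a four-stage pipeline: collect multi-modal parameters, fuse them into a current emotion state, select a behavior from a preset emotion response library, and feed the behavior back through a hardware device or virtual agent. The following is a minimal sketch of that flow; the function names, the majority-vote fusion rule, and the example library are illustrative assumptions, not the patent's implementation.

# Toy sketch of the pipeline described in the abstract. All names and the
# fusion rule below are assumptions for illustration only.
from collections import Counter
from typing import Dict, List


def fuse_emotion_state(modal_cues: List[str]) -> str:
    """Toy fusion: majority vote over per-channel emotion labels."""
    if not modal_cues:
        return "neutral"
    return Counter(modal_cues).most_common(1)[0][0]


def choose_behavior(emotion_state: str, response_library: Dict[str, str]) -> str:
    """Look up the feedback behavior for the fused emotion state."""
    return response_library.get(emotion_state, response_library["neutral"])


def feed_back(behavior: str) -> None:
    """Stand-in for driving a hardware device or virtual agent."""
    print(f"Agent performs: {behavior}")


# Hypothetical preset emotion response library.
library = {
    "neutral": "nod and keep listening",
    "joy": "smile and respond cheerfully",
    "sadness": "speak softly and offer comfort",
}

# Per-channel labels, e.g. from face, speech text and body-movement analysis.
cues = ["sadness", "sadness", "neutral"]
feed_back(choose_behavior(fuse_emotion_state(cues), library))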

Description

Technical field

[0001] The invention relates to the field of human-computer emotional interaction, and in particular to a multi-modal emotional interaction method, intelligent equipment, system, electronic equipment and media.

Background technique

[0002] At present, the global trend of population aging is very pronounced. As population aging deepens, there are more and more "empty nest elderly" families. These elderly people not only face the situation of having no one to take care of them in their daily life, but also face the dilemma of having no one to talk to about their emotions, which seriously affects their physical and mental health.

[0003] In order to solve the problem of the elderly having no one to take care of them, care robots for the elderly have appeared on the market. Although these nursing robots can provide some care for the daily life of the elderly, they are relatively expensive and the nursing functions they provide are very limited, and it is ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01, G06F16/332, G06K9/00, G06K9/62
CPC: G06F3/011, G06F3/017, G06F16/3329, G06V40/174, G06V40/20, G06F18/24, G06F18/214
Inventors: 刘箴, 刘婷婷, 柴艳杰
Owner: NINGBO UNIV