
Human-machine interactive voice control method and device based on user emotional state, and vehicle

A technology of voice control and human-computer interaction, applied in speech analysis, speech recognition, instruments, and the like, which can solve the problems that a driver's emotions affect driving safety and that vehicle automatic driving has not yet become widespread, and achieves the effect of improving driving safety

Active Publication Date: 2017-06-06
ZHICHEAUTO TECH BEIJING

AI Technical Summary

Problems solved by technology

[0002] With the rapid development of society, cars have become increasingly common in daily life. Although the concept of vehicle automatic driving was proposed long ago, it has not yet become widespread; at present, the driver's control remains decisive during vehicle operation.
However, a driver may be affected by various emotions while driving, and some of these emotions can seriously compromise driving safety.




Detailed Description of the Embodiments

[0036] The present invention will be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the invention are illustrated.

[0037] Figure 1 shows a flow chart of the human-computer interaction voice control method based on the user's emotional state according to an embodiment of the present invention. As shown in Figure 1, the method includes:

[0038] Step 101: monitor the set user's expression, voice, or actions.

[0039] In one embodiment, a combination of sensors may be used to monitor or detect the set user's expression, voice, or actions.

[0040] For example, the vehicle's built-in fatigue-driving camera can monitor the set user's expressions and actions, and the vehicle's built-in microphone can detect the set user's voice.

[0041] Step 102: determine the set user's current emotional state according to the monitored expression, voice, or actions.

[0042] In one embod...
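The monitoring and emotion-determination steps above (Steps 101 and 102) can be sketched as follows. The emotion labels, sensor fields, and decision thresholds are illustrative assumptions for the sake of a runnable example; the patent does not specify them:

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical emotion labels; the patent does not enumerate concrete states.
class Emotion(Enum):
    CALM = auto()
    ANGRY = auto()
    FATIGUED = auto()

@dataclass
class SensorReadings:
    """Step 101 inputs from the in-vehicle camera and microphone."""
    expression: str     # e.g. "frown", from the fatigue-driving camera
    speech_rate: float  # words per second, from the microphone
    pitch: float        # average voice pitch in Hz

def determine_emotion(r: SensorReadings) -> Emotion:
    """Step 102: map the monitored signals to a current emotional state.
    The thresholds below are placeholders, not values from the patent."""
    if r.expression == "frown" and r.speech_rate > 3.0 and r.pitch > 220.0:
        return Emotion.ANGRY
    if r.speech_rate < 1.0:
        return Emotion.FATIGUED
    return Emotion.CALM
```

In practice the classification would be a trained model over camera and audio features rather than fixed thresholds; the sketch only shows the data flow from sensor readings to an emotional state.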


Abstract

The invention discloses a human-machine interactive voice control method and device based on a user's emotional state, and a vehicle. The method comprises: monitoring the expression, voice, or actions of a set user; determining the set user's current emotional state from that expression, voice, or actions; determining the vehicle's voice control mode according to the current emotional state; and performing vehicle human-machine interaction in the determined voice control mode. The method, device, and vehicle can infer the user's current emotion from the user's driving behavior, speech rate and tone, and facial expression. An intelligent system can then play appropriate music or adjust the navigation voice according to the user's current emotional state, interacting with the user so as to regulate the user's emotion and achieve safer driving.
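As a rough illustration of the mode-selection step described in the abstract, the sketch below maps an emotional state to music and navigation-voice adjustments. The mode names and the specific adjustments are hypothetical, chosen only to show the shape of the mapping:

```python
def select_voice_control_mode(emotion: str) -> dict:
    """Choose the vehicle's voice control mode from the current
    emotional state. Mode contents are illustrative assumptions,
    not taken from the patent."""
    modes = {
        "angry":    {"music": "soothing", "nav_voice": "soft, slower"},
        "fatigued": {"music": "upbeat",   "nav_voice": "louder, more frequent prompts"},
        "calm":     {"music": "user default", "nav_voice": "default"},
    }
    # Fall back to the default mode for unrecognized states.
    return modes.get(emotion, modes["calm"])
```

The returned mode would then drive the human-machine interaction: the media system picks the music style and the navigation system applies the voice adjustment.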

Description

Technical field

[0001] The present invention relates to the field of artificial intelligence, specifically to vehicle intelligent control and human-computer interaction, and in particular to a human-computer interaction voice control method, device, and vehicle based on the user's emotional state.

Background technique

[0002] With the rapid development of society, cars have become increasingly common in daily life. Although the concept of vehicle automatic driving was proposed long ago, it has not yet become widespread; at present, the driver's control remains decisive during vehicle operation. However, a driver may be affected by various emotions while driving, and some of these emotions can seriously compromise driving safety. [0003] Therefore, it is necessary to provide a method or a vehicle capable of analyzing the driver's emotion.

Contents of the invention

[0004] A technical problem to be solved by the present invention is how t...

Claims


Application Information

IPC(8): G10L15/22, G10L15/08, G10L25/63
CPC: G10L15/08, G10L15/22, G10L25/63, G10L2015/088, G10L2015/227
Inventor 沈海寅
Owner ZHICHEAUTO TECH BEIJING