
A method for generating facial expression based on human facial expression by partitioning facial expression into facial expression elements

A facial expression and movement technology, applied in the field of artificial intelligence interaction, which can solve problems such as the restricted interaction performance of humanoid robots.

Inactive Publication Date: 2018-12-28
DALIAN DOREAL SOFTWARE

AI Technical Summary

Problems solved by technology

[0005] The existing technical methods for realizing the facial expressions of humanoid robots suffer from many problems in the fidelity, flexibility, and diversity of the humanoid facial expressions, which restrict improvement of the interactive performance between humanoid robots and humans.


Examples


Embodiment 1

[0054] Taking smiling while saying "Hello" as an example, the specific steps of the method for generating human facial expression actions based on partitioning into expression elements are as follows:

[0055] Step 1: Collect the movement data of each part of the facial expressions of multiple subjects, and establish a facial comprehensive expression action element database through sorting, feature analysis, evaluation, and element extraction. The database covers the muscle groups of the eyes, face, and mouth that drive human facial expression actions; 34 marker points are set at the positions shown in figure 1 to collect the facial comprehensive expression and movement data.

[0056] Specifically, a high-precision 3D facial motion capture system is used to collect motion data of the various parts of the facial expressions of multiple subjects. The specific collection method is as follows: first collect the 3D coordinates A of the 34 marker points on the face of a single subject in ...
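As an illustration of Step 1, the sketch below shows how per-frame 3D coordinates of the 34 marker points might be normalized against a neutral pose and stored as per-region expression action elements. The region grouping, the neutral-pose normalization, and all names are assumptions made for this example, not the patent's exact procedure.

```python
# Minimal sketch of Step 1, assuming the capture system exports per-frame
# 3D coordinates of the 34 facial marker points as numeric arrays.
# Marker layout, region grouping, and neutral-frame normalization are
# illustrative assumptions only.
import numpy as np

NUM_MARKERS = 34
# Assumed partition of marker indices into the three driven regions
# (eyes, face/cheeks, mouth) mentioned in the text.
REGIONS = {
    "eyes":  list(range(0, 12)),
    "face":  list(range(12, 24)),
    "mouth": list(range(24, 34)),
}

def extract_element(frames: np.ndarray, neutral: np.ndarray) -> dict:
    """Turn one capture clip (T x 34 x 3) into per-region displacement
    trajectories relative to a neutral face, one array per region."""
    assert frames.shape[1:] == (NUM_MARKERS, 3)
    displacement = frames - neutral          # motion relative to neutral pose
    return {name: displacement[:, idx, :] for name, idx in REGIONS.items()}

# Toy data standing in for the 3D coordinates A of the 34 marker points
# captured from one subject over 60 frames.
neutral_pose = np.zeros((NUM_MARKERS, 3))
clip = np.random.default_rng(0).normal(scale=0.01, size=(60, NUM_MARKERS, 3))

element_database = {}                        # the "expression action element database"
element_database[("smile", "hello")] = extract_element(clip, neutral_pose)
print({k: {r: v.shape for r, v in e.items()} for k, e in element_database.items()})
```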

Embodiment 2

[0078] Taking disgust with rolling eyes while saying "Ah" as an example, the specific steps of the method for generating human facial expression actions based on partitioning facial expressions into expression elements are as follows:

[0079] Step 1: Collect the movement data of each part of the facial expressions of multiple subjects, and establish a facial comprehensive expression action element database through sorting, feature analysis, evaluation, and element extraction. The database covers the muscle groups of the eyes, face, and mouth that drive human facial expression actions; 34 marker points are set at the positions shown in figure 1 to collect the facial comprehensive expression and movement data.

[0080] Specifically, a high-precision 3D facial motion capture system is used to collect motion data of the various parts of the facial expressions of multiple subjects. The specific collection method is as follows: first collect the 3D coordinates A of the 34 marker points on the face of a single subject in ...
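To illustrate how partitioned elements could serve a compound expression such as disgust with an eye roll while saying "Ah", the sketch below recombines per-region elements from a stand-in database into one final parameter sequence. The region-wise assembly and the element names are illustrative assumptions, not the patented scheme.

```python
# Illustrative sketch only: recombining per-region expression elements
# (collected as in Step 1) for a compound expression. Element names and
# the region-wise composition rule are assumptions for this example.
import numpy as np

NUM_MARKERS, FRAMES = 34, 60
REGIONS = {"eyes": slice(0, 12), "face": slice(12, 24), "mouth": slice(24, 34)}

# Stand-in element database: each entry is a (frames x 34 x 3) displacement
# trajectory captured for one elementary expression action.
rng = np.random.default_rng(1)
database = {
    "disgust_face": rng.normal(scale=0.01, size=(FRAMES, NUM_MARKERS, 3)),
    "eye_roll":     rng.normal(scale=0.01, size=(FRAMES, NUM_MARKERS, 3)),
    "viseme_ah":    rng.normal(scale=0.01, size=(FRAMES, NUM_MARKERS, 3)),
}

def compose(selection: dict) -> np.ndarray:
    """Build a final expression parameter sequence by taking each facial
    region from the element chosen for that region."""
    R = np.zeros((FRAMES, NUM_MARKERS, 3))
    for region, element_name in selection.items():
        R[:, REGIONS[region], :] = database[element_name][:, REGIONS[region], :]
    return R

# Disgust drives the face region, the eye roll drives the eyes,
# and the "Ah" mouth shape drives the mouth.
R = compose({"eyes": "eye_roll", "face": "disgust_face", "mouth": "viseme_ah"})
print(R.shape)  # (60, 34, 3): per-frame targets for driving the model
```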



Abstract

The invention relates to the field of artificial intelligence interaction and the technical field of automatic generation of human-like expressions, in particular to a method of generating expression actions based on partitioning human facial expressions into expression elements. In the method, the motion data of each part of the facial expression is collected, and a facial comprehensive expression action element database is established through sorting, feature analysis, evaluation, and element extraction. According to externally input inner emotion parameters of the expression, the facial comprehensive expression action parameters corresponding to those inner emotion parameters are selected from the facial comprehensive expression action element database; the facial comprehensive expression action parameter L is taken as the final facial expression parameter R; and the final facial expression parameter R is used to control the motion of the target model and generate the expression action. The invention fundamentally addresses the limitations in fidelity, flexibility, and richness of facial expressions in interaction between artificial intelligence and humans, and realizes natural interaction between artificial intelligence and humans.
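As a rough illustration of this pipeline, the sketch below matches externally input inner emotion parameters to the closest stored entry, takes its action parameter L as the final parameter R, and hands R to a placeholder model driver. The vector representation of emotion parameters, the nearest-neighbour matching, and the function names are assumptions for the example only.

```python
# Minimal sketch of the pipeline described in the abstract, assuming an
# "inner emotion parameter" can be represented as a small numeric vector
# and matched to database entries by nearest neighbour. Names such as
# emotion_vector and drive_target_model are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
# Database: each element pairs an emotion parameter vector with a
# comprehensive expression action parameter L (here a flat vector).
database = [
    {"emotion": np.array([0.9, 0.1, 0.0]), "L": rng.normal(size=102)},  # e.g. joy
    {"emotion": np.array([0.0, 0.8, 0.2]), "L": rng.normal(size=102)},  # e.g. disgust
]

def drive_target_model(R: np.ndarray) -> None:
    # Placeholder for sending the final facial expression parameter R
    # to the robot head or virtual face model.
    print("driving model with", R.shape[0], "parameters")

def generate_expression(emotion_vector: np.ndarray) -> None:
    # Select the database entry whose stored emotion parameters are
    # closest to the externally input inner emotion parameters.
    best = min(database, key=lambda e: np.linalg.norm(e["emotion"] - emotion_vector))
    R = best["L"]            # the abstract takes L directly as R
    drive_target_model(R)

generate_expression(np.array([1.0, 0.0, 0.0]))
```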

Description

Technical field
[0001] The present invention relates to the field of artificial intelligence interaction and the technical field of automatic generation of human-like expressions, in particular to a method for generating facial expression actions based on partitioning human facial expressions into expression elements.
Background technique
[0002] Human beings have a great variety of appearances, and their facial expressions are likewise extremely rich. Humans convey information about emotional changes through various specific facial expressions, and no matter which country or region people are from, their inner emotions and the outward expression of those emotions are basically the same. In the field of artificial intelligence interaction, the automatic generation of human-like facial expressions plays an important role and has clear application requirements, for example: the realization of humanoid facial expressions when humanoid robots interact with people, the humanoid expressions in virtual reality applications, and the ...


Application Information

IPC(8): G06K9/00, G06F3/01
CPC: G06F3/011, G06V40/165, G06V40/176, G06V40/171
Inventor: 王春成, 刘鑫, 张冰, 郑媛媛, 王恒, 郑时雨
Owner: DALIAN DOREAL SOFTWARE