
Expression input method and device based on face identification

An expression input technology based on face recognition, applied in the field of input methods. It addresses the problems of monotonous default emoticons, overly numerous emoticon options, and overly complex selection interfaces, thereby easing the constraints on the development and widespread use of chat emoticons, improving input efficiency, and reducing time cost.

Active Publication Date: 2014-09-24
BEIJING SOGOU TECHNOLOGY DEVELOPMENT CO LTD

AI Technical Summary

Problems solved by technology

[0007] 1. Considering the operating cost for users, emoticon package producers tend to streamline emoticon content, which to some extent restricts the development and widespread use of chat emoticons.
[0008] 2. Most chat tools provide only default emoticons. These default emoticons are relatively monotonous, and richer, more diversified themed chat emoticon resources can effectively make chatting with friends more enjoyable. However, to use such emoticons, the user must go through many online steps: obtaining emoticon package information from various channels, downloading the package locally, and sometimes loading it manually before it can be used. For users who are unfamiliar with these operations or lack patience, the time cost of successfully obtaining and installing a suitable emoticon package from network resources may cause them to give up.
[0009] 3. When the user switches the input scene, for example to a different chat platform, the downloaded emoticons must be downloaded or updated again, and the user's collection of frequently used emoticons also faces a migration problem.
[0010] 4. When the user selects an emoticon manually, the selection interface may be too complex and the number of options too large for the user to accurately pick the emoticon that matches his or her current mood. Without special curation, many multimedia resources, such as photos of exaggerated expressions of celebrities and public figures or animated GIFs, cannot be offered as candidate expressions in a timely manner, which reduces the user's input efficiency.



Examples


Embodiment 1

[0051] Refer to Figure 1, which shows a schematic flowchart of an expression input method based on face recognition according to the present invention.

[0052] In this embodiment of the present invention, the correspondence between emotion tags and expressions under each theme, as well as the facial expression recognition model, are constructed in advance.

[0053] The following describes the process of constructing the correspondence between emotion tags and expressions under each theme:

[0054] Step S100, construct the correspondence between emotion tags and expressions under each theme according to the collected chat resource data and the expression resource data of each theme.

[0055] In the present invention, the correspondence between emotion tags and expressions under each theme can be obtained by collecting chat resource data and the expression resource data of each theme, and using the chat resource data to analyze the expression r...
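Because the construction procedure is only summarized (and truncated) here, the following Python sketch shows one plausible way such a correspondence could be derived: counting how often each expression co-occurs with emotion keywords in chat records and grouping the result by theme. The keyword lexicon, the data shapes, and the function name build_tag_expression_map are illustrative assumptions, not the algorithm prescribed by the patent.

```python
from collections import defaultdict

# Hypothetical emotion keyword lexicon; the patent does not specify the tag set.
EMOTION_KEYWORDS = {
    "happy": ["haha", "lol", "great"],
    "sad": ["sigh", "unhappy", "cry"],
    "angry": ["annoying", "furious"],
}

def build_tag_expression_map(chat_records, theme_expressions):
    """Build a {theme: {emotion_tag: [expression_id, ...]}} correspondence.

    chat_records: iterable of (message_text, expression_id) pairs mined from
        chat resource data, i.e. the emoticon a user actually sent with a message.
    theme_expressions: {theme: set_of_expression_ids} taken from the expression
        resource data of each theme.
    """
    # Count how often each expression co-occurs with each emotion keyword group.
    cooccurrence = defaultdict(lambda: defaultdict(int))
    for text, expr_id in chat_records:
        lowered = text.lower()
        for tag, keywords in EMOTION_KEYWORDS.items():
            if any(k in lowered for k in keywords):
                cooccurrence[expr_id][tag] += 1

    # Assign each expression to its most frequent emotion tag, grouped by theme.
    mapping = {theme: defaultdict(list) for theme in theme_expressions}
    for theme, expr_ids in theme_expressions.items():
        for expr_id in expr_ids:
            counts = cooccurrence.get(expr_id)
            if counts:
                best_tag = max(counts, key=counts.get)
                mapping[theme][best_tag].append(expr_id)
    return mapping
```

In this reading, the chat resource data supplies the evidence of which emoticons people actually send in which emotional context, while the per-theme expression resource data only determines which theme each expression belongs to.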

Embodiment 2

[0177] Refer to Figure 5, which shows a schematic flowchart of an expression input method based on face recognition according to the present invention. The method includes:

[0178] Step 510, start the input method;

[0179] Step 520, judge whether the current input environment of the client's input method requires emoticon input; if emoticon input is required, proceed to step 530; if not, continue in the traditional input mode.

[0180] That is, the input method recognizes the environment in which the user is typing. If it is an environment with a high likelihood of emoticon input, such as a chat window or a web page input field, step 530 is executed. If not, the input method directly receives the user's input sequence, performs word conversion to generate candidates, and displays them to the user.

[0181] Step 530, obtain the photo taken by the user;

[0182] When the user triggers the camera function during the input process, the embodiment of the present invention acquires the photo taken by the user.

[0183] ...
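As a rough illustration of this client-side flow (the environment check in step 520, photo acquisition in step 530, then candidate lookup), here is a minimal Python sketch. The InputContext type, the environment labels, and the recognize_emotion placeholder are assumptions made for illustration; the patent does not define these interfaces.

```python
from dataclasses import dataclass
from typing import Optional

# Assumed environment labels; the patent only names chat windows and web page
# input fields as examples of environments likely to need emoticon input.
EMOTICON_FRIENDLY = {"chat", "webpage_input"}

@dataclass
class InputContext:
    environment: str        # e.g. "chat", "webpage_input", "password_field"
    photo: Optional[bytes]  # photo captured when the user triggers the camera

def recognize_emotion(photo: bytes) -> str:
    """Placeholder for the facial expression recognition model."""
    return "happy"  # a real model would classify the facial expression in the photo

def emoticon_input_flow(ctx: InputContext, tag_expression_map: dict) -> list:
    """One possible client-side flow covering steps 520-530 plus candidate lookup."""
    # Step 520: only enter the emoticon flow in environments likely to need it.
    if ctx.environment not in EMOTICON_FRIENDLY or ctx.photo is None:
        return []  # the caller falls back to the traditional input mode

    # Step 530 onward: classify the photo, then gather the expressions of every
    # theme that are mapped to the recognized emotion tag.
    tag = recognize_emotion(ctx.photo)
    candidates = []
    for theme, tag_map in tag_expression_map.items():
        candidates.extend(tag_map.get(tag, []))
    return candidates  # to be sorted and displayed as candidate items in the client
```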

Embodiment 3

[0190] Refer to Figure 6, which shows a schematic flowchart of an expression input method based on face recognition according to the present invention. The method includes:

[0191] Step 610, the mobile client starts the input method;

[0192] Step 620, the mobile client judges whether the current input environment of its input method requires emoticon input; if emoticon input is required, proceed to step 630; if not, continue in the traditional input mode.

[0193] Step 630, acquire the user's photo taken by the front camera of the mobile client, and transmit the photo to the cloud server.

[0194] Step 640, the cloud server uses the facial expression recognition model to determine the emotion label corresponding to the facial expression in the photo;

[0195] Step 650, based on the correspondence between emotion tags and expressions under each theme, the cloud server acquires the expressions of each theme corresponding to the emotion label;

[0196] According t...
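The division of work between the mobile client and the cloud server in steps 630 to 650 might be organized as in the sketch below. The CloudServer and MobileClient classes and their method names are hypothetical; the patent does not specify the transport protocol or API between the client and the server.

```python
# Sketch of the client/cloud split in Embodiment 3 (steps 630-650).

class CloudServer:
    def __init__(self, tag_expression_map, recognition_model):
        self.tag_expression_map = tag_expression_map  # {theme: {tag: [expr, ...]}}
        self.recognition_model = recognition_model    # callable: photo bytes -> emotion tag

    def handle_photo(self, photo_bytes):
        # Step 640: run the facial expression recognition model on the uploaded photo.
        emotion_tag = self.recognition_model(photo_bytes)
        # Step 650: collect the expressions of each theme mapped to that emotion tag.
        results = {}
        for theme, tag_map in self.tag_expression_map.items():
            results[theme] = tag_map.get(emotion_tag, [])
        return emotion_tag, results

class MobileClient:
    def __init__(self, server):
        self.server = server

    def submit_front_camera_photo(self, photo_bytes):
        # Step 630: transmit the photo taken by the front camera to the cloud server,
        # then receive the per-theme candidate expressions for display.
        return self.server.handle_photo(photo_bytes)
```

Keeping the recognition model and the tag-to-expression mapping on the server side would spare the mobile client from storing emoticon packages locally, which matches the migration problem described in paragraph [0009].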



Abstract

The invention discloses an expression input method and device based on face recognition, and relates to the technical field of input methods. The method comprises the steps of: starting an input method; acquiring a photo taken by the user; determining, with a facial expression recognition model, the emotion label corresponding to the facial expression in the photo; acquiring, based on the correspondence between emotion labels and the expressions under each theme, the expressions of each theme corresponding to the emotion label; and sorting the expressions of each theme and displaying them as candidate items in the client. With this method and device, labels can be identified and matched directly from the photo the user has just taken, so the user can input expressions conveniently, the expression accuracy is high, and rich and wide-ranging expression resources are provided to the user.

Description

Technical Field

[0001] The invention relates to the technical field of input methods, and in particular to an expression input method and device based on face recognition.

Background Technique

[0002] An input method is an encoding method used to input various symbols into a computer or other device (such as a mobile phone). Common input methods include the Sogou input method, the Microsoft input method, and so on.

[0003] Traditional emoticon input generally falls into several situations. In one, the platform itself has an emoticon input module, such as the module embedded in chat tools like QQ, which comes with default emoticons; third-party emoticon packages can also be installed, and users can use custom image resources as emoticons. To input an emoticon, the user clicks the emoticon input button and selects the emoticon to enter. However, this approach is completely separate from the input method: the user needs to click the emoti...

Claims


Application Information

IPC(8): G06K9/00, G06F3/023, G06F17/30
Inventor: 顾思宇, 刘华生, 张阔
Owner: BEIJING SOGOU TECHNOLOGY DEVELOPMENT CO LTD