
Visual emotion label distribution prediction method based on automatic estimation

A technology for visual emotion distribution prediction, applied in the field of deep convolutional neural networks, which addresses problems such as ignoring the relationships between labels and achieves a well-performing model

Active Publication Date: 2019-08-30
NANKAI UNIV

AI Technical Summary

Problems solved by technology

However, the CPNN-based method is designed only as a three-layer neural network classifier whose input is pre-extracted, off-the-shelf features. This approach is suboptimal because such features do not take the relationships between labels into account.

Embodiment Construction

[0031] Refer to Figure 1, which shows the flow chart of the method for jointly learning visual emotion classification and label distribution with a deep convolutional neural network. The steps shown in the figure are:

[0032] a. The images are resized, augmented, and otherwise preprocessed before being fed into the model; the backbone model is pre-trained on the large-scale ImageNet dataset.
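As an illustrative sketch of this step (the image size, augmentation choices, and ResNet-50 backbone are assumptions, not specified in this excerpt), the preprocessing and ImageNet pre-training could look like the following in PyTorch/torchvision:

```python
import torch
from torchvision import models, transforms

# Assumed preprocessing: resize plus simple data augmentation.
# The exact operations and crop size are illustrative, not from the patent.
train_transform = transforms.Compose([
    transforms.Resize(256),
    transforms.RandomCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

# Backbone pre-trained on the large-scale ImageNet dataset, with the final
# layer replaced by an emotion head (assuming Mikel's 8 emotion categories).
backbone = models.resnet50(weights="IMAGENET1K_V1")
backbone.fc = torch.nn.Linear(backbone.fc.in_features, 8)
```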

[0033] b. For single-label training data, two kinds of weak prior knowledge are used to generate a multi-label distribution. The two priors and their calculation principles are:

[0034] (1) Inference principle: the distance between two emotions is measured on Mikel's emotion wheel, and the probability of each related category is computed with a Gaussian function of that distance. Categories closer to the original label receive larger probability values and more distant categories receive smaller ones, which yields a multi-label distribution for the image (a sketch follows this list);

[0035] (2) The principle of mutual ex...
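The following is a minimal sketch of prior (1); the circular ordering of the eight Mikel's-wheel categories and the Gaussian bandwidth sigma are assumptions made for illustration, not values taken from the patent.

```python
import numpy as np

# Hypothetical circular ordering of the eight Mikel's-wheel emotion categories.
MIKELS_WHEEL = ["amusement", "awe", "contentment", "excitement",
                "anger", "disgust", "fear", "sadness"]

def wheel_distance(i, j, n=len(MIKELS_WHEEL)):
    """Number of steps between two categories along the circular wheel."""
    d = abs(i - j)
    return min(d, n - d)

def label_to_distribution(label_idx, sigma=1.0):
    """Convert a single emotion label into a soft distribution over all
    categories using a Gaussian of the wheel distance (prior (1) above)."""
    dists = np.array([wheel_distance(label_idx, j)
                      for j in range(len(MIKELS_WHEEL))], dtype=float)
    weights = np.exp(-dists ** 2 / (2 * sigma ** 2))
    return weights / weights.sum()  # normalize to a probability distribution

# Example: an image labeled "awe" keeps most of the mass on "awe",
# with smaller probabilities on its neighbours on the wheel.
print(label_to_distribution(MIKELS_WHEEL.index("awe")).round(3))
```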


Abstract

The invention discloses a visual emotion label distribution prediction method based on automatic estimation, belonging to the technical field of computer vision. The method addresses the ambiguity inherent in visual emotion through label distribution learning and provides a deep framework that performs emotion label classification and label distribution prediction at the same time. In addition, since most visual emotion datasets provide only a single category label, weak prior knowledge, namely the similarity information among labels, is used to generate the corresponding emotion label distribution from each emotion category, which improves the practicability of the framework. During framework learning, the classification task is constrained with a Softmax function and the distribution task with a Kullback-Leibler (KL) divergence loss. The two losses are combined in a weighted sum to obtain the final loss function, realizing end-to-end label distribution prediction with the framework.
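A minimal sketch of the final loss described in the abstract, assuming a single shared prediction head and an illustrative weighting factor lam (PyTorch):

```python
import torch
import torch.nn.functional as F

def joint_loss(logits, class_labels, target_dist, lam=0.5):
    """Weighted sum of the classification loss (Softmax cross-entropy) and the
    distribution loss (Kullback-Leibler divergence). `lam` is an assumed
    hyperparameter; the abstract only states that the two losses are weighted."""
    # Classification branch: cross-entropy over the Softmax outputs.
    cls_loss = F.cross_entropy(logits, class_labels)
    # Distribution branch: KL divergence between the predicted and target
    # label distributions (kl_div expects log-probabilities as input).
    kl_loss = F.kl_div(F.log_softmax(logits, dim=1), target_dist,
                       reduction="batchmean")
    return lam * cls_loss + (1.0 - lam) * kl_loss

# Example usage with a batch of 4 images and 8 emotion categories.
logits = torch.randn(4, 8, requires_grad=True)
labels = torch.tensor([0, 2, 5, 7])
target = torch.softmax(torch.randn(4, 8), dim=1)  # e.g. generated distributions
loss = joint_loss(logits, labels, target)
loss.backward()  # end-to-end training signal
```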

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision and relates to a deep convolutional neural network method that can simultaneously solve visual emotion classification and label distribution prediction.

Background Technique

[0002] Visual sentiment classification work can be roughly divided into two categories: dimension-based methods and classification-based methods. Dimension-based methods represent emotions in a 2D or 3D space, while classification-based methods map emotions to discrete categories. In 2010, Machajdik and Hanbury defined a set of low-level features for visual sentiment analysis based on aesthetic and psychological theories in Document 1, including composition, color, and texture. In 2014, Zhao et al. introduced more robust and invariant visual features based on artistic principles in Document 2, but these hand-crafted features are only effective for some small datasets selected from a specifi...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/62
CPCG06F18/24G06F18/214
Inventor 杨巨峰折栋宇姚星旭孙明
Owner NANKAI UNIV