Generative feature ordered regularization method for deep ordinal regression models

A regression model and regression method applied to neural learning methods, character and pattern recognition, biological neural network models, and the like. It solves the problem that the essential goal of existing deep ordinal regression methods is undermined by label-irrelevant information in the learned feature representation, and achieves the effect of improved performance.

Pending Publication Date: 2022-05-13
FUDAN UNIV


Problems solved by technology

However, it is found experimentally that the feature representation learned from the input image contains too much information irrelevant to the ordered labels, which seriously undermines the essential goal of existing deep ordinal regression methods: learning the one-dimensional ordered relation.



Examples



[0064] Experimental example 1: Quantitative performance comparison in different scenarios

[0065] Table 1 Performance comparison of different methods in face age estimation

[0066]

[0067] Table 2 Performance comparison of different methods in medical image classification

[0068]

[0069] For quantitative comparison, experiments are conducted on several state-of-the-art ordinal regression methods and their combinations with our PnP-FOR. In Table 1, we mainly compare four baseline methods and their combinations with PnP-FOR: Mean-Variance [4], Poisson [9], SORD [6] and POE [5]. Notably, the proposed PnP-FOR improves over the existing methods in terms of MAE, CS and accuracy. Mean-Variance and POE focus on learning the uncertainty of the neural network output vectors and of the latent-space features, respectively. Mean-Variance computes a mean loss and a variance loss on the output vectors to control the uncertainty in the output space. PnP-FOR+Mean-Variance achie...
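For reference, the two evaluation metrics named above, MAE and the cumulative score (CS), can be computed as in the following sketch; the CS threshold of 5 years is a common choice in face age estimation and is used here as an assumption rather than a value taken from this text.

```python
import numpy as np

def mae(pred, target):
    """Mean absolute error between predicted and true ordinal labels."""
    pred, target = np.asarray(pred, dtype=float), np.asarray(target, dtype=float)
    return np.abs(pred - target).mean()

def cumulative_score(pred, target, threshold=5):
    """CS(l): fraction of samples whose absolute prediction error is at most l.
    threshold=5 is an assumed value for illustration."""
    pred, target = np.asarray(pred, dtype=float), np.asarray(target, dtype=float)
    return (np.abs(pred - target) <= threshold).mean()
```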


[0075] Experimental example 2: Comparison of t-SNE visualization results of different methods

[0076] To further highlight the superiority of PnP-FOR, Figure 3 visualizes by t-SNE the feature representations learned by the four baseline methods with and without PnP-FOR. It can be seen that the traditional OR methods show a certain ordered distribution in the feature space, while POE learns a relatively regular ordered distribution. Although their learning strategies can model ordered relations in the latent space, they cannot guarantee the global ordinal relation and lack intra-class compactness (red arrows in Figure 3).

[0077] Using our PnP-FOR, all results can be tuned to a nearly one-dimensional space, which is consistent with the real ordered label space and verifies the effectiveness of our motivation. Moreover, PnP-FOR preserves not only the local order relation but also the global distribution. On the other hand, PnP-FOR encourages intra-class compactness.
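As an illustration of how such a visualization can be produced, the sketch below projects learned feature representations to two dimensions with t-SNE and colors the points by their ordinal labels; scikit-learn and matplotlib are assumed here and are not named in the source.

```python
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

def plot_tsne(features, labels, title="t-SNE of learned features"):
    """Project (N, D) feature representations to 2-D with t-SNE and color the
    points by ordinal label, making the (lack of) ordering in the latent
    space visible."""
    emb = TSNE(n_components=2, init="pca", random_state=0).fit_transform(features)
    sc = plt.scatter(emb[:, 0], emb[:, 1], c=labels, cmap="viridis", s=5)
    plt.colorbar(sc, label="ordinal label")
    plt.title(title)
    plt.show()
```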


[0078] Experimental Example 3: Effects of Different Sampling Strategies on PnP-FOR

[0079] We investigate two sampling strategies for batch training with PnP-FOR: 1) random sampling (Random), and 2) stratified sampling (Stratified), where all samples in a mini-batch have different labels. Figure 4 shows that the results of stratified sampling are not as good as those of random sampling.

[0080] Stratified sampling causes SORD to slightly overfit after 25 epochs. This can be explained from two aspects. First, as shown in Equation (6), stratified sampling pays more attention to the distances between adjacent ordered labels, because distant labels hardly affect the probability values, while random sampling contributes to compactness between classes. Second, the probabilities in Equation (6) approximate the probabilities over the batch samples, so stratified sampling makes it difficult to learn the true distribution of the dataset.
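The two batch-construction strategies investigated above can be sketched as follows, under the assumption that stratified sampling draws at most one sample per distinct ordinal label into each mini-batch.

```python
import random
from collections import defaultdict

def random_batches(num_samples, batch_size):
    """Random sampling: shuffle all indices and slice them into mini-batches."""
    idx = list(range(num_samples))
    random.shuffle(idx)
    return [idx[i:i + batch_size] for i in range(0, num_samples, batch_size)]

def stratified_batches(labels, batch_size):
    """Stratified sampling as described above: at most one sample per distinct
    ordinal label per mini-batch, so all labels in a batch differ. Assumes
    batch_size does not exceed the number of distinct labels."""
    buckets = defaultdict(list)
    for i, y in enumerate(labels):
        buckets[y].append(i)
    for pool in buckets.values():
        random.shuffle(pool)
    batches = []
    while any(buckets.values()):
        non_empty = [y for y in buckets if buckets[y]]
        random.shuffle(non_empty)
        batches.append([buckets[y].pop() for y in non_empty[:batch_size]])
    return batches
```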



Abstract

The invention belongs to the technical field of image classification, and particularly relates to a generative feature ordered regularization method for deep ordinal regression models. The method comprises the following steps: mapping an input image into a low-dimensional feature representation through a deep convolutional neural network, and calculating the distances between the low-dimensional feature representations of all samples in a batch in the corresponding low-dimensional space; then calculating the distances between the ordered labels of all samples in the batch; normalizing, for each sample, the feature-distance vector and the label-distance vector between that sample and all other samples in the batch; and calculating the KL divergence between the normalized feature-distance vector and the normalized label-distance vector, so that the distribution of the features in the embedding space is constrained to be consistent with the distribution of the ordered labels, i.e., the orderliness of the features is ensured. The final loss function of the model comprises the ordered regression loss and the KL divergence loss. The method can improve the classification performance in various task scenarios (such as face age estimation, medical image classification, historical image age classification, etc.).
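As a concrete reading of the steps listed in the abstract, the following PyTorch-style sketch computes the KL-divergence regularization term between the normalized feature-distance and label-distance vectors of a batch; the Euclidean feature distance, the softmax normalization of the negated distances, and the weighting factor lambda_for are illustrative assumptions rather than details fixed by this text.

```python
import torch
import torch.nn.functional as F

def feature_ordered_regularization(features, labels):
    """Sketch of the feature-ordered regularization term described in the
    abstract. Assumed choices: Euclidean feature distances, softmax over
    negated distances for normalization, batchmean KL reduction.

    features: (B, D) low-dimensional features from the CNN backbone
    labels:   (B,)   ordered labels (e.g. ages)
    """
    labels = labels.float().unsqueeze(1)                   # (B, 1)
    feat_dist = torch.cdist(features, features, p=2)       # (B, B) pairwise feature distances
    label_dist = torch.cdist(labels, labels, p=1)          # (B, B) pairwise label distances

    # Exclude the diagonal so each sample is compared only with the others.
    B = features.size(0)
    mask = ~torch.eye(B, dtype=torch.bool, device=features.device)
    feat_dist = feat_dist[mask].view(B, B - 1)
    label_dist = label_dist[mask].view(B, B - 1)

    # Normalize each row into a distribution (closer samples get more mass).
    log_p_feat = F.log_softmax(-feat_dist, dim=1)
    p_label = F.softmax(-label_dist, dim=1)

    # KL divergence pulls the feature-distance distribution toward the
    # label-distance distribution, encouraging ordered features.
    return F.kl_div(log_p_feat, p_label, reduction="batchmean")

# Total training loss, with an assumed weighting factor lambda_for:
# loss = ordinal_regression_loss + lambda_for * feature_ordered_regularization(z, y)
```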

Description

Technical field
[0001] The invention belongs to the technical field of image classification, and in particular relates to an image classification method with ordered labels.
Background technique
[0002] Ordinal regression (OR) is a classic problem in machine learning, dedicated to predicting data with ordered labels. Typical applications include face age estimation, where the image label is the age of the face, from young to old; scoring a movie, for example from one star to ten stars; and disease diagnosis based on medical images. Due to the ordered nature of the labels, OR is an intermediate problem between classification and regression [1][2].
[0003] In the past few years, deep neural networks have promoted the development of OR methods [1][2][3][4][5][6], namely deep ordinal regression methods. These methods mainly focus on modeling the mapping from feature representations derived from input data to ordered label spaces. For example, a popular approach to modeling ord...
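As a minimal illustration of the mapping from an input image to a low-dimensional feature representation and then to an ordered label space, the following hypothetical sketch uses a small convolutional backbone with a K-way output head; the architecture is an assumption and not the specific network of this invention.

```python
import torch.nn as nn

class DeepOrdinalRegressor(nn.Module):
    """Hypothetical deep ordinal regression model: a convolutional backbone
    maps the input image to a low-dimensional feature z, and a linear head
    maps z to scores over K ordered classes (e.g. ages)."""
    def __init__(self, num_classes: int, feat_dim: int = 64):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        z = self.backbone(x)      # low-dimensional feature representation
        logits = self.head(z)     # scores over the K ordered labels
        return z, logits
```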


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06V10/764, G06V10/774, G06V10/82, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045, G06F18/24, G06F18/214
Inventor: 单洪明, 雷一鸣, 张军平
Owner: FUDAN UNIV