
Multi-user Natural Scene Labeling Ranking Method

A label ranking technology for natural scenes, applied in the fields of instruments, character and pattern recognition, and computer parts. It addresses two problems: a single user's labeling results may be inaccurate, and existing label ranking information is not fully exploited. The method achieves high speed, high accuracy, and added information value.

Active Publication Date: 2017-01-18
SOUTHEAST UNIV


Problems solved by technology

At present, the labeling and ranking of natural scene images has two major defects. First, existing approaches use a single-user labeling method, i.e. each image has only one labeling result; because of the user's own subjective factors, a single user's labeling result may not be accurate enough. Second, extending a specific multi-label learning algorithm to solve the label ranking problem does not make full use of the label ranking information already present in the data.




Embodiment Construction

[0027] The present invention will be described in detail below in conjunction with the accompanying drawings and specific examples.

[0028] As shown in Figure 1, the multi-user natural scene label ranking method of the present invention comprises the following steps:

[0029] (1) Obtain a set of natural scene images for training, and extract a feature vector from each natural scene image;

[0030] (2) Obtain multiple label rankings of the natural scene images from an image tagging system based on user interest. In this system, each user ranks the related labels of each natural scene image according to his or her own interest, and the system then automatically stores the ranking results in a database.

[0031] (3) Convert the multiple label rankings into a single label distribution by constructing a nonlinear optimization problem and solving it with the interior point method. The specific construction of the nonlinear optimization problem is: ...
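The patent's actual objective function is elided in this excerpt, so the following is only an illustrative sketch of the general idea in step (3): fuse multiple users' rankings into one distribution by penalizing violated pairwise orderings over the probability simplex, solved with SciPy's interior-point-style `trust-constr` method. The squared-hinge penalty, margin, and entropy regularizer are assumptions, not the patent's formulation.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint, Bounds

def rankings_to_distribution(rankings, n_labels, margin=0.05, tau=0.01):
    """Fuse several users' label rankings into one label distribution.

    rankings : list of label-index lists, each ordered from most to
               least relevant (one list per user).
    Returns a vector d on the probability simplex where higher values
    mean the label is ranked higher overall.
    """
    # Each consecutive pair (a, b) in a user's ranking says d[a] should
    # exceed d[b] by at least `margin`.
    pairs = []
    for r in rankings:
        for i in range(len(r) - 1):
            pairs.append((r[i], r[i + 1]))

    def objective(d):
        # Squared-hinge penalty for violated orderings (assumed form).
        pen = sum(max(0.0, d[b] - d[a] + margin) ** 2 for a, b in pairs)
        # Small negative-entropy term pulls d toward uniform,
        # smoothing out conflicts between users.
        neg_ent = tau * np.sum(d * np.log(d + 1e-12))
        return pen + neg_ent

    d0 = np.full(n_labels, 1.0 / n_labels)      # start at uniform
    res = minimize(
        objective, d0,
        method="trust-constr",                  # interior-point style solver
        constraints=[LinearConstraint(np.ones(n_labels), 1.0, 1.0)],
        bounds=Bounds(0.0, 1.0),
    )
    return res.x

# Two users agree label 0 is best but disagree on labels 1 and 2.
d = rankings_to_distribution([[0, 1, 2], [0, 2, 1]], 3)
```

With conflicting rankings the hard ordering constraints would be infeasible, which is why this sketch uses soft penalties; label 0 ends up with the largest share while labels 1 and 2 split the remainder.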


Abstract

The invention provides a multi-user natural scene label ranking method comprising the following steps: (1) extract feature vectors from a set of natural scene images used for training; (2) obtain multiple label rankings of the natural scene images from an image tagging system based on user interest; (3) convert the multiple label rankings into a label distribution; (4) obtain the natural scene images to be ranked from an input device and extract their feature vectors; (5) judge whether training has been completed; (6) train the optimal parameter vector theta of the natural scene label distribution model; (7) substitute the optimal parameter vector theta and the feature vectors of the images to be ranked into the natural scene label distribution model to obtain their label distributions; (8) treat labels whose description degrees are smaller than that of a virtual label as irrelevant labels and the remaining labels as relevant labels, and finally rank the relevant labels by the magnitude of their description degrees.
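Steps (7) and (8) above can be illustrated with a small sketch. The abstract names only an "optimal parameter vector theta," so the maximum-entropy (softmax) form of the label distribution model below, and every identifier in it, is an assumption for illustration, not the patent's confirmed formulation.

```python
import numpy as np

def rank_labels(theta, x, virtual_idx):
    """Rank the relevant labels of one image feature vector x.

    theta       : (n_labels, n_features) parameter matrix (assumed
                  softmax/maximum-entropy model form).
    virtual_idx : index of the virtual label whose description degree
                  serves as the relevant/irrelevant cut-off (step 8).
    """
    scores = theta @ x
    scores = scores - scores.max()                 # numerical stability
    dist = np.exp(scores) / np.exp(scores).sum()   # description degrees
    cutoff = dist[virtual_idx]
    # Keep labels above the virtual label, sorted by description degree.
    relevant = [int(j) for j in np.argsort(-dist)
                if dist[j] > cutoff and j != virtual_idx]
    return relevant, dist

# Hypothetical 4-label model (label 2 is the virtual label).
theta = np.array([[2.0, 0.0], [1.0, 0.0], [0.0, 0.0], [0.5, 0.0]])
x = np.array([1.0, 0.0])
relevant, dist = rank_labels(theta, x, virtual_idx=2)
# relevant → [0, 1, 3]: all three real labels score above the virtual one
```

The virtual-label trick makes the relevance threshold adaptive per image: any label whose description degree falls below the virtual label's is discarded rather than ranked.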

Description

technical field

[0001] The invention relates to a computer-implemented method for ranking multi-user natural scene labels, and belongs to the technical field of image processing.

Background technique

[0002] At present, the development of Internet information technology and the popularization of digital equipment have brought explosive growth of image data. The emergence and dissemination of large numbers of digital images has brought convenience to people and enriched their lives. However, people's ability to use and select images does not grow with the number of images, which creates new challenges for users. Therefore, how to use computers to automatically, quickly, and accurately classify and rank images according to people's wishes has become an urgent task.

[0003] A natural scene image can often be marked with various conceptual labels, and users can rank the relevance of these labels according to their own understanding of th...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/62
Inventors: 耿新 (Geng Xin), 罗龙润 (Luo Longrun)
Owner: SOUTHEAST UNIV