
Automatic image annotation method based on cross-media sparse topic coding

An automatic image annotation and cross-media technology, applied in the field of automatic image annotation based on cross-media sparse topic coding, which addresses the problem that the probabilistic topic model cannot effectively control the sparsity of the latent representation.

Status: Inactive | Publication Date: 2018-04-20
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

Although the probabilistic topic model achieves strong performance for image annotation, it has two limitations: it cannot effectively control the sparsity of the latent representation, and it is not suitable for scenarios in which one word is associated with multiple images.



Examples


Embodiment 1

[0171] The specific implementation process of the present invention is divided into the following two stages:

[0172] 1. In the model training stage, 4,600 pictures crawled from the Wikipedia for Schools website are used as the training set, and 1-5 annotation words are generated for each picture from the text surrounding it on the original webpage. As shown in Figure 2, training the model on this dataset includes the following steps:

[0173] Step 101: Split each of the 4,600 training images into 20×20 tiles by sliding a 20-pixel-wide window, extract a 128-dimensional SIFT (Scale-Invariant Feature Transform) descriptor from each 20×20 grayscale tile, and append an additional 36-dimensional color descriptor to each SIFT descriptor, so that each picture is represented as an n×164-dimensional matrix (where n is the number of SIFT feature points), i.e., a collection of 164-dimensional feature vectors. Among them, Scale-invariant feature transform (Scale-invariant feat...
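The following is a minimal sketch of this feature-extraction step in Python with OpenCV, assuming non-overlapping 20×20 tiles. The excerpt does not specify the exact 36-dimensional color descriptor, so a 12-bin histogram per BGR channel is assumed here; the function name extract_patch_features and the use of a single SIFT keypoint per tile are likewise illustrative.

    import cv2
    import numpy as np

    def extract_patch_features(image_path, patch=20):
        """Return an n x 164 matrix: 128-d SIFT per 20x20 tile plus an assumed 36-d color histogram."""
        bgr = cv2.imread(image_path)
        gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
        sift = cv2.SIFT_create()
        feats = []
        h, w = gray.shape
        for y in range(0, h - patch + 1, patch):        # 20-pixel window, stride 20
            for x in range(0, w - patch + 1, patch):
                # one SIFT descriptor centred on the tile
                kp = cv2.KeyPoint(x + patch / 2.0, y + patch / 2.0, float(patch))
                _, desc = sift.compute(gray, [kp])      # 1 x 128
                if desc is None or len(desc) == 0:
                    continue
                # assumed color descriptor: 12-bin histogram per BGR channel (36-d)
                tile = bgr[y:y + patch, x:x + patch]
                color = np.concatenate([
                    cv2.calcHist([tile], [c], None, [12], [0, 256]).ravel()
                    for c in range(3)
                ])
                feats.append(np.concatenate([desc.ravel(), color]))  # 164-d row
        return np.vstack(feats) if feats else np.empty((0, 164))

Each training image would then be represented by the stacked rows of this matrix, matching the n×164 representation described in Step 101.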



Abstract

The invention relates to an automatic image annotation method based on cross-media sparse topic coding. First, image words are generated from the images in a multimedia document, and a bag-of-words model is used to represent the annotation words in the multimedia document as a vector, yielding the processed multimedia document. From the processed multimedia document and a probabilistic topic model, a cross-media sparse topic coding model for images is obtained. Using maximum a posteriori estimation, a joint distribution over the image words, the annotation words, and the relational coding variables linking them is derived, and the image word codes, multimedia document codes, and relational codes in the joint distribution are modeled with Laplace and super-Gaussian priors. A coordinate descent method is used to solve the cross-media sparse topic coding model, and the cosine similarity between the image codes and the annotation word codes is then computed to perform image annotation. The method reduces the time and space complexity of annotation while maintaining both the accuracy and the efficiency of image annotation.
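As a hedged illustration of the final annotation step described in the abstract, the sketch below ranks annotation words by the cosine similarity between an image's code and the annotation word codes. All names and shapes (annotate, image_code, word_codes, vocabulary, the code dimension K) are illustrative assumptions, not taken from the patent text.

    import numpy as np

    def annotate(image_code, word_codes, vocabulary, top_k=5):
        # image_code: (K,) sparse topic code inferred for a test image (assumed shape)
        # word_codes: (V, K) learned codes, one row per annotation word (assumed shape)
        # vocabulary: list of the V annotation words
        img = image_code / (np.linalg.norm(image_code) + 1e-12)
        wrd = word_codes / (np.linalg.norm(word_codes, axis=1, keepdims=True) + 1e-12)
        sims = wrd @ img                      # cosine similarity per annotation word
        top = np.argsort(-sims)[:top_k]       # highest-similarity words first
        return [(vocabulary[i], float(sims[i])) for i in top]

The words with the highest cosine similarity to the image code would be returned as that image's annotations.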

Description

Technical Field

[0001] The invention belongs to the fields of computer applications, image processing, and data mining, and in particular relates to an automatic image annotation method based on cross-media sparse topic coding.

Background Technique

[0002] With the development of the Internet and the popularization of digital devices, the amount of image data has grown exponentially, and how to effectively retrieve and manage image resources has become an important topic in computer vision. Traditional content-based image retrieval relies on low-level image features, which cannot be well related to high-level semantics; this semantic gap makes it difficult for retrieval quality to meet requirements. Automatic image annotation establishes a mapping or association between the image visual feature space and the high-level semantic space from known images, that is, it projects the two heterogeneous media data of a...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30; G06K9/62; H03M7/30
CPC: H03M7/3059; G06F16/5866; G06F18/22
Inventors: 刘均, 宋凌云, 罗敏楠, 杨宽, 张玲玲, 阮建飞
Owner: XI AN JIAOTONG UNIV