
Image retrieval method

An image retrieval technology in the field of extended-query retrieval, addressing the problem that inaccurate query visual words degrade retrieval efficiency, and achieving the effect of improved retrieval efficiency.

Status: Inactive | Publication Date: 2014-01-01
Applicant: COMMUNICATION UNIVERSITY OF CHINA +1
Cites: 4 | Cited by: 16

AI Technical Summary

Problems solved by technology

[0004] It can be seen that the accuracy of the visual words describing the query target greatly affects retrieval efficiency.




Embodiment Construction

[0027] The technical scheme of the present embodiment is as follows:

[0028] First, the bag-of-visual-words model is used to transform each image into a set of visual words, as shown in Figure 1. The specific transformation process is as follows: feature detection is performed on the image to obtain salient feature points or salient regions, and feature description is then applied to obtain local feature description vectors; feature extraction and sampling are performed on the images in the entire image library, and the resulting local features form the feature training set. K-means clustering is performed on the feature training set, each cluster center is regarded as a "visual word", and all cluster centers together form the "visual vocabulary". The local feature set extracted from a single image is then quantized into a set of visual words: during quantization, each local feature description vector is compared with the feature vectors represented by all visual words in the visual vocabulary…
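The pipeline described in [0028] can be illustrated with a short sketch. The Python code below is a minimal, hedged example: the choice of SIFT descriptors and scikit-learn's KMeans is an assumption made for illustration only; the patent does not prescribe a particular feature detector or clustering implementation.

```python
# Minimal sketch of the bag-of-visual-words construction described in [0028].
# SIFT and scikit-learn KMeans are illustrative assumptions, not the patent's mandate.
import cv2
import numpy as np
from sklearn.cluster import KMeans

def extract_local_features(image_paths):
    """Detect salient points in each image and compute local descriptor vectors."""
    sift = cv2.SIFT_create()
    descriptors_per_image = []
    for path in image_paths:
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, desc = sift.detectAndCompute(img, None)
        descriptors_per_image.append(desc if desc is not None else np.empty((0, 128)))
    return descriptors_per_image

def build_vocabulary(descriptors_per_image, vocab_size=1000):
    """Cluster the pooled descriptors; each cluster center acts as one 'visual word'."""
    training_set = np.vstack(descriptors_per_image)
    return KMeans(n_clusters=vocab_size, n_init=1, random_state=0).fit(training_set)

def quantize(descriptors, vocabulary):
    """Assign each local descriptor to its nearest visual word; return a word-count histogram."""
    words = vocabulary.predict(descriptors)
    return np.bincount(words, minlength=vocabulary.n_clusters).astype(float)
```

A typical use would be to run `extract_local_features` over the image library, build the vocabulary once, and then call `quantize` on each image's descriptors to obtain its visual-word representation.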



Abstract

The invention discloses an image retrieval method and belongs to the field of intelligent information processing, such as multimedia information retrieval and pattern recognition. After an initial retrieval, correctly matched relevant images are identified by geometric verification; the document vectors of these relevant images and of the query image are weight-adjusted to construct a new query vector, yielding an expanded query, and a new retrieval is performed with it to obtain the final result. Because the query vector gains the weights of visual words that appear implicitly in the relevant images, the weights of visual words shared by the query image and the correctly matched images are increased, which greatly improves retrieval efficiency.
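As a reading aid, here is a minimal sketch of the query-expansion step the abstract describes. The convex-combination weighting with parameter `alpha` and the function name `expand_query` are assumptions for illustration; the patent only states that the query vector is weight-adjusted so that words shared with the geometrically verified images gain weight, without fixing an exact formula.

```python
# Hedged sketch of expanded-query construction from verified relevant images.
import numpy as np

def expand_query(query_vec, verified_vecs, alpha=0.5):
    """query_vec: tf-idf vector of the query image (1-D array).
    verified_vecs: list of tf-idf vectors of geometrically verified relevant images.
    Returns a new query vector blending the query with the verified images."""
    if not verified_vecs:
        return query_vec
    mean_relevant = np.mean(verified_vecs, axis=0)
    expanded = alpha * query_vec + (1.0 - alpha) * mean_relevant
    # Re-normalise so the expanded query is comparable to document vectors.
    norm = np.linalg.norm(expanded)
    return expanded / norm if norm > 0 else expanded
```

The second retrieval pass then scores the image library against this expanded vector instead of the original query vector.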

Description

Technical Field

[0001] The invention belongs to the field of intelligent information processing, such as multimedia information retrieval and pattern recognition. In particular, it relates to an extended-query retrieval method for target retrieval.

Background Technique

[0002] Object retrieval based on the bag-of-visual-words model has become a research hotspot in recent years. The bag-of-visual-words model uses the local features of training images to build a "visual vocabulary" in advance, and uses this vocabulary to quantize the local features of an image, approximating similar local features by their cluster center, the "visual word". An image is thereby represented as a collection of "visual words". An inverted index table is then used to store the visual words of each image, and the TF-IDF model from text retrieval is applied to retrieve images.

[0003] Due to the inaccuracy of visual words caused by the omission …
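The inverted index and TF-IDF retrieval mentioned in [0002] can be made concrete with a small sketch. The names below are hypothetical, and the standard tf-idf weighting (term frequency times log inverse document frequency) is assumed, since the background does not give the exact formula used.

```python
# Hedged sketch of visual-word indexing and TF-IDF retrieval over an image library.
import math
from collections import defaultdict, Counter

def build_index(image_word_lists):
    """image_word_lists: {image_id: list of visual-word ids}.
    Returns tf-idf document vectors and an inverted index (word -> image ids)."""
    n_images = len(image_word_lists)
    doc_freq = Counter()
    for words in image_word_lists.values():
        doc_freq.update(set(words))
    idf = {w: math.log(n_images / df) for w, df in doc_freq.items()}

    tfidf_vectors, inverted = {}, defaultdict(set)
    for img_id, words in image_word_lists.items():
        counts = Counter(words)
        total = sum(counts.values())
        tfidf_vectors[img_id] = {w: (c / total) * idf[w] for w, c in counts.items()}
        for w in counts:
            inverted[w].add(img_id)
    return tfidf_vectors, inverted, idf

def retrieve(query_words, tfidf_vectors, inverted, idf):
    """Score only images sharing at least one visual word with the query."""
    q_counts = Counter(query_words)
    q_total = sum(q_counts.values())
    q_vec = {w: (c / q_total) * idf.get(w, 0.0) for w, c in q_counts.items()}
    candidates = set().union(*(inverted.get(w, set()) for w in q_vec)) if q_vec else set()
    scores = {i: sum(q_vec[w] * tfidf_vectors[i].get(w, 0.0) for w in q_vec) for i in candidates}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```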


Application Information

IPC(8): G06F17/30
CPC: G06F16/583
Inventors: 黄祥林吕慧曹学会杨丽芳张建生张枫韩笑
Owner: COMMUNICATION UNIVERSITY OF CHINA