
Method for Hash image retrieval based on deep learning and local feature fusion

A local-feature and image-retrieval technology, applied to computer components, special data-processing applications, instruments, etc. It addresses problems such as retrieved images whose local details are dissimilar, results that match only in overall outline while differing greatly in detail, and results that are inconsistent with what the user expects, and it achieves fast and efficient image retrieval.

Active Publication Date: 2017-05-17
HUAQIAO UNIVERSITY

AI Technical Summary

Problems solved by technology

In addition, image retrieval based on deep learning generally extracts the features of the last convolutional layer or fully connected layer and uses them directly for similarity calculation. As a result, although the finally retrieved results are images with the same semantics, the local details between images are not similar, because the high-level features have lost a great deal of detailed information. For example, when searching in e-commerce for decorative clothing and bags, industrial precision devices, or plant leaves, the images are similar in overall outline but differ greatly in detail, so the retrieval results are inconsistent with what the user expects.




Detailed Description of the Embodiments

[0039] The present invention will be further described below through specific embodiments.

[0040] Figure 1 is a schematic diagram of the deep learning network structure of the present invention. The network model framework of the present invention is a deep convolutional network improved from the GoogLeNet network structure, as shown in Figure 1. The network consists of five parts: an input part, a convolutional sub-network part, a local feature fusion part, a hash-layer coding part and a loss function part. The input part contains images and their corresponding labels, and the images are input in the form of triplets; the convolutional sub-network part uses the convolutional part of the GoogLeNet network and retains the original 3 loss layers; the local feature fusion module is mainly composed of convolutional layers and pooling layers, a merge layer and a fully connected layer; the coding part of the hash layer is compo...
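As a rough illustration of this five-part structure, the sketch below assumes PyTorch. It is not the patent's exact configuration: the GoogLeNet convolutional part is replaced by a small placeholder backbone, the auxiliary loss layers are omitted, and the layer sizes in the fusion block and the 48-bit hash length are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalFeatureFusion(nn.Module):
    """Illustrative fusion block: a convolution branch and a pooling branch,
    merged (concatenated) and projected by a fully connected layer to a
    1024-dimensional local aggregation vector. Sizes are placeholders."""
    def __init__(self, in_channels, out_dim=1024):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, 256, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(in_channels + 256, out_dim)

    def forward(self, x):
        a = self.pool(F.relu(self.conv(x))).flatten(1)  # convolution branch
        b = self.pool(x).flatten(1)                     # pooling branch
        merged = torch.cat([a, b], dim=1)               # merge layer
        return self.fc(merged)                          # fully connected layer

class HashRetrievalNet(nn.Module):
    """Five-part sketch: triplet input, convolutional sub-network (placeholder
    standing in for the GoogLeNet convolutional part), local feature fusion,
    and hash-layer coding; the triplet loss is applied outside the model."""
    def __init__(self, hash_bits=48):
        super().__init__()
        self.convnet = nn.Sequential(                    # placeholder backbone
            nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.MaxPool2d(3, stride=2, padding=1),
            nn.Conv2d(64, 192, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(3, stride=2, padding=1),
        )
        self.fusion = LocalFeatureFusion(192, out_dim=1024)
        self.hash_layer = nn.Linear(1024, hash_bits)     # hash-layer coding

    def forward(self, x):
        feature_map = self.convnet(x)
        local_vec = self.fusion(feature_map)             # 1024-d float vector
        hash_out = torch.sigmoid(self.hash_layer(local_vec))  # binarized later
        return local_vec, hash_out

# Triplet input (anchor, positive, negative) trained with a triplet margin loss.
model = HashRetrievalNet(hash_bits=48)
anchor, positive, negative = (torch.randn(2, 3, 224, 224) for _ in range(3))
loss_fn = nn.TripletMarginLoss(margin=1.0)
loss = loss_fn(model(anchor)[1], model(positive)[1], model(negative)[1])
```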



Abstract

The invention relates to a method for Hash image retrieval based on deep learning and local feature fusion. The method comprises a step (1) of preprocessing an image; a step (2) of using a convolutional neural network to train images containing category tags; a step (3) of generating Hash codes of the images by binarization and extracting 1024-dimensional floating-point local aggregation vectors; a step (4) of using the Hash codes to perform coarse retrieval; and a step (5) of using the local aggregation vectors to perform fine retrieval. According to the method, an approximate nearest neighbor search strategy is used to perform image retrieval after the two features are extracted, so that the retrieval accuracy is high and the retrieval speed is fast.
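A minimal sketch of the two-stage search in steps (4) and (5), assuming the Hash codes and 1024-dimensional local aggregation vectors from step (3) are already available as NumPy arrays; the function names, the Hamming radius and the toy data are illustrative and not taken from the patent.

```python
import numpy as np

def hamming_distance(db_codes, query_code):
    """db_codes: (N, bits) array of 0/1 Hash codes; query_code: (bits,) array."""
    return np.count_nonzero(db_codes != query_code, axis=1)

def coarse_to_fine_search(query_code, query_vec, db_codes, db_vecs,
                          radius=8, top_k=10):
    # Coarse retrieval (step 4): keep candidates within a Hamming radius.
    dists = hamming_distance(db_codes, query_code)
    candidates = np.where(dists <= radius)[0]
    if candidates.size == 0:            # fall back to the nearest codes
        candidates = np.argsort(dists)[:top_k]
    # Fine retrieval (step 5): rank candidates by Euclidean distance
    # on the 1024-dimensional local aggregation vectors.
    fine = np.linalg.norm(db_vecs[candidates] - query_vec, axis=1)
    return candidates[np.argsort(fine)[:top_k]]

# Toy usage with random data standing in for the extracted features.
rng = np.random.default_rng(0)
db_codes = rng.integers(0, 2, size=(1000, 48))
db_vecs = rng.standard_normal((1000, 1024)).astype(np.float32)
results = coarse_to_fine_search(db_codes[0], db_vecs[0], db_codes, db_vecs)
```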

Description

Technical Field
[0001] The invention relates to the field of content-based image retrieval, in particular to a hash image retrieval method based on deep learning and local feature fusion.
Background Technique
[0002] How to efficiently retrieve large-scale image data to meet the needs of users is an urgent problem to be solved. The traditional approach is image retrieval with the bag-of-visual-words model: first, the scale-invariant feature transform (SIFT) descriptor is used to extract features from the image; then a hard clustering algorithm (K-Means) clusters the local features to obtain a visual dictionary; finally, the frequency of each visual word is counted to generate a visual-word histogram, which is then matched to compute image similarity. Since the initial features extracted by the bag-of-visual-words model are traditional hand-crafted descriptors, the extracted features are relatively low-level and cannot describe the high-level semantic informatio...
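For context, a minimal sketch of the traditional bag-of-visual-words pipeline described above (SIFT descriptors, K-Means visual dictionary, visual-word histograms), assuming OpenCV for SIFT (availability depends on the OpenCV build) and scikit-learn for K-Means; the vocabulary size of 1000 is an arbitrary placeholder.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

def sift_descriptors(image_paths):
    """Extract 128-d SIFT descriptors for each image."""
    sift = cv2.SIFT_create()
    per_image = []
    for path in image_paths:
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, desc = sift.detectAndCompute(img, None)
        per_image.append(desc if desc is not None else np.empty((0, 128), np.float32))
    return per_image

def build_vocabulary(per_image_desc, vocab_size=1000):
    """Hard-cluster all descriptors with K-Means to form the visual dictionary."""
    all_desc = np.vstack([d for d in per_image_desc if len(d) > 0])
    return KMeans(n_clusters=vocab_size, n_init=10, random_state=0).fit(all_desc)

def word_histogram(desc, kmeans):
    """Count visual-word frequencies to form a normalized histogram."""
    words = kmeans.predict(desc.astype(np.float32))
    hist = np.bincount(words, minlength=kmeans.n_clusters).astype(np.float32)
    return hist / (hist.sum() + 1e-8)
```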


Application Information

IPC(8): G06F17/30, G06K9/62, G06K9/54
CPC: G06F16/583, G06V10/20, G06F18/214
Inventor: 杜吉祥, 聂一亮, 王靖, 范文涛, 张洪博, 刘海建
Owner: HUAQIAO UNIVERSITY