
Garment image retrieval method fusing color feature and residual network depth feature

A technology fusing color features and residual-network deep features, applied to still image data retrieval, digital data information retrieval, still image data clustering/classification, etc. It addresses problems such as loss of spatial structure and increased computation time and difficulty, and achieves retrieval results with clearly similar style and color as well as improved retrieval efficiency.

Pending Publication Date: 2020-02-21
WUHAN TEXTILE UNIV

AI Technical Summary

Problems solved by technology

[0007] (1) Existing methods extract only the fully connected layer, which cannot preserve spatial structure; this feature mainly represents global information, so the local feature information of the clothing image is lost, resulting in low average retrieval precision.
[0008] (2) When a residual network is applied directly to clothing image retrieval, the retrieved images often have similar styles but large color differences.

Method used



Examples


Embodiment 1

[0065] 1. Multi-feature fusion clothing image retrieval framework based on deep network

[0066] Multi-feature fusion clothing image retrieval based on a deep network comprises two processes: feature extraction and similarity measurement. As shown in Figure 2, in the feature extraction process, the images in the data set are first input into the pre-trained network model, the deep features output by the network layers are extracted, and an aggregation method is used to fuse them with the other feature information as the global feature representation of the image, which is stored in the feature library. In the similarity measurement process, the clothing image to be retrieved is input into the same neural network as the data set images, and the same aggregation method is used to obtain the global feature vector of the query image. The distances between the query image's feature vector and the vectors in the feature library are then computed to rank similarity, and the retrieval results are returned in ascending order of distance.
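A minimal sketch of this two-stage framework, assuming a pretrained ResNet50 from torchvision as the deep network, a simple HSV histogram as the color feature, and plain concatenation as the aggregation method; the patent's exact layer choice and fusion scheme are not reproduced here.

```python
# Sketch of feature extraction + similarity measurement (assumptions noted above).
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Backbone truncated before the fully connected layer, so the pooled
# convolutional feature (2048-d) is used rather than the FC output.
backbone = models.resnet50(weights="IMAGENET1K_V1")
backbone = torch.nn.Sequential(*list(backbone.children())[:-1]).eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def deep_feature(img: Image.Image) -> np.ndarray:
    """Deep feature from the truncated ResNet50, L2-normalized."""
    with torch.no_grad():
        v = backbone(preprocess(img.convert("RGB")).unsqueeze(0)).flatten().numpy()
    return v / (np.linalg.norm(v) + 1e-12)

def color_feature(img: Image.Image, bins: int = 32) -> np.ndarray:
    """Hypothetical color descriptor: a per-channel HSV histogram."""
    hsv = np.asarray(img.convert("HSV"), dtype=np.float32)
    hist = np.concatenate(
        [np.histogram(hsv[..., c], bins=bins, range=(0, 255))[0] for c in range(3)]
    ).astype(np.float32)
    return hist / (np.linalg.norm(hist) + 1e-12)

def global_feature(img: Image.Image) -> np.ndarray:
    """Fuse deep and color features by concatenation (one simple aggregation)."""
    return np.concatenate([deep_feature(img), color_feature(img)])

def retrieve(query: Image.Image, library: np.ndarray, top_k: int = 10) -> np.ndarray:
    """Return indices of the top_k feature-library rows, sorted by ascending distance."""
    q = global_feature(query)
    dists = np.linalg.norm(library - q, axis=1)
    return np.argsort(dists)[:top_k]
```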

Embodiment 2

[0089] 1. Data and parameter preparation

[0090] To verify the effect of the proposed method, this experiment uses the Category and Attribute Prediction Benchmark as the data set, which contains more than 200,000 clothing images in 50 categories. From this subset, 60,000 images are taken as the training set, 20,000 as the test set, and 20,000 as the validation set, covering 30 categories. The experiment is implemented in Python.
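A minimal sketch of preparing such a split, assuming the images are available as a flat list of file paths; the fixed sizes follow the 60,000 / 20,000 / 20,000 partition above, while the shuffling and list format are hypothetical rather than the benchmark's own annotation layout.

```python
import random

def split_dataset(image_paths, seed=0,
                  n_train=60_000, n_test=20_000, n_val=20_000):
    """Shuffle once, then carve out fixed-size train / test / validation subsets."""
    paths = list(image_paths)
    random.Random(seed).shuffle(paths)
    train = paths[:n_train]
    test = paths[n_train:n_train + n_test]
    val = paths[n_train + n_test:n_train + n_test + n_val]
    return train, test, val
```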


Abstract

The invention belongs to the technical field of image retrieval, and discloses a garment image retrieval method fusing color features and residual network depth features, which comprises the following steps: inputting a training data set into a ResNet50-based network model; fusing the deep features and the color feature information to serve as the global feature representation of the image; clustering the vectors in the feature library by using a K-Means algorithm; inputting a to-be-retrieved garment picture into the same neural network as the data set, and obtaining a global feature vector of the to-be-queried garment picture; and sequentially calculating the distances between the vectors of the clustering centers and the vector of the to-be-retrieved picture, and performing similarity measurement through comparison of the distances to obtain a retrieval result. Experimental results show that the method can combine various kinds of feature information of the picture, the retrieval efficiency is high, and the time overhead is small. The extracted deep features have a certain effectiveness and hierarchy. The method has high robustness and practicability and is superior to other mainstream retrieval methods.
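A minimal sketch of the clustered retrieval step described in the abstract, assuming scikit-learn's KMeans, Euclidean distance, and a feature library built from fused global feature vectors such as those in the earlier sketch; the number of clusters is an assumption, since the abstract does not specify it.

```python
import numpy as np
from sklearn.cluster import KMeans

def build_index(library: np.ndarray, n_clusters: int = 30, seed: int = 0) -> KMeans:
    """Cluster the feature library once; the cluster count is an assumption."""
    return KMeans(n_clusters=n_clusters, random_state=seed, n_init=10).fit(library)

def retrieve_clustered(query_vec: np.ndarray, library: np.ndarray,
                       kmeans: KMeans, top_k: int = 10) -> np.ndarray:
    """Compare the query to the cluster centres first, then only to members of
    the nearest cluster, returning indices sorted by ascending distance."""
    centre_dists = np.linalg.norm(kmeans.cluster_centers_ - query_vec, axis=1)
    nearest = int(np.argmin(centre_dists))
    members = np.flatnonzero(kmeans.labels_ == nearest)
    dists = np.linalg.norm(library[members] - query_vec, axis=1)
    return members[np.argsort(dists)[:top_k]]
```

Comparing against the cluster centres before the individual vectors is what keeps the per-query time overhead small when the feature library is large.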

Description

Technical field [0001] The invention belongs to the technical field of image retrieval, and in particular relates to a clothing image retrieval method that fuses color features and residual network depth features. Background technique [0002] Currently, the closest prior art is as follows: [0003] With the rapid development of the e-commerce industry, the clothing industry, as an important part of it, generates an ever-increasing amount of data. To cope with massive clothing image data, a new online clothing search mode, "searching for images with an image", is used by consumers; its core is image retrieval technology. As the core of applications such as intelligent clothing recommendation and clothing search, clothing image retrieval has broad market application prospects. Clothing reflects the trends and tastes of contemporary people and contains a great deal of semantic and detail information. The color matching and style of clothing are its important semantic information, and th...

Claims


Application Information

IPC(8): G06F16/55, G06F16/583
CPC: G06F16/55, G06F16/5838, Y02P90/30
Inventor: 何儒汉, 侯媛媛, 刘军平, 彭涛, 陈常念, 胡欣荣
Owner: WUHAN TEXTILE UNIV