
Hand-drawn sketch retrieval method based on deformable convolution and depth network

A deep-network, hand-drawn sketch technology applied in the field of computer vision and deep learning. It addresses the problem of feature redundancy, and achieves the effects of reducing interference, improving retrieval accuracy, and retaining feature expression ability.

Active Publication Date: 2019-02-01
CHINA UNIV OF PETROLEUM (EAST CHINA)
Cites: 5 · Cited by: 7

AI Technical Summary

Problems solved by technology

When deep neural networks are used to extract features, current mainstream networks rely on regular convolution. Regular convolution is very effective for information-dense natural images, but for hand-drawn sketches it extracts a large number of useless features and produces severely redundant representations; in other words, traditional neural network structures are not well suited to hand-drawn sketches.
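For illustration, the following is a minimal sketch of a deformable convolution block in PyTorch, built from torchvision's DeformConv2d with a small standard convolution predicting the sampling offsets so the receptive field can follow sparse sketch strokes. The block layout, channel counts, and single-channel input are assumptions made for the example, not the patent's actual network.

```python
# Minimal deformable-convolution block (illustrative; not the patent's network).
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d


class DeformableBlock(nn.Module):
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        # A standard conv predicts 2 offsets (dx, dy) per kernel position.
        self.offset_conv = nn.Conv2d(in_ch, 2 * k * k, kernel_size=k, padding=k // 2)
        # The deformable conv samples the input at the offset positions,
        # letting the kernel deform toward informative stroke pixels.
        self.deform_conv = DeformConv2d(in_ch, out_ch, kernel_size=k, padding=k // 2)

    def forward(self, x):
        offsets = self.offset_conv(x)
        return self.deform_conv(x, offsets)


if __name__ == "__main__":
    block = DeformableBlock(1, 64)            # single-channel sketch input (assumed)
    sketch = torch.randn(1, 1, 224, 224)      # dummy sketch tensor
    print(block(sketch).shape)                # torch.Size([1, 64, 224, 224])
```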




Embodiment Construction

[0032] The present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments:

[0033] As shown in Figures 1, 2, and 3, the hand-drawn sketch retrieval method based on deformable convolution and a deep network includes the following steps:

[0034] S1. Obtain the hand-drawn sketch to be retrieved and the natural images in the database.

[0035] The method of the present invention is applicable to any natural image library and hand-drawn sketch data set. The training data in the present invention comes from the public Flickr15k image data set, which is widely recognized in this field and contains a large number of hand-drawn sketches and natural images.
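Purely as an illustration of S1, the snippet below enumerates a sketch/photo corpus such as Flickr15k; the directory names ("sketches/", "images/") and file extensions are assumptions, not the dataset's published layout.

```python
# Hypothetical corpus listing for step S1 (directory layout is assumed).
from pathlib import Path


def list_dataset(root):
    root = Path(root)
    sketches = sorted((root / "sketches").rglob("*.png"))  # query sketches
    photos = sorted((root / "images").rglob("*.jpg"))      # natural images to retrieve
    return sketches, photos


sketches, photos = list_dataset("Flickr15k")
print(f"{len(sketches)} sketch queries, {len(photos)} natural images")
```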

[0036] S2. Use an edge detection algorithm to detect the edges of the natural images and obtain sketch-like images, i.e., edge maps.
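A minimal sketch of S2 is given below, assuming the Canny detector as the edge detection algorithm; the excerpt does not name a specific detector, so Canny, its thresholds, and the example file name are illustrative.

```python
# Step S2 (assumed Canny variant): turn a natural image into a sketch-like edge map.
import cv2


def natural_image_to_edge_map(path, low=100, high=200):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(gray, low, high)   # white edges on black background
    return 255 - edges                   # invert so strokes are dark, like a pencil line


edge_map = natural_image_to_edge_map("photo.jpg")  # "photo.jpg" is a placeholder path
cv2.imwrite("edge_map.png", edge_map)
```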

[0037] S3. Preprocess the hand-drawn sket...
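As an illustrative version of S3, the sketch below applies a morphological closing and resizes to a fixed input size; the actual structuring element, operation, and target resolution used by the invention are not stated in this excerpt.

```python
# Step S3 (assumed): morphological preprocessing of a grayscale sketch or edge map.
import cv2
import numpy as np


def preprocess(img_gray, size=224):
    kernel = np.ones((3, 3), np.uint8)
    # Closing bridges small gaps in strokes and suppresses isolated noise pixels.
    closed = cv2.morphologyEx(img_gray, cv2.MORPH_CLOSE, kernel)
    return cv2.resize(closed, (size, size), interpolation=cv2.INTER_AREA)
```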



Abstract

The invention belongs to the field of computer vision and deep learning, and particularly discloses a hand-drawn sketch retrieval method based on deformable convolution and a deep network. The method comprises the following steps: S1, acquiring the hand-drawn sketch and a natural color image database; S2, converting the natural color images into edge maps with an edge detection algorithm; S3, preprocessing the hand-drawn sketch and the edge maps through morphological operations; S4, training a deep network based on deformable convolution; S5, extracting the depth features of the hand-drawn sketch and of the natural images' edge maps respectively with the trained network; S6, calculating the similarity between the features and returning the retrieval result. The beneficial effects of the method are that incorporating deformable convolution into the traditional neural network breaks the limitation of standard convolution on hand-drawn sketches, improves the robustness of the features the network extracts from hand-drawn images, and reduces feature redundancy. The network structure proposed by the invention can greatly improve the retrieval precision of hand-drawn sketches.
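To make steps S5 and S6 concrete, here is a minimal retrieval sketch that assumes the trained network is exposed as a feature extractor returning one vector per image; cosine similarity is used as an illustrative metric, since the abstract only states that the similarity between features is calculated and the result returned.

```python
# Steps S5-S6 (illustrative): rank gallery edge-map features against a query sketch feature.
import torch
import torch.nn.functional as F


def retrieve(query_feat, gallery_feats, top_k=10):
    # query_feat: (D,) depth feature of the query sketch.
    # gallery_feats: (N, D) depth features of the natural images' edge maps.
    sims = F.cosine_similarity(query_feat.unsqueeze(0), gallery_feats, dim=1)
    scores, indices = sims.topk(min(top_k, gallery_feats.size(0)))
    return indices.tolist(), scores.tolist()
```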

Description

Technical Field

[0001] The invention belongs to the field of computer vision and deep learning, and relates to a hand-drawn sketch retrieval method based on deformable convolution and a deep network.

Background Technique

[0002] Hand-drawn sketch retrieval is a kind of content-based image retrieval technology. Studies have shown that hand-drawn pictures activate the visual areas of the human cerebral cortex through the same mechanism as real pictures. Compared with text-based image retrieval and traditional retrieval based on natural color images, hand-drawn sketches are easy to acquire, strongly abstract and generalizing, and free of language and cultural restrictions. The number of touch-screen devices is increasing day by day, and acquiring hand-drawn images is becoming ever easier, so image retrieval based on hand-drawn sketches is receiving more and more attention and has broad application prospects. For example, commercial online shopping malls us...

Claims


Application Information

Patent Timeline
No application timeline available.
Patent Type & Authority: Application (China)
IPC(8): G06F16/53, G06K9/46, G06K9/62
CPC: G06V10/44, G06F18/2413
Inventor: 刘玉杰, 王文超, 于邓, 李冠林
Owner: CHINA UNIV OF PETROLEUM (EAST CHINA)