
Image style migration method based on deep convolutional neural network

An image style transfer technology based on a deep convolutional neural network, applied in the field of image style transfer, which addresses the problems that the style transfer process is time-consuming and imprecise when applied to high-resolution images.

Inactive Publication Date: 2017-05-10
SHENZHEN WEITESHI TECH
View PDF | Cites: 0 | Cited by: 85

AI Technical Summary

Problems solved by technology

[0004] Aiming at the problems that the style transfer process is time-consuming and imprecise for high-resolution images, the purpose of the present invention is to provide an image style transfer method based on a deep convolutional neural network.

Method used


Image

Drawings: three figures illustrating the image style migration method based on deep convolutional neural network.

Examples


Embodiment Construction

[0031] It should be noted that, in the case of no conflict, the embodiments in the present application and the features in the embodiments can be combined with each other. The present invention will be further described in detail below in conjunction with the drawings and specific embodiments.

[0032] Figure 1 is a system flowchart of the image style transfer method based on a deep convolutional neural network of the present invention. The method mainly includes: image input, loss function training, stylization, image enhancement, and image refinement.

[0033] Wherein, in the image input step, an artistic painting is selected as the style image; before any image is input into the multimodal convolutional neural network, a bilinear downsampling layer adjusts it to a 256×256 content image.
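The bilinear downsampling layer described above can be sketched as follows. This is a minimal numpy illustration of bilinear interpolation, not the patent's actual layer; in practice a framework routine (e.g. a bilinear interpolation op in a deep learning library) would be used.

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    """Resize an H x W x C image with bilinear interpolation."""
    h, w, _ = img.shape
    # Source-image sampling coordinates for each output pixel.
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    # Fractional weights, shaped for broadcasting over (H, W, C).
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    # Interpolate horizontally on the two neighbouring rows, then vertically.
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

content = np.random.rand(1024, 768, 3)       # a high-resolution input
small = bilinear_resize(content, 256, 256)   # 256x256 content image
print(small.shape)  # (256, 256, 3)
```

Any input resolution is reduced to the fixed 256×256 size the style subnet expects.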

[0034] Wherein, in the loss function training, all output images of the multimodal transfer network are used as input to the loss network, and the stylized loss value of ...
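Loss networks for style transfer commonly compare Gram-matrix statistics of feature maps between the generated and style images. The patent does not specify its loss in this excerpt, so the following is a hedged sketch of a standard Gram-matrix style loss, assuming the feature maps have already been extracted by a pretrained network (VGG is a common choice):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a C x H x W feature map: channel-wise correlations,
    normalized by the number of elements."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def style_loss(gen_feats, style_feats):
    """Mean squared Gram-matrix difference, summed over network layers."""
    return sum(np.mean((gram_matrix(g) - gram_matrix(s)) ** 2)
               for g, s in zip(gen_feats, style_feats))

# Placeholder feature maps standing in for loss-network activations.
gen = [np.random.rand(8, 16, 16)]
sty = [np.random.rand(8, 16, 16)]
print(style_loss(gen, gen))  # 0.0 -- identical features give zero loss
```

The Gram matrix discards spatial layout and keeps only texture statistics, which is why it serves as a style (rather than content) comparison.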


Abstract

The invention discloses an image style migration method based on a deep convolutional neural network. The main content comprises: image inputting, loss function training, stylizing, image enhancing, and image thinning. The process is as follows: an input image is first adjusted to a 256×256 content image by a bilinear downsampling layer and then stylized by a style subnet; the stylized result, as the first output image, is upsampled to 512×512 and enhanced by an enhancement subnet to obtain the second output image; the second output image is adjusted to 1024×1024, and finally a thinning subnet removes locally pixelated artifacts and further refines the result to obtain a high-resolution output. The image style migration method disclosed by the invention can simulate the brushwork of artwork more closely; multiple models are combined into one network so as to process the ever-larger images shot by modern digital cameras; and the combined model can be trained to realize the migration of multiple artistic styles.
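The coarse-to-fine 256 → 512 → 1024 pipeline in the abstract can be sketched as below. The subnet bodies are placeholders (the patent's actual CNN architectures are not given in this excerpt), and a nearest-neighbour resize stands in for the sampling layers; only the staging logic is illustrated.

```python
import numpy as np

def resize(img, size):
    # Nearest-neighbour stand-in for the bilinear up/down-sampling layers.
    h, w, _ = img.shape
    ys = np.arange(size) * h // size
    xs = np.arange(size) * w // size
    return img[ys][:, xs]

def style_subnet(x):   return x  # placeholder for the stylization CNN
def enhance_subnet(x): return x  # placeholder for the enhancement CNN
def refine_subnet(x):  return x  # placeholder for the refinement CNN

def multimodal_transfer(content):
    x = style_subnet(resize(content, 256))   # first output, 256x256
    x = enhance_subnet(resize(x, 512))       # second output, 512x512
    x = refine_subnet(resize(x, 1024))       # refined high-resolution result
    return x

result = multimodal_transfer(np.random.rand(800, 600, 3))
print(result.shape)  # (1024, 1024, 3)
```

Each stage works at a fixed resolution, which is what lets one combined network handle arbitrarily large camera inputs.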

Description

technical field

[0001] The invention relates to the field of deep learning, and in particular to an image style transfer method based on a deep convolutional neural network.

Background technique

[0002] With the rapid development of science and technology in the field of deep learning research, feed-forward neural networks have been widely used for fast style transfer, and transferring the artistic style of works of art to everyday photos has become a popular computer vision task in both academia and industry. However, when these stylized networks are applied directly to high-resolution images, the style of local regions usually does not closely resemble the desired artistic style, because the transfer process cannot capture small, complex textures or maintain the correct texture scale of the artwork. Moreover, because the transfer is an online iterative optimization process, its runtime is very long. However, if the image style transfer method based on deep...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T3/00
CPC: G06T3/04
Inventor 夏春秋
Owner SHENZHEN WEITESHI TECH