
WGAN model method based on a deep convolutional neural network

A deep convolutional neural network technology, applied in the field of deep learning neural networks. It addresses the problems of slow training speed, a discriminator loss that cannot indicate the direction of network training, and a generator that cannot learn the characteristics of the data set, giving the training process a clear direction.

Inactive Publication Date: 2018-01-09
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0003] However, in the traditional GAN model, the loss function of the discriminator cannot indicate the direction of network training; that is, no indicator reflects the quality of the generated images.
As a result, the network can be trained endlessly: there is no clear termination condition for training a generative adversarial network.
[0004] In the standard WGAN model, the generator and the discriminator are traditional fully-connected structures. During training, the generator can correct the images it produces only through back-propagation of the error from the discriminator's loss function.
In this case, network training is relatively slow, and the generator has no direct way to learn the characteristics of the data set.
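The "no training direction" problem above can be illustrated with a toy example of our own (not from the patent): take the real distribution to be a point mass at 0 and the generated distribution a point mass at theta. When the two supports are disjoint, the Jensen-Shannon divergence behind the classic GAN loss is the constant log(2), so it gives the generator no sense of progress, while the Wasserstein-1 distance is |theta| and shrinks smoothly as the generated samples approach the real ones.

```python
import numpy as np

def w1_point_masses(theta):
    """Wasserstein-1 distance between point masses at 0 and at theta."""
    return abs(theta)

def js_disjoint():
    """JS divergence between any two distributions with disjoint supports."""
    return float(np.log(2.0))

# As theta -> 0 the Wasserstein distance reports steady improvement,
# while the JS divergence stays flat at ~0.6931 until the supports overlap.
for theta in (4.0, 2.0, 0.5):
    print(theta, w1_point_masses(theta), round(js_disjoint(), 4))
```

This is why a Wasserstein-based loss can serve as a quality indicator and a termination signal where the classic GAN loss cannot.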


Embodiment

[0025] This embodiment discloses a WGAN model method based on a deep convolutional neural network, which specifically includes the following steps:

[0026] Step S1, constructing a Wasserstein Generative Adversarial Network WGAN model, which includes a generator and a discriminator.

[0027] Step S2, constructing the discriminator into a deep convolutional neural network structure;

[0028] The discriminator is constructed as a deep convolutional neural network. It is divided into several layers, and each layer has its own convolution kernel, that is, its own weight parameters.
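The patent does not specify layer sizes, so as a hedged sketch assume DCGAN-style 4x4 kernels with stride 2 and padding 1 (an assumption, not the patent's figures). The standard convolution output-size formula then shows how a stack of such layers shrinks an image into the compact feature maps a discriminator scores:

```python
def conv_out(size, kernel, stride, pad):
    """Spatial output size of a 2-D convolution along one axis."""
    return (size + 2 * pad - kernel) // stride + 1

# A 64x64 input shrunk by four stride-2 convolution layers.
size = 64
for _ in range(4):
    size = conv_out(size, kernel=4, stride=2, pad=1)
print(size)  # 4
```

Each halving layer carries its own kernel (weight tensor), matching the per-layer weight parameters described above.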

[0029] Step S3, constructing the generator into a transposed convolutional neural network structure;

[0030] The generator has the same number of convolutional layers as the discriminator, and each of its convolution kernels is the transpose of the corresponding discriminator kernel.
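Under the same assumed DCGAN-style hyperparameters (4x4 kernels, stride 2, padding 1, which the patent does not state), the transposed-convolution output-size formula inverts the discriminator's shape map, so a generator with the same layer count expands a small noise tensor back to full image size:

```python
def deconv_out(size, kernel, stride, pad):
    """Spatial output size of a 2-D transposed convolution along one axis."""
    return (size - 1) * stride - 2 * pad + kernel

# Four stride-2 transposed-convolution layers grow a 4x4 map back to 64x64,
# mirroring the four convolution layers of the discriminator sketched above.
size = 4
for _ in range(4):
    size = deconv_out(size, kernel=4, stride=2, pad=1)
print(size)  # 64
```

This shape symmetry is what lets the generator's layer structure mirror the discriminator's layer for layer.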

[0031] Step S4, adopting the Wasserstein distance as the loss function of the discriminator.
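A minimal sketch of the WGAN loss in Step S4 (variable names are our own): the discriminator, acting as a critic, outputs an unbounded score per image, and its loss is the mean score on generated samples minus the mean score on real samples, an estimate of the negated Wasserstein distance.

```python
import numpy as np

def critic_loss(scores_real, scores_fake):
    # Critic minimises E[D(G(z))] - E[D(x)], i.e. maximises the score gap,
    # which estimates the (negated) Wasserstein distance between the
    # data distribution and the generator's distribution.
    return float(np.mean(scores_fake) - np.mean(scores_real))

def generator_loss(scores_fake):
    # Generator minimises the negated mean critic score on its samples.
    return float(-np.mean(scores_fake))

real = np.array([1.0, 2.0, 3.0])    # critic scores on data-set images
fake = np.array([-1.0, 0.0, -2.0])  # critic scores on generated images
print(critic_loss(real, fake))      # -1.0 - 2.0 = -3.0
print(generator_loss(fake))         # 1.0
```

In the full WGAN procedure the critic's weights are also clipped to a small interval after each update to enforce the Lipschitz constraint; that step is omitted from this sketch.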



Abstract

The invention discloses a WGAN model method based on a deep convolutional neural network, belonging to the field of deep learning neural networks. The method comprises the following steps: S1, constructing a Wasserstein generative adversarial network (WGAN) model; S2, constructing the discriminator as a deep convolutional neural network; S3, constructing the generator as a transposed convolutional neural network; S4, employing the Wasserstein distance as the loss function of the discriminator; S5, preparing a data set and training the constructed network. Based on the characteristics of generative adversarial network models, the method combines a deep convolutional neural network with WGAN: both the generator and the discriminator are built as deep convolutional networks while the WGAN loss function is retained. The network can learn image features during training, and its loss function reflects the quality of the generated images.

Description

technical field [0001] The invention relates to the technical field of deep learning neural networks, in particular to a WGAN model method based on a deep convolutional neural network. Background technique [0002] The Generative Adversarial Network (GAN) is a framework proposed by Goodfellow in 2014. Based on ideas from game theory, it constructs two models: a generator and a discriminator. The former generates images from uniform (0, 1) or Gaussian random noise, and the latter judges whether an input image comes from the data set or was produced by the generator. Each time the discriminator completes a judgment, it returns the resulting error to the generator. [0003] However, in the traditional GAN model, the loss function of the discriminator cannot indicate the direction of network training; that is, no indicator reflects the quality of the generated images. As a result, the network can be trained endlessly, with no clear termination condition.


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08, G06N3/04
Inventors: 周智恒, 李立军
Owner: SOUTH CHINA UNIV OF TECH