
Stable controllable image generation model training method based on W distance

An image generation and model training technology, applied in the field of image recognition, that solves the problems of an unstable model training gradient and an unstable gradient descent direction, and achieves the effects of ensuring normal convergence, alleviating mode collapse, and increasing training stability.

Active Publication Date: 2020-12-11
HEBEI UNIVERSITY

AI Technical Summary

Problems solved by technology

[0028] The purpose of the present invention is to provide a stable and controllable image generation model training method based on the W distance, so as to solve the problems of an unstable model training gradient and an unstable gradient descent direction in the prior art.


Embodiment Construction

[0078] The stable and controllable image generation (CWBLI) model training method based on W (Wasserstein) distance of the present invention comprises the following steps:

[0079] a. Preprocessing the image data to obtain the sample data of the training set and the test set.

[0080] The CelebA face dataset is used as sample data and divided into a training set and a test set; specifically, 180,000 images can be selected as the training set and the remaining 22,599 images used as the test set. In the original CelebA dataset, each image has 178×218 pixels. In order to highlight the required facial features and reduce the complexity of model training, each image in the training set and test set is cropped, as shown in Figure 11, taking 000001.jpg as an example of the processing flow: a picture with a fixed face position and a size of 64×64 pixels is cropped out of each image, and the cropped picture data is then normalized.
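A minimal sketch of this preprocessing step is given below, assuming a local copy of the aligned 178×218 CelebA JPEGs. The exact crop box, the crop-then-resize strategy, and the normalization to [-1, 1] are illustrative assumptions; this excerpt only specifies a fixed 64×64 face crop followed by normalization, and does not say whether the patch is cropped directly at 64×64 or obtained by resizing a larger face region.

```python
# Minimal sketch of step a (assumptions: aligned 178x218 CelebA JPEGs, an
# illustrative crop box, crop-then-resize, and [-1, 1] normalization; these
# are not values taken from the patent text).
from pathlib import Path

import numpy as np
from PIL import Image


def preprocess_celeba_image(path, crop_box=(25, 65, 153, 193), out_size=(64, 64)):
    """Crop a fixed face region from a CelebA image, resize it to 64x64 pixels
    and normalize the pixel values to [-1, 1]."""
    img = Image.open(path).convert("RGB")
    face = img.crop(crop_box)        # fixed face-position crop (left, top, right, bottom)
    face = face.resize(out_size)     # 64x64 pixels
    return np.asarray(face, dtype=np.float32) / 127.5 - 1.0   # shape (64, 64, 3)


def build_splits(image_dir, n_train=180_000):
    """Split the image files into a training set and a test set by index."""
    files = sorted(Path(image_dir).glob("*.jpg"))
    return files[:n_train], files[n_train:]


# Example usage (the directory path is hypothetical):
# train_files, test_files = build_splits("CelebA/img_align_celeba")
# x0 = preprocess_celeba_image(train_files[0])   # e.g. 000001.jpg
```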

[0081] b. Co...


Abstract

The invention provides a stable controllable image generation model training method based on a W distance. The method comprises the following steps: a, preprocessing image data to acquire sample data of a training set; b, constructing a stable controllable image generation model based on a W distance; c, constructing an overall model according to the loss function of each network; d, alternately iterating model training with a gradient descent algorithm to ensure normal convergence of the model parameters; and e, after the model parameters converge, separating the generator network G, the encoder network E and the classifier network C as independent service products. The invention is an end-to-end network model that improves on prior-art models in two respects, the model structure and the distribution measurement standard; it solves the problems of an unstable model training gradient and an unstable gradient descent direction, and can stably and directionally generate image samples. Robustness of the model training process and controllability of the generated samples are improved.
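The alternating training described in steps b–e can be pictured with a short PyTorch-style sketch. Everything concrete here is an assumption made for illustration: the separate critic network D that estimates the W distance, the WGAN-GP-style gradient penalty, and the placeholder reconstruction and classification terms; the patent's actual networks and loss functions are not reproduced.

```python
# Illustrative alternating-update loop in the spirit of steps c-d (assumptions:
# PyTorch, image batches of shape (B, 3, 64, 64), a separate critic network D
# estimating the W distance, a WGAN-GP-style gradient penalty, and placeholder
# reconstruction/classification terms standing in for the patent's losses).
import torch
import torch.nn.functional as F


def gradient_penalty(D, x_real, x_fake, lam=10.0):
    """Standard WGAN-GP penalty keeping the critic approximately 1-Lipschitz."""
    eps = torch.rand(x_real.size(0), 1, 1, 1, device=x_real.device)
    x_hat = (eps * x_real + (1 - eps) * x_fake).requires_grad_(True)
    grads = torch.autograd.grad(D(x_hat).sum(), x_hat, create_graph=True)[0]
    return lam * ((grads.flatten(1).norm(2, dim=1) - 1) ** 2).mean()


def train_step(G, E, C, D, x_real, y_real, opt_gec, opt_d, z_dim=128):
    """One alternating iteration: first update the critic D, then G, E and C."""
    batch = x_real.size(0)

    # Critic update: maximize the estimated W distance between real and generated data.
    z = torch.randn(batch, z_dim, device=x_real.device)
    x_fake = G(z).detach()
    d_loss = D(x_fake).mean() - D(x_real).mean() + gradient_penalty(D, x_real, x_fake)
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator / encoder / classifier update.
    x_fake = G(torch.randn(batch, z_dim, device=x_real.device))
    g_loss = -D(x_fake).mean()                    # pull generated samples toward the real distribution
    rec_loss = F.mse_loss(G(E(x_real)), x_real)   # encoder reconstruction term (assumed)
    cls_loss = F.cross_entropy(C(x_real), y_real) # classifier term for controllability (assumed)
    total = g_loss + rec_loss + cls_loss
    opt_gec.zero_grad()
    total.backward()
    opt_gec.step()
    return d_loss.item(), total.item()
```

After the parameters converge (step e), G, E and C would simply be saved and deployed separately, for example with torch.save(G.state_dict(), ...).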

Description

Technical field

[0001] The invention relates to the technical field of image recognition, and in particular to a stable and controllable image generation model training method based on the W distance.

Background technique

[0002] In recent years, Generative Adversarial Networks (GAN) and their derived generative models have been one of the core topics in the machine learning and deep learning communities, and BiGAN is among the most important deep generative models for high-dimensional complex data modeling. The BiGAN model introduces an encoder network E into the GAN model so that the latent variables of similar samples are gathered together during encoding, making the low-dimensional manifold continuous and achieving an implicit regularization effect, which can improve the generalization ability of the model.

[0003] However, in the training process of the BiGAN model, it is very easy for the support set of the real data distribution and the generated sample...
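For readers unfamiliar with BiGAN, the sketch below illustrates the joint (x, z) discriminator idea this paragraph refers to: the encoder pairs a real image with its latent code, the generator pairs a latent code with its generated image, and a single discriminator scores both kinds of pairs. The fully connected layers and their sizes are illustrative assumptions, not the architecture claimed in the patent.

```python
# Minimal BiGAN-style components (assumptions: simple fully connected networks
# and a 64x64x3 image flattened to a vector; not the patent's architecture).
import torch
import torch.nn as nn

X_DIM, Z_DIM = 64 * 64 * 3, 128

G = nn.Sequential(nn.Linear(Z_DIM, 1024), nn.ReLU(), nn.Linear(1024, X_DIM), nn.Tanh())
E = nn.Sequential(nn.Linear(X_DIM, 1024), nn.ReLU(), nn.Linear(1024, Z_DIM))

# The BiGAN discriminator scores joint pairs: (real x, E(x)) versus (G(z), z),
# which ties the encoder's latent space to the generator's input space.
D = nn.Sequential(nn.Linear(X_DIM + Z_DIM, 1024), nn.ReLU(), nn.Linear(1024, 1))

x = torch.randn(8, X_DIM)                       # a batch of flattened "real" images
z = torch.randn(8, Z_DIM)                       # a batch of latent codes
score_real_pair = D(torch.cat([x, E(x)], dim=1))
score_fake_pair = D(torch.cat([G(z), z], dim=1))
```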


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N 3/04, G06N 3/08, G06K 9/62
CPC: G06N 3/084, G06N 3/045, G06F 18/24
Inventors: 董春茹, 刘轶功, 花强, 张峰, 赵世朋
Owner: HEBEI UNIVERSITY