Model texture generation method based on generative adversarial network

A network and texture technology applied in the field of computer graphics. It addresses problems such as large color differences between textures, overlapping colors at texture boundaries, and cracks, and achieves the effects of reducing production costs and ensuring texture diversity.

Pending Publication Date: 2019-07-26
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

A 3D model is composed of many triangles, but these triangles themselves have no color values.
However, the texture images obtained by this method are affected by the actual lighting conditions. Different sampling environments and sampling positions lead to differently lit texture images, which results in large color differences between textures when the model is finally colored. At the same time, texture boundaries exhibit overlapping colors and cracks.



Examples


Detailed Description of the Embodiments

[0051] In order to describe the present invention more specifically, the model texture generation method of the present invention is described in detail below with reference to the drawings and specific embodiments.

[0052] As shown in Figure 1, a model texture generation method based on a generative adversarial network includes the following steps:

[0053] In step (1), for the input model, that is, a 3D model without a texture map, sample pictures of the model are obtained from multiple viewing angles according to a given sampling rule. This specifically includes the following steps (see the sketch after these steps):

[0054] 1-1. Set the model's coordinates so that the center of the model is moved to the coordinate origin of the virtual world.

[0055] 1-2. Set the initial position of the camera, the orientation of the lens, and the size of the viewport.

[0056] 1-3. Sample the model in sequence according to the set sampling points, specifically:

[0057] 1-3-1...
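Steps 1-1 to 1-3 amount to centering the model at the world origin and driving a virtual camera around it according to the sampling rule. The sketch below is a minimal illustration of that setup, not the patent's exact procedure: the sphere radius, number of views, elevation angle, helper names, and look-at convention are all assumptions introduced here.

```python
# Minimal sketch of steps 1-1 to 1-3, assuming the camera orbits the model on a sphere.
import numpy as np

def center_model(vertices: np.ndarray) -> np.ndarray:
    """Step 1-1: translate the model so its bounding-box center sits at the world origin."""
    center = (vertices.min(axis=0) + vertices.max(axis=0)) / 2.0
    return vertices - center

def look_at(eye: np.ndarray, target: np.ndarray,
            up: np.ndarray = np.array([0.0, 1.0, 0.0])) -> np.ndarray:
    """Step 1-2: build a 4x4 view matrix for a camera at `eye` aimed at `target`."""
    forward = target - eye
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ eye
    return view

def sampling_cameras(n_views: int = 8, radius: float = 2.5, elevation_deg: float = 20.0):
    """Step 1-3: yield one view matrix per sampling point spaced evenly around the model."""
    elev = np.deg2rad(elevation_deg)
    for i in range(n_views):
        azim = 2.0 * np.pi * i / n_views
        eye = radius * np.array([np.cos(elev) * np.cos(azim),
                                 np.sin(elev),
                                 np.cos(elev) * np.sin(azim)])
        yield look_at(eye, np.zeros(3))
```

Each view matrix would then be handed to whatever renderer produces the sampled picture for that viewpoint.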



Abstract

The invention discloses a model texture generation method based on a generative adversarial network. The method comprises the following steps: (1) for an input model, namely a three-dimensional model without a texture map, acquiring sample pictures of the model from a plurality of viewing angles according to a given sampling rule; (2) carrying out edge detection on the sample pictures and converting them into line-frame images; (3) establishing and training a generative adversarial network for generating color textures, taking the line-frame images as the input of the network and outputting color texture images; and (4) mapping the plurality of color texture images output by the network onto the original model to realize texture coloring and obtain a three-dimensional model with a texture map. By building and training a deep-convolution-based generative adversarial network to generate texture pictures, the method improves the production efficiency of texture maps and saves cost; at the same time, the environmental factors of texture generation are unified, ensuring the consistency of the final texture coloring.
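Step (2) of the abstract converts each sampled picture into a line-frame image via edge detection. The patent excerpt does not name a specific edge detector; the sketch below is one plausible realization using OpenCV's Canny detector, and the helper name to_line_frame, the blur kernel, and the threshold values are assumptions made only for illustration.

```python
# One possible way to turn a sampled view into a line-frame image (step 2): Canny edges,
# inverted so the result is dark lines on a white background.
import cv2

def to_line_frame(sample_path: str, out_path: str,
                  low_thresh: int = 60, high_thresh: int = 160) -> None:
    img = cv2.imread(sample_path)                     # sampled picture of the model
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress fine noise before edge detection
    edges = cv2.Canny(gray, low_thresh, high_thresh)  # binary edge map
    line_frame = 255 - edges                          # dark lines on white background
    cv2.imwrite(out_path, line_frame)
```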

Description

Technical Field

[0001] The invention belongs to the technical field of computer graphics, and in particular relates to a method for generating model textures based on generative adversarial networks.

Background

[0002] Model texture plays an important role in the field of 3D model making. A 3D model is composed of many triangles, but these triangles themselves have no color values. If the surface of a 3D model is to have color and a sense of unevenness, a texture map must be created and mapped onto the surface of the model in a specific way, so that the model has a more realistic visual effect. Therefore, the realism and level of detail of a model depend on the level of detail of its textures.

[0003] At present, the main way of producing model textures is for art engineers to first perform UV unwrapping on the model and set up the texture layout, and then use professional digital drawing software to manually draw the corresponding texture pattern. The...
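Step (3) of the abstract specifies a deep-convolution-based generative adversarial network that maps line-frame images to color texture images, but this excerpt does not disclose the network's layers. The following is a minimal pix2pix-style encoder-decoder generator sketched in PyTorch purely for illustration; the class name, layer widths, and choice of framework are assumptions, not details from the patent.

```python
# Illustrative generator only: maps a 1-channel line-frame image to a 3-channel color texture.
import torch
import torch.nn as nn

class LineFrameToTextureGenerator(nn.Module):
    def __init__(self, base: int = 64):
        super().__init__()
        # Downsampling path: 1 x H x W line frame -> compact feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, base, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base, base * 2, 4, stride=2, padding=1),
            nn.BatchNorm2d(base * 2), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base * 2, base * 4, 4, stride=2, padding=1),
            nn.BatchNorm2d(base * 4), nn.LeakyReLU(0.2, inplace=True),
        )
        # Upsampling path: feature map -> 3 x H x W color texture in [-1, 1].
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(base * 4, base * 2, 4, stride=2, padding=1),
            nn.BatchNorm2d(base * 2), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base * 2, base, 4, stride=2, padding=1),
            nn.BatchNorm2d(base), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base, 3, 4, stride=2, padding=1),
            nn.Tanh(),
        )

    def forward(self, line_frame: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(line_frame))

# Example: a batch of 256x256 line-frame images -> 256x256 color texture images.
# g = LineFrameToTextureGenerator()
# fake_textures = g(torch.randn(4, 1, 256, 256))
```

A matching discriminator and an adversarial (typically plus reconstruction) loss would complete the training setup; those details are likewise not given in this excerpt.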

Claims


Application Information

IPC(8): G06T15/04; G06T7/13
CPC: G06T15/04; G06T7/13
Inventors: 黄枭, 王毅刚
Owner: HANGZHOU DIANZI UNIV