
Unstructured light field rendering method

An unstructured light field technology, applied in 3D image processing, image data processing, instruments, etc. It addresses problems such as long rendering times, the inability to render images at new viewpoints, and a lack of realism, with the effects of improving realism, enabling flexible rendering algorithms, and eliminating ghosting.

Active Publication Date: 2019-03-26
奥本未来(北京)科技有限责任公司
View PDF · 5 Cites · 9 Cited by

AI Technical Summary

Problems solved by technology

[0003] At this stage, real-time 3D rendering uses models, textures, and materials to simulate the appearance of real-world entities. The results lack realism and cannot express complex lighting effects. Offline rendering achieves better quality, but requires extremely long computation times.
Existing light field rendering techniques require sampling positions to be evenly distributed on a surface of fixed shape, cannot render images from new viewpoint positions off that surface, and produce incorrect ghosting in the rendered results.

Method used



Examples


Detailed Description of the Embodiments

[0024] The following examples are used to illustrate the present invention, but are not intended to limit the scope of the present invention.

[0025] Referring to Figure 1, this embodiment provides an unstructured light field rendering method comprising the following steps:

[0026] S1: Data preparation: obtain the sampled images of the light field, generate a basic triangular mesh with the sampled viewpoint positions as vertices, establish the camera parameters of each sampled viewpoint and the geometric model of the scene to be rendered, and record the depth value of the scene under each sampled viewpoint according to the pixels of the sampled image;
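As a minimal sketch of the depth-recording part of S1, a pinhole projection can map scene points into a sampled camera and record their camera-space depth. The intrinsics `K`, rotation `R`, and translation `t` below are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Assumed pinhole camera for one sampled viewpoint (K, R, t are illustrative).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)

def project_with_depth(points_world):
    """Project world-space points into the camera; return pixel coords and depths."""
    cam = (R @ points_world.T).T + t      # world -> camera frame
    depth = cam[:, 2]                     # camera-frame z is the recorded depth
    pix = (K @ cam.T).T
    pix = pix[:, :2] / pix[:, 2:3]        # perspective divide
    return pix, depth

pts = np.array([[0.0, 0.0, 2.0],
                [0.5, 0.0, 4.0]])
uv, z = project_with_depth(pts)
# uv[0] lands on the principal point (320, 240); z records depths 2.0 and 4.0
```

Storing such depths per pixel of each sampled image is what later allows depth-aware reprojection and filtering.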

[0027] S2: Viewpoint triangular mesh splitting: split the basic triangular mesh whose vertices are the sampled viewpoint positions, generating for each vertex a sub-triangular mesh carrying a viewpoint index and blend-weight attributes; the sub-triangular meshes form a triangular mesh set;
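Step S2 attaches a blend weight to each sampled viewpoint within its triangle. A common way to compute such per-vertex weights is barycentric interpolation; the sketch below is an assumed illustration of that idea, not the patent's exact splitting scheme:

```python
import numpy as np

def barycentric_weights(tri, p):
    """Barycentric coordinates of 2D point p inside triangle tri (3x2 array)."""
    a, b, c = tri
    m = np.column_stack([b - a, c - a])   # 2x2 matrix of the triangle's edge vectors
    u, v = np.linalg.solve(m, p - a)      # solve p - a = u*(b - a) + v*(c - a)
    return np.array([1.0 - u - v, u, v])  # the three weights sum to 1

# Viewpoint triangle in some 2D parameter plane (example coordinates).
tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
w = barycentric_weights(tri, np.array([0.25, 0.25]))
# w == [0.5, 0.25, 0.25]
```

Weights like these determine how strongly each sampled viewpoint's image contributes when the sub-triangles are later rendered and blended.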

[0028] S3: Geometry...



Abstract

The invention discloses an unstructured light field rendering method: obtain the light field sampling images; take the sampling viewpoint positions as the vertices of a basic triangular mesh; establish the camera parameters of the sampling viewpoints and the geometric model of the scene to be rendered; record the depth value of the scene under each sampling viewpoint; split the sampling-viewpoint triangular mesh, generating an independent sub-triangular mesh for each vertex; record the geometric information of the scene under the viewpoint to be rendered; then render the sampling-viewpoint triangular mesh set, reproject to the target viewpoint using the geometric information, sample the light field images, and blend the sampling results of the viewpoints. The method determines the influence range and weight of the sampling data through the triangular mesh formed by the viewpoint positions when the sampled light field is split, reprojects the scene geometric model, and performs depth-based bilateral filtering on the light field samples, thereby rendering the scene image at any viewpoint. Light field sampling data from various sources can be fully utilized, and a realistic rendering result can be generated in real time.
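The final blending step described in the abstract, mixing the sampling results of several viewpoints by their triangle weights, can be sketched as a simple weighted sum. The weights and RGB samples below are made-up example values, not data from the patent:

```python
import numpy as np

# Made-up blend weights from the containing viewpoint triangle (sum to 1).
weights = np.array([0.5, 0.3, 0.2])
# Made-up RGB samples fetched from the three surrounding sampled viewpoints.
colors = np.array([[200.0,  10.0,  10.0],
                   [ 10.0, 200.0,  10.0],
                   [ 10.0,  10.0, 200.0]])
blended = weights @ colors   # per-channel weighted mix of the viewpoint samples
# blended == [105.0, 67.0, 48.0]
```

In the actual method this mix would run per pixel, with weights varying smoothly across each sub-triangle so that contributions fade out toward triangle edges.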

Description

Technical Field

[0001] The invention relates to the technical field of graphics processing, and in particular to an unstructured light field rendering method.

Background Technique

[0002] A light field is a parametric representation of the four-dimensional radiance field that contains both position and direction information in space; it is the totality of all light radiation functions in space. The real information of the entire spatial environment can be obtained at any angle and position in space, and image information obtained using the light field is more comprehensive and of better quality. In an unstructured environment, the appearance of surface materials is uneven, structure and size change irregularly and unstably, and the environmental information is not fixed, unknown, and indescribable.

[0003] At this stage, real-time 3D rendering uses models, textures, and materials to simulate the appearance of real-world entities. The result lacks ...
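The four-dimensional radiance field mentioned in [0002] is often illustrated with the classic two-plane parameterization L(u, v, s, t). The toy lookup below is a generic illustration under that assumption, not the patent's own representation, and the grid shapes are arbitrary:

```python
import numpy as np

# Toy two-plane light field L[u, v, s, t]: a 4x4 grid of cameras on the
# (u, v) plane, each holding an 8x8 image on the (s, t) plane (shapes assumed).
rng = np.random.default_rng(0)
L = rng.random((4, 4, 8, 8))

def sample_ray(u, v, s, t):
    """Radiance of the ray through (u, v) on the camera plane and
    (s, t) on the image plane (nearest-neighbor lookup)."""
    return L[u, v, s, t]

r = sample_ray(1, 2, 3, 4)   # one ray of the 4D radiance function
```

A structured light field samples (u, v) on a regular grid; the unstructured method of this patent instead allows viewpoints at arbitrary positions, triangulated rather than gridded.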

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T15/50
CPC: G06T15/50
Inventors: 沈方阳, 储备, 涂子豪, 雷宇, 贾梦
Owner: 奥本未来(北京)科技有限责任公司