
Method for processing large-scale complex three-dimensional scenes based on an octree

A three-dimensional scene, large-scale rendering technology, classified under image data processing, 3D image processing, and instruments. Achieved effects: scene organization that is easy to extend, and greatly improved rendering speed.

Inactive Publication Date: 2008-10-08
SHANGHAI UNIV
Cites: 0 | Cited by: 69

AI Technical Summary

Problems solved by technology

[0007] The purpose of the present invention is to overcome the shortcomings of existing graphics rendering engines in the real-time rendering of large-scale complex 3D scenes, and to provide an octree-based method for processing such scenes. The method uses an octree to organize the scene and optimize its management in tree form; on this basis, the bounding boxes of the 3D scene are used to perform fast view-frustum culling. This optimizes the management of the entire complex scene, greatly improves the rendering rate, and meets real-time requirements.




Detailed Description of the Embodiments

[0031] A preferred embodiment of the present invention is described below with reference to the accompanying drawings:

[0032] This octree-based method for processing large-scale complex 3D scenes comprises four steps:

[0033] Step 1: Load a large-scale 3D scene, and use a scene graph to organize all elements in the scene.

[0034] For a graphics rendering engine, how scene elements are organized is a crucial issue: a good organization scheme not only aids memory management but also speeds up rendering. The present invention uses a scene graph to organize the scene. This organization mode reduces memory overhead, accelerates scene rendering, and facilitates real-time interaction.
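The scene-graph organization described in Step 1 can be sketched as a simple tree of named nodes, each optionally carrying geometry. This is an illustrative sketch, not the patent's implementation; the `SceneNode` class and its fields are assumptions for demonstration.

```python
from dataclasses import dataclass, field

@dataclass
class SceneNode:
    """A node in the scene graph: a name, optional geometry, and child nodes."""
    name: str
    geometry: object = None          # placeholder for mesh data
    children: list = field(default_factory=list)

    def add(self, child):
        self.children.append(child)
        return child

    def traverse(self, visit):
        """Depth-first traversal; `visit` is called once per node."""
        visit(self)
        for c in self.children:
            c.traverse(visit)

# Build a tiny scene: a root with two groups of objects.
root = SceneNode("root")
terrain = root.add(SceneNode("terrain"))
buildings = root.add(SceneNode("buildings"))
buildings.add(SceneNode("building_0", geometry="mesh_0"))
buildings.add(SceneNode("building_1", geometry="mesh_1"))

names = []
root.traverse(lambda n: names.append(n.name))
```

A single depth-first pass visits every element exactly once, which is what makes the scene graph convenient for both memory management and render-order computation.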

[0035] Step 2: Establish and generate the octree structure of the scene, and record relevant information.

[0036] 1. Establish the basic data ...
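The basic octree data structure of Step 2 can be illustrated as follows. This is a minimal sketch under assumed parameters (`MAX_DEPTH`, `MAX_OBJECTS`, point objects instead of meshes); the patent's recorded node information is not reproduced here.

```python
from dataclasses import dataclass, field

@dataclass
class AABB:
    """Axis-aligned bounding box given by its min and max corners."""
    min: tuple
    max: tuple
    def contains(self, p):
        return all(self.min[i] <= p[i] <= self.max[i] for i in range(3))

@dataclass
class OctreeNode:
    box: AABB
    depth: int = 0
    objects: list = field(default_factory=list)
    children: list = field(default_factory=list)

MAX_DEPTH = 4     # assumed subdivision limit
MAX_OBJECTS = 2   # assumed per-leaf capacity

def subdivide(node):
    """Split a node's bounding box into eight octants (one child per octant)."""
    lo, hi = node.box.min, node.box.max
    mid = tuple((lo[i] + hi[i]) / 2.0 for i in range(3))
    for octant in range(8):
        cmin, cmax = [], []
        for axis in range(3):
            if (octant >> axis) & 1:
                cmin.append(mid[axis]); cmax.append(hi[axis])
            else:
                cmin.append(lo[axis]); cmax.append(mid[axis])
        node.children.append(OctreeNode(AABB(tuple(cmin), tuple(cmax)), node.depth + 1))

def insert(node, point):
    """Insert a point; a leaf splits once it exceeds MAX_OBJECTS."""
    if node.children:
        for c in node.children:
            if c.box.contains(point):
                insert(c, point)
                return
        return  # point outside this node's octants; ignored in this sketch
    node.objects.append(point)
    if len(node.objects) > MAX_OBJECTS and node.depth < MAX_DEPTH:
        subdivide(node)
        pts, node.objects = node.objects, []
        for p in pts:
            insert(node, p)

root = OctreeNode(AABB((0, 0, 0), (8, 8, 8)))
for p in [(1, 1, 1), (7, 7, 7), (1, 7, 1), (6, 2, 6)]:
    insert(root, p)
```

After the third insertion the root splits into eight children and redistributes its points, so each scene element ends up in the octant whose bounding box contains it.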



Abstract

The invention relates to an octree-based method for processing large-scale complex three-dimensional scenes, belonging to the fields of computer graphics and virtual reality. The method comprises the steps of: (1) loading the large-scale three-dimensional scene and organizing all scene elements with a scene graph; (2) building the octree structure of the scene and recording the related information; (3) using the octree structure to perform fast view-frustum culling through an intersection test between the view frustum and the bounding boxes; (4) rendering the objects inside the view frustum. Addressing the deficiency of existing rendering engines in rendering large-scale complex three-dimensional scenes, the invention organizes the scene with an octree structure, culls the geometry nodes outside the view frustum using spatial information and the tree structure, quickly computes the sequence of nodes that need to be rendered, and reduces the number of triangles submitted to the rendering pipeline, thereby effectively improving the rendering speed and meeting real-time requirements.
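Step (3) of the abstract, culling octree nodes whose bounding boxes lie outside the view frustum, can be sketched with the standard plane/AABB "p-vertex" test. This is an illustrative sketch, not the patent's algorithm text; the plane representation and the axis-aligned example frustum are assumptions.

```python
def aabb_outside_plane(box_min, box_max, plane):
    """plane = (a, b, c, d), with the inside half-space a*x + b*y + c*z + d >= 0.
    Returns True when the whole box lies on the negative (outside) side."""
    a, b, c, d = plane
    # Pick the box corner furthest along the plane normal (the "p-vertex");
    # if even that corner is outside, the whole box is outside.
    px = box_max[0] if a >= 0 else box_min[0]
    py = box_max[1] if b >= 0 else box_min[1]
    pz = box_max[2] if c >= 0 else box_min[2]
    return a * px + b * py + c * pz + d < 0

def box_in_frustum(box_min, box_max, planes):
    """Conservative test: cull the box only if it is fully outside some plane."""
    return not any(aabb_outside_plane(box_min, box_max, pl) for pl in planes)

# A toy "frustum": six axis-aligned planes bounding the region 0..10 on each axis.
planes = [
    (1, 0, 0, 0),    # x >= 0
    (-1, 0, 0, 10),  # x <= 10
    (0, 1, 0, 0),    # y >= 0
    (0, -1, 0, 10),  # y <= 10
    (0, 0, 1, 0),    # z >= 0
    (0, 0, -1, 10),  # z <= 10
]
```

Because the test is conservative, a box straddling the frustum boundary is kept and its octree children are tested in turn; only subtrees whose bounding boxes are entirely outside are skipped, which is what reduces the triangle count sent to the rendering pipeline.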

Description

Technical field

[0001] The invention relates to computer graphics and virtual reality technology, and mainly relates to an octree-based method for processing large-scale complex three-dimensional scenes.

Background

[0002] Real-time rendering of large-scale scenes has a very wide range of applications in virtual reality, geographic information systems, flight simulation, urban planning, and 3D games, and has long been a research hotspot.

[0003] Although the performance of graphics processors has developed rapidly, it still cannot meet the needs of real-time rendering of large-scale complex scenes. Therefore, efficient algorithms must be designed to further improve the rendering speed of complex scenes. Unlike photorealistic rendering, graphics technology in virtual reality can, without affecting the visual effect, sacrifice some rendering quality to improve rendering speed. ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T15/00; G06T9/40
Inventors: 万旺根, 余小清, 周俊玮, 林继承
Owner: SHANGHAI UNIV