
A projection mapping method of a video in a three-dimensional scene based on a screen space

A three-dimensional scene and screen-space technology, applied in the fields of texture mapping and projection, achieving robust results, a clear algorithm, and high operating efficiency

Active Publication Date: 2019-03-29
ZHEJIANG UNIV


Problems solved by technology

[0006] The present invention proposes a screen-space-based method for projection mapping of video in a three-dimensional scene, with the purpose of solving the occlusion-penetration problem caused by missing depth values in the projective texture mapping algorithm. By projecting the texture into the three-dimensional scene, the texture and the scene blend together better.




Detailed Description of the Embodiments

[0035] The present invention will be described in detail below in conjunction with the embodiments and accompanying drawings, but the present invention is not limited thereto.

[0036] The flow of the screen-space-based texture projection mapping algorithm in this embodiment is shown in Figure 1. It includes three steps: preprocessing the projection sources, drawing the 3D scene from the current viewpoint, and drawing the texture from each projection source.
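The three-step flow above can be sketched as a minimal, runnable outline. Note this is an illustrative skeleton, not the patent's implementation: the `preprocess_projector`, `render_viewpoint`, and `composite_projection` helpers are hypothetical stand-ins for a real renderer, and the buffers are dummy numpy arrays.

```python
import numpy as np

def preprocess_projector(projector):
    """Step 1: derive the projector's combined MVP matrix and render
    its depth map once (dummy depth map here)."""
    projector["mvp"] = projector["P"] @ projector["V"] @ projector["M"]
    projector["depth"] = np.full((4, 4), 1.0)
    return projector

def render_viewpoint(scene):
    """Step 2: draw the scene from the current viewpoint, keeping the
    diffuse color buffer and the depth buffer (dummy buffers here)."""
    return {"color": np.zeros((4, 4, 3)), "depth": np.full((4, 4), 1.0)}

def composite_projection(gbuffer, projector):
    """Step 3: blend the projector's video frame over the viewpoint
    image; the real version does a per-pixel screen-space depth test."""
    return gbuffer["color"]  # placeholder: color returned unchanged

proj = preprocess_projector({"M": np.eye(4), "V": np.eye(4), "P": np.eye(4)})
frame = composite_projection(render_viewpoint({}), proj)
```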

[0037] (1) Preprocessing

[0038] For each projection source in the scene, parameter information such as its position and orientation in the 3D scene must be obtained. From the projection source's internal and external parameters, its model matrix M, view matrix V, and projection matrix P can be computed, and the MVP matrix of the projection can be ...
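As an illustration of this step, M, V, and P can be assembled from the projector's extrinsics (position, orientation) and intrinsics (field of view, aspect ratio, near/far planes) using the standard look-at and perspective constructions. This is a sketch under those assumptions; the concrete parameter values are invented for the example and are not from the patent.

```python
import numpy as np

def look_at(eye, target, up):
    """View matrix V: world space -> projector camera space
    (right-handed, camera looks down -Z)."""
    f = target - eye
    f = f / np.linalg.norm(f)
    s = np.cross(f, up)
    s = s / np.linalg.norm(s)
    u = np.cross(s, f)
    V = np.eye(4)
    V[0, :3], V[1, :3], V[2, :3] = s, u, -f
    V[:3, 3] = -V[:3, :3] @ eye      # translate eye to the origin
    return V

def perspective(fov_y_deg, aspect, near, far):
    """Projection matrix P (OpenGL-style clip space)."""
    t = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    P = np.zeros((4, 4))
    P[0, 0] = t / aspect
    P[1, 1] = t
    P[2, 2] = (far + near) / (near - far)
    P[2, 3] = 2.0 * far * near / (near - far)
    P[3, 2] = -1.0
    return P

M = np.eye(4)  # projector model matrix (identity in this example)
V = look_at(np.array([0.0, 2.0, 5.0]), np.zeros(3), np.array([0.0, 1.0, 0.0]))
P = perspective(45.0, 16 / 9, 0.1, 100.0)
MVP = P @ V @ M
```

A world point inside the projector's frustum, multiplied by this MVP and divided by w, lands in the [-1, 1] normalized-device cube, which is what the projection calculation in step (3) relies on.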



Abstract

The invention discloses a screen-space-based method for projection mapping of video in a three-dimensional scene, comprising the steps of: preprocessing to obtain the internal and external parameters of a projection source and its depth map; drawing the three-dimensional scene from the current viewpoint to obtain the diffuse-reflection map and depth map of the current viewpoint; and drawing the texture from the projection source, converting each pixel of the projected picture into the screen space of the viewpoint for the projection calculation. The method solves the occlusion-penetration problem caused by missing depth values in traditional projective texture mapping; it not only significantly enhances the rendering effect but also reduces the number of rendering batches, improving rendering efficiency. The invention has a clear algorithm, robust results, and high operating efficiency, making it well suited to real-time rendering systems, and it can be readily combined with a large-scale three-dimensional video surveillance system.
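The per-pixel projection calculation with a depth comparison, which is what prevents occlusion penetration, can be sketched shadow-map style: project a world point into the projector's screen space and compare its depth against the projector's depth map. The `projector_uv_and_visible` helper below is an illustrative assumption, not code from the patent.

```python
import numpy as np

def projector_uv_and_visible(p_world, mvp, depth_map, bias=1e-3):
    """Return (projector UV, visible?) for a world-space point.

    A point is visible from the projector (and may receive the video
    texture) only if its projected depth is not greater than the depth
    stored in the projector's depth map; otherwise an occluder sits in
    front of it and the texture must not bleed through."""
    clip = mvp @ np.append(p_world, 1.0)
    if clip[3] <= 0:
        return None, False            # behind the projector
    ndc = clip[:3] / clip[3]
    if np.any(np.abs(ndc[:2]) > 1):
        return None, False            # outside the projector frustum
    u, v = (ndc[0] + 1) / 2, (ndc[1] + 1) / 2
    h, w = depth_map.shape
    x, y = min(int(u * w), w - 1), min(int(v * h), h - 1)
    visible = ndc[2] <= depth_map[y, x] + bias
    return (u, v), visible

# Example with an identity MVP and a depth map storing 0.5 everywhere:
depth = np.full((4, 4), 0.5)
uv_front, vis_front = projector_uv_and_visible(
    np.array([0.0, 0.0, 0.2]), np.eye(4), depth)   # in front of occluder
uv_back, vis_back = projector_uv_and_visible(
    np.array([0.0, 0.0, 0.9]), np.eye(4), depth)   # behind occluder
```

The small `bias` term is the usual shadow-mapping guard against self-occlusion from depth quantization; without it, surfaces would flicker at their own stored depth.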

Description

Technical Field

[0001] The invention relates to texture mapping and projection in computer graphics, and in particular to a screen-space-based method for projection mapping and fusion of video in a three-dimensional scene.

Background

[0002] In recent years, the common approach to 3D video surveillance has been to collect video data from surveillance cameras at various locations in a city and display them in a 3D scene in the form of annotations. When monitoring personnel want to view a particular location, they simply open the surveillance camera for the corresponding area to observe the current real-time video. However, this approach merely juxtaposes the two-dimensional surveillance video with the three-dimensional scene, i.e., the video is watched inside the 3D scene. Compared with a video wall, it adds only the three-dimensional position information of the camera and does not fully di...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T15/04
CPC: G06T15/04
Inventors: 郑文庭 (Zheng Wenting), 李融 (Li Rong), 鲍虎军 (Bao Hujun)
Owner: ZHEJIANG UNIV