
Two-dimensional video and three-dimensional scene fusion method and device, equipment and storage medium

A 2D video and 3D scene fusion technology, applied in image data processing, instruments, and the like, which addresses problems such as large errors, large deviation between the video and the scene in surrounding areas, and video distortion.

Pending Publication Date: 2021-01-05
洛阳众智软件科技股份有限公司

AI Technical Summary

Problems solved by technology

[0003] Existing scene-and-video fusion technologies mostly paste the video directly onto models or project it vertically downward. Typically, only the central region of a video projected by these methods approximately matches the 3D scene; in the surrounding regions there is a large deviation between the video and the scene, which makes the projected video appear distorted.

Detailed Description of the Embodiments

[0046] In order to make the purpose, technical solution, and advantages of the present invention clearer, the technical solution of the present invention is described in detail below. The described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art on the basis of the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0047] Figure 1 is a flow chart of a method for fusing a two-dimensional video with a three-dimensional scene provided by an embodiment of the present invention. Referring to Figure 1, the method for fusing a two-dimensional video with a three-dimensional scene comprises:

[0048] Step 101: obtain the depth texture information and pixel values of the scene to be fused. Before performing video-scene fusion, the entire scene is rendered first, and a depth texture is rendered on the basis of the normal rend...
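Although the paragraph above is truncated in the source, the role of the depth texture can be illustrated: the later steps of the method reconstruct a world coordinate for each viewport pixel from the depth sampled at that pixel. The Python sketch below shows that unprojection under assumed conventions (OpenGL-style depth in [0, 1], 4x4 view and projection matrices as NumPy arrays); the function and parameter names are illustrative and not taken from the patent.

```python
import numpy as np

def unproject_pixel(px, py, depth, view, proj, viewport_w, viewport_h):
    """Reconstruct the world-space position of a single viewport pixel.

    px, py        -- viewport pixel coordinates
    depth         -- value sampled from the depth texture at (px, py), in [0, 1]
    view, proj    -- 4x4 view and projection matrices of the scene camera
    viewport_w/h  -- viewport size in pixels
    """
    # Viewport coordinates -> normalized device coordinates in [-1, 1].
    ndc = np.array([
        2.0 * (px + 0.5) / viewport_w - 1.0,
        2.0 * (py + 0.5) / viewport_h - 1.0,
        2.0 * depth - 1.0,   # OpenGL-style depth convention (assumed)
        1.0,
    ])
    # Undo the projection and view transforms, then perform the perspective divide.
    world = np.linalg.inv(proj @ view) @ ndc
    return world[:3] / world[3]
```

In an actual renderer the depth values would come from a depth attachment filled during (or alongside) the normal render pass and would typically be consumed in a shader; the NumPy form above only makes the coordinate arithmetic explicit.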


Abstract

The invention relates to a method, device, equipment, and storage medium for fusing a two-dimensional video with a three-dimensional scene. The method comprises the steps of: obtaining depth texture information and pixel values of a to-be-fused scene; obtaining world coordinates according to the depth texture information and the pixel viewport coordinates; obtaining to-be-fused pixel points by using a shadow volume algorithm according to the specified projection mode of the to-be-fused video, wherein the specified projection mode is generated according to user settings; calculating texture coordinates of the to-be-fused video corresponding to the to-be-fused pixel points according to the world coordinates; and fusing the to-be-fused video with the to-be-fused scene on the basis of the texture coordinates and the viewport coordinates, in combination with the pixel values. The method computes the coordinates of the three-dimensional scene from the depth texture and projects the video onto the scene by means of the shadow volume algorithm, thereby achieving seamless fusion of the video and the scene.
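To make the later steps of the abstract concrete (computing video texture coordinates from the world coordinates and blending with the scene pixel values), the sketch below is a minimal illustration rather than the patented implementation: it projects a world-space point through an assumed view-projection matrix of the video "projector", uses a simple frustum check in place of the shadow-volume coverage test, and blends with a fixed alpha. The function names, the blend factor, and the nearest-neighbour sampling are all assumptions.

```python
import numpy as np

def video_texcoords(world_pos, projector_view, projector_proj):
    """Project a world-space point into the video frame.

    Returns (u, v) in [0, 1] if the point lies inside the video
    projector's frustum, otherwise None (a stand-in for the
    shadow-volume test used to select the to-be-fused pixels).
    """
    clip = projector_proj @ projector_view @ np.append(world_pos, 1.0)
    if clip[3] <= 0.0:
        return None                          # point is behind the projector
    ndc = clip[:3] / clip[3]
    if np.any(np.abs(ndc) > 1.0):
        return None                          # outside the projection frustum
    # NDC in [-1, 1] -> texture coordinates in [0, 1].
    return 0.5 * (ndc[0] + 1.0), 0.5 * (ndc[1] + 1.0)

def fuse_pixel(scene_rgb, video_frame, uv, alpha=0.7):
    """Blend the sampled video color with the scene pixel."""
    if uv is None:
        return scene_rgb                     # pixel is not covered by the video
    h, w, _ = video_frame.shape
    u, v = uv
    video_rgb = video_frame[int(v * (h - 1)), int(u * (w - 1))]
    return alpha * video_rgb + (1.0 - alpha) * scene_rgb
```

A per-pixel loop over the viewport that first unprojects each pixel (as in the earlier sketch) and then calls these two helpers reproduces the overall flow described in the abstract, though a real implementation would perform these operations on the GPU.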

Description

Technical Field

[0001] The present invention relates to the technical field of video-scene fusion, and in particular to a method, device, equipment, and storage medium for fusing a two-dimensional video with a three-dimensional scene.

Background Technique

[0002] With the development of software technology, the display of 3D scenes is becoming more and more sophisticated and is applied in many fields. Displaying a 3D scene often involves fusing the scene with video, for example fusing a 3D scene with video in games, or fusing scenes with video in the surveillance field. The quality of the fusion between the 3D scene and the video directly affects the display effect and the user experience.

[0003] Existing scene-and-video fusion technologies mostly paste the video directly onto models or project it vertically downward. Typically, only the central region of a video projected by these methods approximately matches the 3D scene. The video in the ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T19/20
CPC: G06T19/20
Inventor: 丁伟
Owner: 洛阳众智软件科技股份有限公司