Optical-field-camera-based realization method for three-dimensional scene recording and broadcasting

A technology relating to three-dimensional scenes and a realization method, applied in image communication, selective content distribution, electrical components, etc.

Inactive Publication Date: 2016-07-20
SHENZHEN LIMBOWORKS TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] However, the data collected by existing video shooting methods are only planar images from one or more preset positions and angles; generally, a director or broadcast director controls switching to the appropriate angle.


Embodiment Construction

[0016] The technical solution of this patent will be further described in detail below in conjunction with specific embodiments.

[0017] Referring to Figures 1-2, a method for realizing 3D scene recording and broadcasting based on a light field camera comprises the following specific steps:

[0018] (1) Light field information is collected simultaneously by multiple light field cameras at multiple angles; the main control computer gathers this information and converts the light field information into a depth point cloud with RGB colors;
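For illustration only, a minimal Python sketch of step (1), assuming each light field camera yields a per-pixel depth map plus an RGB image and that the pinhole intrinsics fx, fy, cx, cy are known from calibration; the function name, array layout, and intrinsic values are hypothetical and not taken from the patent:

```python
# Hypothetical helper: back-project one camera's depth map and RGB image into
# an (N, 6) colored point cloud [x, y, z, r, g, b] using pinhole intrinsics.
import numpy as np

def depth_rgb_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    x = (u - cx) * depth / fx                        # pinhole back-projection
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    cols = rgb.reshape(-1, 3).astype(np.float64)
    valid = pts[:, 2] > 0                            # discard pixels with no depth
    return np.hstack([pts[valid], cols[valid]])

# Synthetic example: one 640x480 camera with arbitrary intrinsics.
depth = np.random.uniform(0.5, 5.0, (480, 640))
rgb = np.random.randint(0, 256, (480, 640, 3))
cloud = depth_rgb_to_point_cloud(depth, rgb, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)  # (307200, 6)
```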

[0019] (2) The main control computer synchronously fuses and reconstructs the data of the multiple light field cameras into dynamic 3D scene data; these data describe the scene in the form of a point cloud;
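A sketch of step (2) under the assumption that each camera's pose in a common world frame is known from calibration as a 4x4 camera-to-world matrix; the patent does not describe the fusion algorithm itself, so this simply transforms and concatenates the per-camera clouds for one time instant:

```python
# Hypothetical fusion for one time instant: map each camera's cloud into a
# shared world frame with its 4x4 camera-to-world matrix, then concatenate.
import numpy as np

def fuse_clouds(clouds, extrinsics):
    fused = []
    for cloud, T in zip(clouds, extrinsics):
        xyz_h = np.hstack([cloud[:, :3], np.ones((len(cloud), 1))])  # homogeneous
        world = (T @ xyz_h.T).T[:, :3]                               # world frame
        fused.append(np.hstack([world, cloud[:, 3:]]))               # keep RGB
    return np.vstack(fused)

# Example: two cameras, the second translated 1 m along x.
T0 = np.eye(4)
T1 = np.eye(4); T1[0, 3] = 1.0
scene = fuse_clouds([np.random.rand(100, 6), np.random.rand(100, 6)], [T0, T1])
print(scene.shape)  # (200, 6)
```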

[0020] (3) The main control computer performs mesh optimization on the point cloud data to obtain a 3D model and texture map information;
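The patent does not name a specific meshing method; as a stand-in only, the following sketch uses Poisson surface reconstruction from the open-source Open3D library to turn the fused colored point cloud into a triangle mesh:

```python
# Stand-in meshing step using Open3D's Poisson surface reconstruction; the
# depth=8 octree level is an arbitrary example parameter, not from the patent.
import numpy as np
import open3d as o3d

def mesh_from_cloud(cloud):
    """cloud: (N, 6) array of [x, y, z, r, g, b] with colors in 0..255."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(cloud[:, :3])
    pcd.colors = o3d.utility.Vector3dVector(cloud[:, 3:] / 255.0)
    pcd.estimate_normals()                 # Poisson reconstruction needs normals
    mesh, _densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=8)
    return mesh
```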

[0021] (4) The main control computer transmits the 3D model data to the user client;
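A sketch of step (4) using a simple length-prefixed framing over a TCP socket; the wire format (pickled vertex, face, and color arrays over a raw socket) is an illustrative assumption, not the patent's transport protocol:

```python
# Illustrative length-prefixed framing for sending one mesh frame to a client.
import pickle
import struct

def send_frame(sock, vertices, faces, colors):
    payload = pickle.dumps({"vertices": vertices, "faces": faces, "colors": colors})
    sock.sendall(struct.pack("!I", len(payload)) + payload)  # 4-byte length header

def _recv_exact(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed before full frame arrived")
        buf += chunk
    return buf

def recv_frame(sock):
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return pickle.loads(_recv_exact(sock, length))
```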

[0022] (5) The user client software constructs a virtual 3D scene based on these data and performs rendering.
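A sketch of step (5): the client places a virtual camera in the reconstructed scene and projects the received vertices to 2D pixel coordinates, which is what allows the viewpoint to be changed freely after recording. The camera pose and intrinsics below are assumed values, and lighting and rasterization are omitted:

```python
# Illustrative free-viewpoint projection: cam_pose is an assumed 4x4
# world-to-camera matrix chosen by the viewer, not a fixed physical camera.
import numpy as np

def render_vertices(vertices, cam_pose, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    xyz_h = np.hstack([vertices, np.ones((len(vertices), 1))])
    cam = (cam_pose @ xyz_h.T).T[:, :3]
    in_front = cam[:, 2] > 0                      # keep points in front of the camera
    u = fx * cam[in_front, 0] / cam[in_front, 2] + cx
    v = fy * cam[in_front, 1] / cam[in_front, 2] + cy
    return np.stack([u, v], axis=-1)              # 2D pixel coordinates per vertex
```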

Abstract

The invention discloses an optical-field-camera-based realization method for three-dimensional scene recording and broadcasting. A plurality of optical-field cameras at multiple angles collect optical field information simultaneously; a main control computer gathers the information and converts the optical field information into depth point clouds with RGB colors; the main control computer synchronously fuses the data of the multiple optical-field cameras to reconstruct dynamic 3D scene data, wherein the data describe the scene in point cloud form; the main control computer performs mesh optimization on the point cloud data to obtain a 3D model and texture map information; the main control computer transmits the 3D model data to a user client; and the user client software constructs a virtual 3D scene based on the data and renders it. According to the invention, the main control computer gathers the information and converts the optical field information into depth point cloud data with RGB colors to realize 3D scene reconstruction, and a virtual camera then renders the 3D scene again. Therefore, the viewing angle can be changed freely, and the rendering distance and illumination direction can be adjusted.

Description

Technical field

[0001] The invention relates to a method for realizing three-dimensional scene recording and broadcasting, in particular to a method for realizing three-dimensional scene recording and broadcasting based on a light field camera.

Background technique

[0002] With the development of video and sensor technology, digital video content is gradually upgrading from a single low-resolution format to ultra-high-definition, 3D, and other content formats. For video shooting within a certain space, such as sports events and indoor TV programs, multi-position, multi-angle collaborative shooting with several cameras is already a very common technique. With the rise of VR and 3D technology, shooting with 360-degree panoramic cameras and 3D cameras is also becoming more and more popular.

[0003] However, the data collected by existing video shooting methods are only planar images from one or more preset positions and angles, and generally the appropriate angle is switched to under the control of a director or broadcast director.

Application Information

Patent Type & Authority: Application (China)
IPC (8): H04N21/218, H04N21/44, H04N21/81, H04N13/02
CPC: H04N21/21805, H04N13/243, H04N21/44, H04N21/816
Inventor: 黄翔
Owner: SHENZHEN LIMBOWORKS TECH CO LTD