Color consistency adjusting method for real-time video fusion

A real-time video color consistency adjustment technology, applied in the field of augmented virtual environments, which solves problems such as poor robustness and inconsistent textures across stitched models

Active Publication Date: 2019-12-10
北京大视景科技有限公司

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to overcome the poor robustness of the prior art and to provide a color consistency adjustment method for real-time video fusion, which solves the problem of texture inconsistency in the mosaic model of an AVE scene and improves the visual experience of a user observing the scene.


Detailed Description of the Embodiments

[0082] The present invention is described in further detail below in conjunction with the accompanying drawings. Before introducing the specific implementation of the present invention, some basic concepts are first explained:

[0083] (1) Virtual-real fusion: the fused display of virtual 3D models with real pictures or videos;

[0084] (2) Image-based modeling: images are collected in the real scene, and the scene is modeled from a single image;

[0085] (3) Three-dimensional overlapping area: a model obtained through image-based modeling is called a video model; each model is composed of multiple mesh patches, and the mesh patches shared by adjacent models form the three-dimensional overlapping area (see the sketch after this list);

[0086] (4) Channel: the representation form of a complete image; each pixel is described by three RGB values, so the image corresponds to the three channels R, G, and B;

[0087] (5) Histogram: used to count the proportion of each intensity in the image area,...
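
The three-dimensional overlapping area of definition (3) can be found by testing which mesh patches are visible to both of two adjacent cameras. The following Python sketch illustrates one way to do this under stated assumptions; it is not the patented algorithm itself, and the inputs (`patches` as an array of triangles, `vp_a`/`vp_b` as 4x4 view-projection matrices) and the helper names are hypothetical. A patch is treated as overlapping when its centroid projects inside both cameras' view frustums, and `to_pixels` maps a projected centroid into image coordinates to approximate the two-dimensional projection region.

```python
import numpy as np

def project(view_proj, point):
    """Project a 3D point with a 4x4 view-projection matrix; return NDC coords."""
    p = view_proj @ np.append(point, 1.0)
    if p[3] <= 0:            # point is behind the camera
        return None
    return p[:3] / p[3]      # normalized device coordinates

def in_frustum(ndc):
    """A point lies inside the view frustum if its NDC coords are within [-1, 1]."""
    return ndc is not None and np.all(np.abs(ndc) <= 1.0)

def overlapping_patches(patches, vp_a, vp_b):
    """Return indices of mesh patches whose centroid is seen by both cameras.

    `patches` is an (N, 3, 3) array of triangle vertices; `vp_a` and `vp_b`
    are the 4x4 view-projection matrices of two adjacent cameras.
    """
    overlap = []
    for i, tri in enumerate(patches):
        centroid = tri.mean(axis=0)
        if in_frustum(project(vp_a, centroid)) and in_frustum(project(vp_b, centroid)):
            overlap.append(i)
    return overlap

def to_pixels(ndc, width, height):
    """Map NDC x/y to pixel coordinates, approximating the 2D projection region."""
    x = (ndc[0] * 0.5 + 0.5) * width
    y = (1.0 - (ndc[1] * 0.5 + 0.5)) * height
    return x, y
```

Projecting the centroids of the overlapping patches with `to_pixels` gives a rough image-space footprint of the overlap, which is where the color statistics of the two cameras can be compared.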

Abstract

The invention relates to a color consistency adjustment method for real-time video fusion. The method comprises the steps of: (1) for each pair of adjacent cameras, carrying out geometric intersection calculation on the three-dimensional model using the cameras' view frustums, solving for the common three-dimensional overlapping patches, and then calculating the two-dimensional projection region of those patches in each corresponding image; (2) for all cameras, constructing a camera topology according to the proportional relationship of the projection areas, and performing optimization adjustment on all camera pictures in the topology using a chain adjustment strategy based on color histograms, so as to obtain an overall visual effect with consistent colors; and (3) since limited computing resources mean that color consistency cannot be applied quickly to all camera pictures, providing real-time color consistency optimization scheduling on the basis of step (2). The method can provide color-consistent textures for virtual-real fusion based on video projection, generate high-quality textured meshes without visible splicing seams, and can also be used to improve image/video stitching.
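
Step (2) of the method above adjusts each camera picture so that the color statistics of its overlap region match those of an already-adjusted neighbour along the camera topology. The Python sketch below is a minimal, hedged illustration of that idea rather than the patent's exact procedure: it matches the cumulative per-channel histograms (CDFs) of two overlap regions to build a lookup table and applies the table to the whole frame. The frame arrays, the boolean overlap masks (assumed to come from the projection regions of step (1)), and the chain `order` (assumed to come from the camera topology) are all hypothetical inputs.

```python
import numpy as np

def match_channel(src, ref, src_mask, ref_mask):
    """Build a 256-entry LUT mapping the histogram of `src` inside `src_mask`
    onto the histogram of `ref` inside `ref_mask` (uint8 intensities)."""
    src_hist, _ = np.histogram(src[src_mask], bins=256, range=(0, 256))
    ref_hist, _ = np.histogram(ref[ref_mask], bins=256, range=(0, 256))
    src_cdf = np.cumsum(src_hist) / max(src_hist.sum(), 1)
    ref_cdf = np.cumsum(ref_hist) / max(ref_hist.sum(), 1)
    # For each source intensity, find the reference intensity with the same CDF value.
    lut = np.interp(src_cdf, ref_cdf, np.arange(256))
    return lut.astype(np.uint8)

def adjust_frame(frame, ref_frame, frame_mask, ref_mask):
    """Match the overlap-region statistics per channel, then remap the whole frame."""
    out = np.empty_like(frame)
    for c in range(3):                      # R, G, B channels of uint8 images
        lut = match_channel(frame[..., c], ref_frame[..., c], frame_mask, ref_mask)
        out[..., c] = lut[frame[..., c]]
    return out

def chain_adjust(frames, masks, order):
    """Adjust camera frames along a chain: camera order[0] is the reference, and
    each later camera is matched to its already-adjusted predecessor.
    `masks[i][j]` is the boolean overlap mask of camera i's image with camera j."""
    adjusted = {order[0]: frames[order[0]]}
    for prev, cur in zip(order, order[1:]):
        adjusted[cur] = adjust_frame(frames[cur], adjusted[prev],
                                     masks[cur][prev], masks[prev][cur])
    return adjusted
```

The real method additionally schedules which camera pictures are re-optimized at each time step, since step (3) notes that limited computing resources prevent adjusting all pictures at once.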

Description

Technical Field

[0001] The invention relates to the technical field of augmented virtual environments, and more specifically to a color consistency adjustment method for real-time video fusion.

Background Art

[0002] Augmented Virtual Environment (AVE) is a technology for displaying multiple pictures or video streams, viewed from arbitrary angles, within a 3D virtual environment, and it has important applications in integrated monitoring systems. Fusion methods generally fall into two categories: virtual-real fusion methods based on video projection and virtual-real fusion methods based on single-image modeling. The former requires a highly accurate 3D model, makes it difficult to align the virtual and the real, and often produces distorted textures. The fusion method based on image modeling has clearer advantages in efficiency and effect. In real surveillance scenarios, different cameras, different parameters of the same camera, or changes in lighting will cause...

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/50; G06T7/44; G06T7/90; G06T19/00; G06T19/20
CPC: G06T5/50; G06T7/90; G06T7/44; G06T19/006; G06T19/20; G06T2207/10016; G06T2207/20221; Y02T10/40
Inventor: 周颐, 孟明, 游景帝, 周忠
Owner: 北京大视景科技有限公司