
Method for integrating virtual light and real light of video scene

A method for fusing virtual and real lighting in video scenes. It addresses the discontinuity of light and shadow effects on virtual objects, the differences between per-frame illumination estimates, and the neglect of inter-frame correlation in existing approaches.

Active Publication Date: 2013-02-13
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

Furthermore, from the perspective of video processing, most existing techniques ignore the correlation between video frames. When illumination is estimated frame by frame, the single-frame estimates inevitably differ, producing discontinuous light and shadow effects on virtual objects in the video scene; that is, the consistency of virtual-real lighting and virtual-real shadows varies between adjacent frames of the video.




Embodiment Construction

[0070] In order to make the purpose, technical solution, and advantages of the present invention clearer, the method for fusing virtual and real lighting in a video scene according to the present invention is explained below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here only explain the present invention and do not limit it.

[0071] The present invention proposes a virtual-real lighting fusion method for video scenes based on inter-frame correlation. While preserving the correlation between video frames, it estimates the lighting parameters of video keyframes, corrects the lighting parameters of non-keyframes using the keyframe parameters, generates realistic video-based virtual-real lighting effects, and thereby completes the virtual-real lighting fusion of the video scene.
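To make the keyframe-based structure concrete, the following Python sketch shows one way such a pipeline could look: lighting parameters are estimated only at equally spaced keyframes, and non-keyframe parameters are corrected toward the keyframe estimates. The helper names and the use of linear interpolation as the "illumination parameter filtering" step are illustrative assumptions, not the patent's actual algorithm; any temporal smoother could take the interpolation's place without changing the overall structure.

```python
import numpy as np

def extract_keyframes(num_frames, interval):
    """Pick keyframe indices at equal time intervals, as the method describes."""
    return list(range(0, num_frames, interval))

def propagate_lighting(key_indices, key_params, num_frames):
    """Illustrative stand-in for the illumination-parameter filtering:
    non-keyframe parameters are corrected toward the values estimated at
    the surrounding keyframes (here, by linear interpolation)."""
    key_indices = np.asarray(key_indices, dtype=float)
    key_params = np.asarray(key_params, dtype=float)  # shape: (num_keys, dim)
    frames = np.arange(num_frames, dtype=float)
    # Interpolate each lighting-parameter dimension independently over time.
    return np.stack(
        [np.interp(frames, key_indices, key_params[:, d])
         for d in range(key_params.shape[1])],
        axis=1,
    )

# Usage: 100 frames, keyframes every 10 frames, 2-D lighting parameters
# (e.g., sun azimuth and elevation estimated per keyframe).
keys = extract_keyframes(100, 10)
params_at_keys = np.random.rand(len(keys), 2)   # placeholder per-keyframe estimates
smooth_params = propagate_lighting(keys, params_at_keys, 100)
print(smooth_params.shape)  # (100, 2): one smoothed parameter vector per frame
```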

[0072] As shown in Figure 1, a virtual and re...



Abstract

The invention discloses a method for integrating the virtual and real lighting of a video scene. The method comprises the following steps: extracting video keyframes at equal time intervals; using the sky, the ground, and vertical surfaces in the keyframe images as cues, and computing a probability distribution map of the sun position from each cue; combining the sun-position probabilities obtained from the sky, ground, and vertical surfaces to infer a probability distribution map of the sun position in the keyframe scene, and generating a sparse radiance map of the video scene keyframes; and correcting the illumination estimates of the non-keyframes from the keyframe estimates by means of an illumination-parameter filtering algorithm, thereby integrating the virtual and real lighting of the video scene. The method effectively smooths the lighting effects of the generated virtual-real fusion video.
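As an illustration of the cue-combination step, the sketch below fuses per-cue sun-position probability maps (from the sky, ground, and vertical-surface cues) into a single distribution by elementwise multiplication and renormalization over a discretized azimuth-elevation grid. Treating the three cues as independent, and the particular grid resolution, are assumptions of this sketch; the abstract does not specify the exact combination rule.

```python
import numpy as np

def fuse_sun_position_cues(sky_prob, ground_prob, vertical_prob):
    """Combine per-cue sun-position probability maps into one distribution.
    Treating the sky, ground, and vertical-surface cues as independent
    (an assumption of this sketch), the maps are multiplied elementwise
    and renormalized over the discretized sun-position grid."""
    fused = sky_prob * ground_prob * vertical_prob
    total = fused.sum()
    return fused / total if total > 0 else fused

# Usage: discretize the sun position over azimuth x elevation (e.g., 64 x 16 bins).
rng = np.random.default_rng(0)
shape = (64, 16)
cues = [rng.random(shape) for _ in range(3)]  # placeholder per-cue maps
posterior = fuse_sun_position_cues(*cues)
azimuth_bin, elevation_bin = np.unravel_index(posterior.argmax(), shape)
print(azimuth_bin, elevation_bin)  # most probable sun-position bin
```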

Description

Technical Field

[0001] The invention relates to image processing and augmented reality technology, and in particular to a method for fusing virtual and real lighting in a video scene.

Background

[0002] Virtual reality is a research field that has developed continuously in recent years. Using high technology with computer science at its core, it generates a realistic virtual environment that closely resembles the real environment in sight, hearing, and touch, so that users gain an immersive experience of the environment. Traditional virtual reality technology mainly emphasizes virtual scene modeling and presentation, and seldom integrates the virtual environment directly into the objective real world, which has limited the development and application of virtual reality technology to a certain extent. Augmented reality is a further extension of virtual reality. With the help of necessa...

Claims


Application Information

IPC(8): G06T5/00
Inventors: 陈小武 (Chen Xiaowu), 赵沁平 (Zhao Qinping), 杨梦夏 (Yang Mengxia), 王珂 (Wang Ke), 金鑫 (Jin Xin)
Owner: BEIHANG UNIV