
Object illumination moving method based on gradient operation

An illumination migration technique based on gradient operations, applied in the fields of virtual reality and computer vision, which addresses the problem of geometric differences between the target face and the reference face and achieves realistic results.

Publication Date: 2012-02-22 (Inactive)
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

The limitations of this method are: 1. it relies on the assumption that the target face and the reference face have similar complexity; 2. the illumination migration result is seriously affected by geometric differences between the target face and the reference face.

Method used



Examples


Embodiment Construction

[0022] The present invention will be described in detail below in conjunction with the accompanying drawings.

[0023] Referring to figure 1, the main flow chart of the present invention, the object illumination migration method based on gradient operations of the present invention comprises the following basic processes: first, an active contour model face positioning tool and an image deformation method are used to align the reference object (that is, the reference object region in the input image) to the target object (that is, the target object region in the input image); then both the reference object and the target object are decomposed into a luma layer and a chroma layer, and a least squares filter is used to divide each luma layer into a large-scale layer and a detail layer, with all subsequent operations carried out on the large-scale layer; the present invention first converts the large-scale layers of the reference object and the target object ...
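The layering step above can be pictured with a short Python sketch. This is not the patent's implementation and the helper names are mine: it splits an image into luma and chroma via a YCrCb conversion and separates the luma into large-scale and detail layers with a standard weighted least squares (WLS) edge-preserving filter; the parameters lam and alpha are fixed illustrative defaults, whereas the patent computes the filtering parameter adaptively from image content.

```python
import cv2
import numpy as np
from scipy.sparse import spdiags, diags
from scipy.sparse.linalg import spsolve


def wls_filter(luma, lam=1.0, alpha=1.2, eps=1e-4):
    """Edge-preserving weighted least squares smoothing of a luminance image
    in [0, 1]; returns the large-scale (base) layer."""
    r, c = luma.shape
    k = r * c
    log_l = np.log(luma + eps)

    # Spatially varying smoothness weights from log-luminance gradients
    dy = np.diff(log_l, axis=0)
    dy = -lam / (np.abs(dy) ** alpha + eps)
    dy = np.vstack([dy, np.zeros((1, c))]).flatten(order="F")

    dx = np.diff(log_l, axis=1)
    dx = -lam / (np.abs(dx) ** alpha + eps)
    dx = np.hstack([dx, np.zeros((r, 1))]).flatten(order="F")

    # Five-point sparse system: (identity + smoothness Laplacian) * base = luma
    A = spdiags(np.vstack([dx, dy]), [-r, -1], k, k)
    west = np.concatenate([np.zeros(r), dx])[:k]
    north = np.concatenate([np.zeros(1), dy])[:k]
    A = A + A.T + diags(1.0 - (dx + west + dy + north))

    base = spsolve(A.tocsr(), luma.flatten(order="F"))
    return base.reshape((r, c), order="F")


def decompose(img_bgr):
    """Split an 8-bit BGR image into luma/chroma, then split the luma into
    large-scale and detail layers (detail = luma - large_scale)."""
    ycrcb = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2YCrCb).astype(np.float64) / 255.0
    luma, chroma = ycrcb[:, :, 0], ycrcb[:, :, 1:]
    large_scale = wls_filter(luma)
    return large_scale, luma - large_scale, chroma
```

The detail layer is kept aside and only the large-scale layer enters the later gradient operations, matching the flow described above.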



Abstract

The invention relates to an object illumination moving method based on gradient operation, and provides a new technical scheme for generating virtual scene illumination effects from stock footage. The invention is characterized in that the overall process of the object illumination moving method based on gradient operation comprises the following steps: generating an illumination transfer result through image alignment, image layering, weighted least squares filtering, gradient operation, Poisson integration and image combination; improving the least squares filter by adaptively computing the filtering parameter according to the non-smooth content of the image, so that stronger filtering is applied in non-smooth areas and more detail information is preserved in the detail layer; and adding a gray-level constraint to the gradient operation, so that the overall gray level of the illumination transfer result is closer to that of the reference object. The method provided by the invention can be widely applied in fields such as interactive digital entertainment, film and television production, and artistic design and creation.
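The gradient operation, gray-level constraint and Poisson integration steps can be illustrated as a screened-Poisson reconstruction. The sketch below is a simplified reading rather than the patent's formulation: the gradient blend weight `blend` and the gray-level weight `mu` are hypothetical, and the guidance field is simply a blend of reference and target large-scale gradients; it only shows how a soft gray-level term folds into the Poisson solve so that the overall intensity of the result tracks the reference.

```python
import numpy as np
from scipy.sparse import diags, eye, identity, kron
from scipy.sparse.linalg import spsolve


def laplacian_2d(h, w):
    """Sparse 2-D Laplacian with reflecting boundaries, column-major ordering."""
    def lap1d(n):
        main = -2.0 * np.ones(n)
        main[0] = main[-1] = -1.0
        off = np.ones(n - 1)
        return diags([off, main, off], [-1, 0, 1])
    return kron(eye(w), lap1d(h)) + kron(lap1d(w), eye(h))


def divergence(gx, gy):
    """Backward-difference divergence of a forward-difference gradient field."""
    div = gx.copy()
    div[:, 1:] -= gx[:, :-1]
    div += gy
    div[1:, :] -= gy[:-1, :]
    return div


def transfer_large_scale(ref_large, tgt_large, blend=0.8, mu=0.05):
    """Hypothetical gradient operation: blend reference and target large-scale
    gradients (the blend weight is illustrative, not from the patent), then
    reconstruct by a screened Poisson solve whose zero-order term (mu) keeps
    the overall gray level close to the reference large-scale layer."""
    h, w = ref_large.shape

    def grad(img):
        gx = np.zeros_like(img)
        gy = np.zeros_like(img)
        gx[:, :-1] = np.diff(img, axis=1)
        gy[:-1, :] = np.diff(img, axis=0)
        return gx, gy

    rx, ry = grad(ref_large)
    tx, ty = grad(tgt_large)
    gx = blend * rx + (1.0 - blend) * tx
    gy = blend * ry + (1.0 - blend) * ty

    # Euler-Lagrange of  |grad(u) - G|^2 + mu * |u - ref|^2 :
    #   (Laplacian - mu*I) u = div(G) - mu * ref
    A = laplacian_2d(h, w) - mu * identity(h * w)
    b = divergence(gx, gy) - mu * ref_large
    u = spsolve(A.tocsr(), b.flatten(order="F"))
    return u.reshape((h, w), order="F")
```

The reconstructed large-scale layer would then be recombined with the target's detail and chroma layers in the final image combination step.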

Description

Technical field

[0001] The invention relates to the fields of virtual reality and computer vision, and in particular to an object illumination migration method based on gradient operations.

Background technique

[0002] Virtual-real fusion scene generation based on video material is an important part of virtual reality and a research hotspot at the intersection of virtual reality, augmented reality, computer vision and related research directions. Because the video scene and the scene objects that make up a virtual scene often come from different video materials, the lighting effects of the scene objects and the video scene may differ considerably. A video virtual scene, however, requires every scene object to have a consistent lighting effect, and current illumination fusion methods for video material have difficulty meeting this need. The illumination migration problem for video scene objects is how to transfer the lighting effect of th...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T15/50
Inventor: 陈小武, 赵沁平, 金鑫, 陈萌萌, 吴洪宇
Owner: BEIHANG UNIV