
Outdoor natural scene illumination estimation method and device

A technology for illumination estimation in natural scenes, applied in the field of image processing. It addresses the limited expressiveness of parameter-based illumination reconstruction, which cannot effectively represent the intensity of light arriving from all directions, cannot meet a deep neural network's need for diverse data, and cannot estimate the illumination at different positions in a picture. The effects are improved estimation quality, an expanded scope of application, and reduced data-collection difficulty and time.

Active Publication Date: 2021-10-29
PEKING UNIV

AI Technical Summary

Problems solved by technology

[0022] The present invention addresses limitations of the prior art: parameter-based illumination reconstruction has limited expressiveness, cannot effectively represent the intensity of light arriving from all directions, cannot guarantee the deep neural network's need for diverse data, and cannot estimate the illumination corresponding to different positions in a picture. The invention provides a method and device for estimating the illumination of outdoor natural scenes.


Examples


Specific Embodiment 1

[0044] Specific Embodiment 1. This embodiment is described with reference to Figure 1 and Figure 2. It proposes an outdoor local illumination estimation technique (Outdoor Local Illumination Estimation): given a single low-dynamic-range image of an outdoor scene and a pixel position in that image, the method estimates the high-dynamic-range (HDR) lighting information at the corresponding point in the 3D scene. This embodiment generates a large amount of data from a large-scale 3D virtual city scene and combines it with a specially designed deep neural network structure to realize outdoor local illumination estimation. Ordinarily, capturing complete lighting information requires special shooting equipment (such as ND filters or cameras with fisheye lenses) and special shooting settings; this me...
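The embodiment above hinges on recovering HDR lighting from an LDR input. A minimal numpy sketch (not from the patent; the tone-mapping function and radiance values are illustrative assumptions) shows why this is non-trivial: tone mapping to LDR clips bright sources such as the sun, so their true intensity cannot be read back from the image directly.

```python
import numpy as np

def tonemap_ldr(hdr, exposure=1.0, gamma=2.2):
    """Simple global tone mapping: scale, gamma-encode, clip to [0, 1].

    This is a generic textbook operator, used here only to illustrate
    how HDR radiance is lost in an LDR photograph.
    """
    return np.clip((hdr * exposure) ** (1.0 / gamma), 0.0, 1.0)

# Hypothetical scene radiances: dark sky, bright sky, cloud, the sun.
hdr = np.array([0.05, 0.5, 50.0, 5000.0])
ldr = tonemap_ldr(hdr)

# The two brightest radiances both saturate to 1.0 in the LDR image,
# so an estimator must infer (not read) the sun's true intensity.
print(ldr)
```

Because the saturated pixels are indistinguishable in LDR, a learned estimator with strong priors (here, trained on synthetic city data) is needed to recover the full HDR environment map.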

Specific Embodiment 2

[0068] Specific Embodiment 2. This embodiment is described with reference to Figure 1 and Figure 2. It is a device implementing the outdoor natural scene illumination estimation method of Specific Embodiment 1. The device includes a synthetic data acquisition module and a neural network module; the neural network module includes an intrinsic image decomposition module (intrinsic image decomposition network, I-Net), an image-space to panoramic-space mapping module, and a panoramic illumination information completion module (panoramic illumination completion network, P-Net).
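The three neural stages named above can be sketched as a data-flow pipeline. This is a hypothetical illustration, not the patent's actual network definitions: the function bodies are crude numerical stand-ins, and the tensor shapes and the intrinsic relation image = albedo * shading are assumptions chosen to show how the stages connect.

```python
import numpy as np

def i_net(image):
    """Stand-in for I-Net (intrinsic image decomposition): split an
    image into albedo and shading so that image = albedo * shading."""
    shading = image.mean(axis=-1, keepdims=True)       # crude proxy, not a real network
    albedo = image / np.maximum(shading, 1e-6)
    return albedo, shading

def image_to_panorama(shading, h=64, w=128):
    """Stand-in for the image-space to panoramic-space mapping:
    project per-pixel lighting cues onto an equirectangular grid.
    Only part of the sphere is observed from one photograph."""
    pano = np.zeros((h, w))
    pano[: h // 2, :] = shading.mean()                 # observed upper hemisphere only
    return pano

def p_net(partial_pano):
    """Stand-in for P-Net (panoramic illumination completion):
    fill in the unobserved directions of the environment map."""
    filled = partial_pano.copy()
    filled[filled == 0] = partial_pano[partial_pano > 0].mean()
    return filled

rng = np.random.default_rng(0)
image = rng.random((32, 32, 3))                        # input LDR crop
albedo, shading = i_net(image)
pano = p_net(image_to_panorama(shading))               # completed HDR panorama proxy
```

The point of the sketch is the staging: decomposition isolates lighting cues, the mapping lifts them into panoramic space, and completion fills the directions the camera never saw.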

[0069] The synthetic data acquisition module selects different camera shooting angles for scene capture, and obtains the ground truth of the intrinsic attributes by extracting the rendering buffers during the rendering process; at the same time, through local illumination collection and data filtering, it obtains, relative to collected real scene images, the rendered i...
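The idea of extracting rendering buffers for supervision can be sketched as follows. This is a hypothetical illustration: the pretend renderer, the pass names ("beauty", "albedo", "shading"), and the filtering threshold are assumptions, though the pattern of reading per-pass auxiliary buffers (AOVs) as free ground truth is standard practice with synthetic data.

```python
import numpy as np

def render_with_aovs(scene_seed, h=16, w=16):
    """Pretend renderer returning the final image plus auxiliary
    buffers that serve as intrinsic-attribute ground truth. A real
    renderer would expose these as separate render passes."""
    rng = np.random.default_rng(scene_seed)
    albedo = rng.uniform(0.1, 0.9, (h, w, 3))
    shading = rng.uniform(0.2, 2.0, (h, w, 1))
    return {
        "beauty": albedo * shading,   # rendered image the network sees
        "albedo": albedo,             # free supervision for decomposition
        "shading": shading,           # free supervision for decomposition
    }

def keep_sample(buffers, min_mean_luma=0.05):
    """Toy data filter: drop nearly black renders, e.g. from camera
    angles that face into unlit geometry."""
    return buffers["beauty"].mean() >= min_mean_luma

# Sweep camera/scene configurations, keeping only usable samples.
dataset = [b for b in (render_with_aovs(s) for s in range(8)) if keep_sample(b)]
```

This is what makes synthetic data attractive for the task: labels that would require special equipment to capture in the real world fall out of the renderer for free, and filtering keeps only plausible samples.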



Abstract

The invention provides an outdoor natural scene illumination estimation method and device, relating to the technical field of image processing. The method addresses shortcomings of the prior art: parameter-based illumination reconstruction has limited expressiveness, cannot effectively represent the intensity of light arriving from all directions, cannot meet a deep neural network's need for diverse data, and cannot estimate the illumination corresponding to different positions in a picture. The device comprises a synthetic data acquisition module, an intrinsic image decomposition module, an image-space to panoramic-space mapping module, and a panoramic illumination information completion module. By generating high-quality urban scene illumination data in batches, outdoor illumination estimation is extended from global-only estimation to both global and local illumination estimation. Collecting and screening synthetic data greatly reduces data acquisition difficulty and time and expands the algorithm's range of application. In addition, a deep learning method is adopted, effectively improving the quality of illumination estimation.

Description

Technical field

[0001] The present invention relates to the technical field of image processing, in particular to a method and device for estimating the illumination of an outdoor natural scene.

Background technique

[0002] With the development of computer technology, computing power has gradually increased, machine learning and deep learning are advancing rapidly, and computer vision technologies are being applied to ever more scenarios: face detection, photo retouching, and night photography in mobile phone cameras; pedestrian detection and road recognition in autonomous driving; face recognition for mobile payment and station identity checks; and simultaneous localization and mapping tasks for robots. With the advent of the era of big data and intelligence, more and more application scenarios need the support of computer vision technology. Massive video and image data urgently need to be processed, which...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): H04N5/232, H04N5/235, H04N13/122, G06N3/04, G06N3/08, G06T5/50, G06T7/00
CPC: H04N13/122, G06T7/0002, G06T5/50, G06N3/084, G06T2207/10004, G06T2207/20208, H04N23/64, H04N23/80, H04N23/698, H04N23/71, G06N3/045
Inventors: 施柏鑫, 李思, 朱勇杰
Owner: PEKING UNIV