
Depth calculating method under haze environment

A depth calculation technology for haze environments, applied in the field of computer vision, which addresses problems such as unsatisfactory capture of distant scenery, depth calculation errors, and the inability to recognize the outlines of distant scenery.

Publication Status: Inactive; Publication Date: 2017-08-11
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

However, these methods have large depth calculation errors in outdoor haze scenes.
In addition, for a single image of a haze scene, many methods start from the atmospheric transmission rate: they estimate the transmission rate and then estimate the depth map from the atmospheric scattering model. For example, the method of He et al. (K. He, J. Sun, and X. Tang. Single image haze removal using dark channel prior. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(12):2341–2353, 2010) can recover depth information in this way, but it retains too much texture detail, which introduces large errors into the depth map. Berman et al. (D. Berman, T. Treibitz, and S. Avidan. Non-local image dehazing. In The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), June 2016) and Chen et al. (C. Chen, M. N. Do, and J. Wang. Robust image and video dehazing with visual artifact suppression via gradient residual minimization. In European Conference on Computer Vision (ECCV), 2016) estimate the atmospheric transmission rate well and produce smoother depth maps, but their capture of distant scenery in the image is unsatisfactory, and the outlines of distant scenery cannot be recognized.
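For reference, these transmission-based methods rely on the standard atmospheric scattering model from the dehazing literature cited above: the observed image satisfies I(x) = J(x)·t(x) + A·(1 − t(x)), where J(x) is the haze-free scene radiance, A is the global atmospheric light, and t(x) = e^(−β·d(x)) is the transmission for scattering coefficient β and scene depth d(x). Once t(x) is estimated, depth follows (up to the scale set by β) as d(x) = −ln t(x) / β, which is why errors in the estimated transmission translate directly into depth errors.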

Method used




Embodiment Construction

[0065] In order to reduce cost and complexity and to facilitate detection, the present invention adopts a depth calculation method for haze environments. The atmospheric transmission rate is initialized with the dark channel method and then optimized by bilateral filtering under a norm constraint; finally, the atmospheric scattering model is used to calculate the depth map. With the method of the invention, the user can directly capture images with a mobile phone or a single camera and obtain a depth map of the current scene. The invention has broad application prospects in fields such as three-dimensional reconstruction and unmanned driving.
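A minimal sketch of the initialization and smoothing described in this paragraph, assuming the standard dark channel prior of He et al. for the transmission estimate and using OpenCV's plain bilateral filter as a stand-in for the norm-constrained bilateral optimization (the exact constraint is not spelled out in this excerpt); function names, parameter values, and the input file name are illustrative only.

```python
import numpy as np
import cv2

def dark_channel(img, patch=15):
    """Per-pixel RGB minimum followed by a patch-wise minimum filter."""
    min_rgb = np.min(img, axis=2)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (patch, patch))
    return cv2.erode(min_rgb, kernel)

def estimate_airlight(img, dark, top_fraction=0.001):
    """Atmospheric light A: brightest pixels among the haziest dark-channel entries."""
    n = max(1, int(dark.size * top_fraction))
    idx = np.argsort(dark.ravel())[-n:]
    return img.reshape(-1, 3)[idx].max(axis=0)

def init_transmission(img, A, omega=0.95, patch=15):
    """Dark channel prior initialization: t(x) = 1 - omega * dark_channel(I / A)."""
    normalized = img / np.maximum(A, 1e-6)
    return 1.0 - omega * dark_channel(normalized, patch)

# Hazy input image as float in [0, 1] (file name is a placeholder).
img = cv2.imread("hazy.jpg").astype(np.float32) / 255.0
A = estimate_airlight(img, dark_channel(img))
t = init_transmission(img, A).astype(np.float32)
# Edge-preserving smoothing of t; the patent's norm-constrained bilateral
# optimization is approximated here by a plain bilateral filter.
t = cv2.bilateralFilter(t, d=9, sigmaColor=0.1, sigmaSpace=15)
t = np.clip(t, 0.05, 1.0)
```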

[0066] A depth calculation method under a haze environment of the present invention is characterized in that it comprises the following steps:

[0067] 1) Use a hand-held camera or an image acquisition device to take a color picture of the current scene;

[0068] 2) Ut...



Abstract

The invention belongs to the field of computer vision and optimization methods, and provides a depth calculating method for a haze environment. The method can calculate the depth map of a scene from a single image more accurately, and can even recover depth information for distant parts of the scene. To achieve this object, the adopted technical scheme comprises the steps of: 1) capturing a color image of the current scene with a hand-held camera or an image acquisition device; 2) using the color image collected in step 1) to estimate the PM2.5 value of the current scene; 3) according to the color image collected in step 1), calculating the atmospheric transmission rate t(x) of the current scene; and 4) based on the PM2.5 value obtained in step 2) and the atmospheric transmission rate t(x) calculated in step 3), finally obtaining the depth data of the current scene. The method is mainly applied in computer vision processing occasions.
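As an illustration of step 4), the sketch below recovers depth from the estimated transmission via the atmospheric scattering model, d(x) = −ln t(x) / β. The abstract says a PM2.5 value is estimated, but this excerpt does not disclose how it is converted into the scattering coefficient β, so beta_from_pm25 below is a hypothetical linear placeholder rather than the invention's actual formula.

```python
import numpy as np

def beta_from_pm25(pm25):
    # HYPOTHETICAL: the mapping from the estimated PM2.5 value to the
    # scattering coefficient beta is not given in this excerpt; a simple
    # linear proxy is used here purely for illustration.
    return 1e-3 * float(pm25)

def depth_from_transmission(t, pm25):
    # Atmospheric scattering model: t(x) = exp(-beta * d(x))
    # => d(x) = -ln(t(x)) / beta
    beta = beta_from_pm25(pm25)
    t = np.clip(t, 1e-3, 1.0)   # avoid log(0) at fully hazy pixels
    return -np.log(t) / beta

# t: transmission map from step 3); pm25: estimate from step 2)
# depth = depth_from_transmission(t, pm25)
```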

Description

Technical field
[0001] The invention belongs to the field of computer vision and optimization methods, and in particular relates to a depth calculation method in a haze environment.
Background technique
[0002] The development of science and technology has greatly improved people's quality of life, and the demand for 3D technology keeps growing, so more and more researchers are engaged in the calculation of depth maps. Applications such as skeleton restoration, 3D reconstruction of cultural relics, and distance judgment of distant scenery in unmanned driving all reflect the importance of calculating depth maps. However, in the field of computer vision, computing depth maps is still a very challenging topic. Among traditional methods, binocular cameras and dedicated depth sensors such as Microsoft's Kinect are convenient for computing depth maps, but the equipment is expensive and costly, which is not conducive to industrial use. For the depth estimation of a single image, mos...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/50
Inventor 李坤马健杨敬钰韩亚洪
Owner TIANJIN UNIV