Real-time depth completion method based on pseudo depth map guidance

A depth-map technology, applied in the field of real-time depth completion based on pseudo-depth-map guidance, that addresses the problems of the extra data annotation required by pre-trained models, increased computing resources, and increased per-frame running time, to achieve high robustness, improved accuracy, and well-structured depth boundaries.

Active Publication Date: 2021-05-28
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

However, these methods have several shortcomings: introducing pre-trained models requires additional data annotation, complex post-processing networks increase the running time per depth-map frame, and introducing 3D convolution increases the demand for computing resources.

Method used



Examples


Embodiment Construction

[0086] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0087] As shown in the flow chart of the present invention in Figure 1, the embodiment of the complete method and its implementation process are as follows:

[0088] Taking the KITTI Depth Completion dataset and the completion of its sparse depth maps as an example, the idea and specific implementation steps of depth completion guided by the pseudo-depth map are described.

[0089] Both the sparse depth maps and the ground-truth depth maps in this embodiment are from the KITTI Depth Completion dataset.

[0090] Step 1: Using the KITTI Depth Completion dataset split, the training set contains 138 sequences, and the validation set contains 1000 images extracted from 13 sequences. There is no intersection between the training set and the validation set...
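As an illustration of this step, a minimal sketch of pairing sparse depth maps with ground-truth depth maps under the public KITTI Depth Completion directory layout is given below. The directory names, the `list_pairs` helper, and the split folder names are assumptions taken from the public KITTI devkit conventions, not from the patent text.

```python
# Hedged sketch: enumerate (sparse depth, ground truth) pairs for a KITTI
# Depth Completion split. Paths follow the public devkit layout and are
# assumptions for illustration only.
from pathlib import Path


def list_pairs(root: str, split: str):
    """Return (sparse_depth_path, ground_truth_path) pairs for a split."""
    base = Path(root) / split  # e.g. "train" or "val"
    pairs = []
    for sparse in sorted(base.rglob("proj_depth/velodyne_raw/image_02/*.png")):
        gt = Path(str(sparse).replace("velodyne_raw", "groundtruth"))
        if gt.exists():
            pairs.append((sparse, gt))
    return pairs


train_pairs = list_pairs("kitti_depth", "train")  # 138 training sequences
val_pairs = list_pairs("kitti_depth", "val")      # validation images drawn from 13 sequences
print(len(train_pairs), len(val_pairs))
```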



Abstract

The invention discloses a real-time depth completion method based on pseudo depth map guidance. The method includes: collecting an RGB image with an RGB camera and obtaining a sparse depth map from laser radar (lidar) detection; applying morphological operations to the sparse depth map to generate a pseudo depth map; establishing a neural network structure that processes the pseudo depth map, the RGB image and the sparse depth map to obtain a dense depth map; training the neural network structure under the supervision of a total loss function to obtain the values of all its parameters; and loading these parameter values into the neural network structure, inputting a single-frame sparse depth map to be measured, and outputting a dense depth map. The method effectively improves the precision of depth completion, can correct erroneous pixels of the sparse depth map, provides structural-similarity supervision, regresses a depth residual image during prediction, and obtains high-precision depth completion results under real-time conditions.
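The abstract states that the pseudo depth map is produced by morphological operations on the sparse depth map, but does not spell out the operations. The following is a minimal sketch of one common dilate-and-close scheme; the kernel sizes, the 100 m depth cap, and the inversion trick are assumptions for illustration, not the patent's specified procedure.

```python
# Hedged sketch: densify a sparse lidar depth map into a "pseudo depth map"
# using simple morphological operations. Kernel sizes are illustrative.
import cv2
import numpy as np


def pseudo_depth(sparse: np.ndarray, max_depth: float = 100.0) -> np.ndarray:
    """sparse: HxW float32 depth in meters, 0 where there is no lidar return."""
    # Invert valid depths so dilation favors closer points (larger inverted value).
    depth = np.where(sparse > 0, max_depth - sparse, 0.0).astype(np.float32)
    # Dilate to spread sparse measurements into their neighborhood.
    depth = cv2.dilate(depth, np.ones((5, 5), np.uint8))
    # Morphological closing to fill small remaining holes.
    depth = cv2.morphologyEx(depth, cv2.MORPH_CLOSE, np.ones((7, 7), np.uint8))
    # A larger dilation pass to fill bigger empty regions.
    empty = depth < 0.1
    dilated = cv2.dilate(depth, np.ones((15, 15), np.uint8))
    depth[empty] = dilated[empty]
    # Undo the inversion to recover metric depth.
    return np.where(depth > 0.1, max_depth - depth, 0.0)
```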

Description

technical field
[0001] The present invention relates to a depth completion method based on laser radar (lidar) and an RGB camera in the technical field of visual perception for unmanned vehicles and robots, and in particular to a real-time depth completion method guided by a pseudo-depth map.
Background technique
[0002] Depth completion refers to the technology of completing the sparse depth map formed by projecting the point cloud collected by lidar into the RGB image space, so as to obtain a dense depth map with the same data density as the corresponding RGB image. It is a key technology for complex systems such as unmanned driving and autonomous robots to perceive the three-dimensional environment efficiently. Lidar is a common distance sensor, and the point cloud it collects provides accurate 3D information, but the depth map projected from the raw lidar point cloud is very sparse, covering only about 3% to 4% of a medium-resolution dense image. There is a depth value on the p...
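To make the projection step in [0002] concrete, here is a minimal sketch of projecting a lidar point cloud into the RGB image plane to form the sparse depth map that depth completion starts from. The intrinsics `K`, the lidar-to-camera extrinsics `T_cam_lidar`, and all variable names are assumed calibration inputs for illustration, not quantities defined in the patent.

```python
# Hedged sketch: project an Nx3 lidar point cloud into an HxW sparse depth map.
import numpy as np


def project_to_sparse_depth(points, K, T_cam_lidar, h, w):
    """points: Nx3 lidar points; returns HxW sparse depth map (0 = no return)."""
    pts_h = np.c_[points, np.ones(len(points))]      # homogeneous Nx4
    cam = (T_cam_lidar @ pts_h.T).T[:, :3]           # transform into camera frame
    cam = cam[cam[:, 2] > 0]                         # keep points in front of camera
    uv = (K @ cam.T).T                               # pinhole projection
    u = (uv[:, 0] / uv[:, 2]).astype(int)
    v = (uv[:, 1] / uv[:, 2]).astype(int)
    z = cam[:, 2]
    depth = np.zeros((h, w), dtype=np.float32)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    # Sort far-to-near so that, when several points hit the same pixel,
    # the nearest depth is written last and kept.
    order = np.argsort(-z[inside])
    depth[v[inside][order], u[inside][order]] = z[inside][order]
    return depth
```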

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/46; G06K9/62; G06T7/55; G06N3/04; G06N3/08; G06T5/00; G06T5/30
CPC: G06T7/55; G06N3/08; G06T5/002; G06T5/30; G06T2207/10028; G06T2207/10024; G06T2207/10044; G06T2207/20032; G06V20/56; G06V10/44; G06V10/56; G06N3/045; G06F18/22; G06F18/253
Inventor: 项志宇, 顾佳琦
Owner: ZHEJIANG UNIV