A method for constructing depth image super-resolution reconstruction network based on color image guidance

A super-resolution reconstruction and network-construction technology, applied in the field of color-image-guided super-resolution reconstruction networks. It addresses the problem that color images contain many texture regions that can negatively affect depth reconstruction, and achieves high-quality, high-resolution depth images.

Active Publication Date: 2021-10-29
HANGZHOU DIANZI UNIV
Cites: 5 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0004] At present, many traditional depth image super-resolution algorithms reconstruct the depth image under the guidance of a color image of the same scene. Methods for obtaining high-resolution, high-quality color images are relatively mature, but color images contain far more texture regions than depth images. Guiding depth image reconstruction with an ordinary convolutional neural network can therefore introduce negative effects, so a network with strong feature extraction capability must be built.




Embodiment Construction

[0032] The present invention is described further below with reference to Figure 1. The method comprises the following steps:

[0033] Step (1): Use an RGB-D camera to obtain a color image and a depth image of the same scene.

[0034] An RGB-D camera is used to acquire a low-resolution depth image I_depth with resolution M*N, together with a high-resolution color image I_color of the same viewpoint with resolution rM*rN, where r is the magnification factor and M and N are the height and width of the image, respectively. Bicubic upsampling is applied to I_depth to enlarge it to rM*rN, yielding an initial low-quality high-resolution depth image. The color image I_color is then converted to the YCbCr color space, and its Y-channel image is taken.
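The Y-channel extraction in step (1) can be sketched as follows. The patent only states that the color image is converted to YCbCr and the Y channel is taken; the standard full-range ITU-R BT.601 luma weights used here, and the function name `rgb_to_y`, are assumptions for illustration.

```python
import numpy as np

def rgb_to_y(rgb: np.ndarray) -> np.ndarray:
    """Extract the Y (luma) channel from an H x W x 3 RGB image.

    Uses full-range ITU-R BT.601 weights (an assumption; the patent
    does not specify which YCbCr variant is used).
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

# A pure-white pixel maps to full luma, since the weights sum to 1.0:
white = np.full((1, 1, 3), 255.0)
y = rgb_to_y(white)  # y[0, 0] == 255.0
```

The bicubic upsampling of I_depth would typically be done with an off-the-shelf resize routine before this conversion step.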

[0035] Step (2): Construct a dual-branch image feature extraction structure based on a convolutional neural network. In the image feature extraction stage, the two branches...
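The dual-branch idea of step (2) can be sketched in plain numpy: one branch processes the upsampled depth image and the other the color Y channel, each applying convolutions at more than one kernel size, with the resulting feature maps stacked for later fusion. The fixed 3x3 and 5x5 averaging kernels below stand in for learned weights, and all function names are hypothetical; the real network learns its multi-scale kernels.

```python
import numpy as np

def conv2d(img: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """'Same'-size 2D convolution with zero padding (single channel)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def dual_branch_features(depth_up: np.ndarray, y_channel: np.ndarray) -> np.ndarray:
    """Toy two-branch, multi-scale feature extractor.

    Each branch convolves its input with a 3x3 and a 5x5 kernel
    (fixed averaging kernels here; learned in the real network),
    and the feature maps from both branches are stacked for fusion.
    """
    k3 = np.full((3, 3), 1 / 9.0)
    k5 = np.full((5, 5), 1 / 25.0)
    depth_feats = np.stack([conv2d(depth_up, k3), conv2d(depth_up, k5)])
    color_feats = np.stack([conv2d(y_channel, k3), conv2d(y_channel, k5)])
    return np.concatenate([depth_feats, color_feats])  # shape (4, H, W)
```

For an 8x8 input pair this yields a (4, 8, 8) feature stack, two maps per branch.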



Abstract

The invention discloses a method for constructing a depth image super-resolution reconstruction network based on color image guidance. More and more applications require high-quality, high-resolution depth images. Traditional depth image super-resolution methods are less effective and slower than convolutional-neural-network-based methods, yet most super-resolution CNN frameworks simply stack single-size convolution kernels and cannot extract useful guiding information from the high-resolution color image of the same scene. The multi-scale convolution kernel color-image-guidance network of the present invention not only fully exploits the high-resolution color image information and extracts useful feature maps, but also enriches the diversity of depth image features, fusing this information to perform super-resolution reconstruction of the depth image. Furthermore, recursive learning and residual learning are used to reduce the burden on the convolutional neural network, control the number of network parameters, and improve the reconstruction quality of the depth image.
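The recursive- and residual-learning ideas mentioned in the abstract can be sketched in a few lines: recursive learning applies the same block repeatedly (sharing its weights, so the parameter count does not grow with depth), and residual learning adds the input back so the network only has to model the high-frequency correction. The function name and the toy block below are hypothetical, not the patent's actual architecture.

```python
import numpy as np

def recursive_residual(x: np.ndarray, block, depth: int = 3) -> np.ndarray:
    """Apply the SAME block `depth` times (recursive learning shares
    weights across iterations), then add the input back (residual
    learning). `block` stands in for a learned convolutional unit.
    """
    h = x
    for _ in range(depth):
        h = block(h)  # weight sharing: identical block every pass
    return x + h      # global residual connection

# Toy check with a linear "block" that halves its input:
out = recursive_residual(np.ones((2, 2)), lambda h: 0.5 * h, depth=3)
# out == 1 + 0.5**3 == 1.125 everywhere
```

The residual connection is what lets the bicubic-upsampled depth map pass through unchanged while the recursive stack refines only the missing detail.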

Description

Technical Field

[0001] The invention belongs to the field of computer vision, and in particular relates to a method for constructing a depth image super-resolution reconstruction network guided by a color image.

Background Technique

[0002] With the development of depth-sensing technologies such as lidar, time-of-flight (TOF) cameras, and 3D structured light, depth images have been widely used in mobile robots, human-computer interaction, human pose estimation, 3D scene reconstruction, and other applications. However, the depth images obtained from these techniques still cannot meet practical needs; in particular, their resolution is low. Therefore, reconstructing low-resolution depth images into high-quality, high-resolution depth images has become a research hotspot in computer vision.

[0003] In recent years, with the development of deep learning, more and more super-resolution methods based on convolutional neural networks have bee...

Claims


Application Information

Patent Timeline
Patent Timeline: no application data
Patent Type & Authority: Patent (China)
IPC(8): G06T5/50
CPC: G06T5/50; G06T2207/10024; G06T2207/10028; G06T2207/20084; G06T2207/20221
Inventor: 杨宇翔, 曹旗, 高明煜, 何志伟, 吴占雄
Owner HANGZHOU DIANZI UNIV