
Depth estimation method combined with semantic edge

A depth estimation technology combined with semantic edges, applied in the field of computer vision. It addresses problems of prior methods such as unreliable losses, lack of theoretical support, and network complexity, achieving the effects of improved accuracy and implicit loss supervision.

Pending Publication Date: 2022-08-09
SHANGHAI INST OF MICROSYSTEM & INFORMATION TECH CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

A problem with the above methods is that no ground-truth additional information is available at inference time, so the required additional information can only be generated by the trained model; the reliability of ground-truth guidance is therefore lost during inference.
A problem with the above dynamic-weight method is that the weights are learned directly from features without theoretical support, and the added learning layers introduce more parameters, making the network more complex.
Moreover, both of the above methods rely heavily on the accuracy of the given labels and follow step-by-step pipelines, which may be suboptimal and inefficient.




Embodiment Construction

[0023] The present invention will be further described below in conjunction with specific embodiments. It should be understood that these examples are only used to illustrate the present invention and not to limit the scope of the present invention. In addition, it should be understood that after reading the teaching content of the present invention, those skilled in the art can make various changes or modifications to the present invention, and these equivalent forms also fall within the scope defined by the appended claims of the present application.

[0024] Embodiments of the present invention relate to a method for depth estimation combined with semantic edges, comprising the following steps: acquiring an image to be depth-estimated; inputting the image into a trained deep learning network to obtain a depth prediction map and a semantic edge prediction map;

[0025] As shown in Figure 1, the deep learning network includes: a shared feature extraction module, a depth es...



Abstract

The invention relates to a depth estimation method combined with semantic edges. The method comprises the following steps: acquiring an image whose depth is to be estimated; and inputting the image into a trained deep learning network to obtain a depth prediction map and a semantic edge prediction map. The deep learning network comprises a shared feature extraction module, a depth estimation module, an edge enhancement weight module, a depth edge semantic classification module, and a semantic edge detection module. The shared feature extraction module extracts feature information from the image and passes it to the depth estimation module and the semantic edge detection module. The depth estimation module guides disparity smoothing with the semantic edges output by the semantic edge detection module, and performs depth estimation via dual image reconstruction. The edge enhancement weight module derives, from the depth edges of the depth prediction map output by the depth estimation module, the feature result to be fused into the semantic edge detection module. The depth edge semantic classification module performs depth edge semantic classification prediction, and the semantic edge detection module outputs the semantic edge classification prediction of the image. The method can improve accuracy.
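The mutual guidance between the two branches described in the abstract can be sketched as a data-flow skeleton. This is a hypothetical illustration only: every function body below is a placeholder, and all names (`shared_feature_extraction`, `forward`, etc.) are invented for this sketch; only the wiring between modules follows the abstract's description, not the patented network itself.

```python
# Placeholder data-flow sketch of the five modules named in the abstract.
# Module internals are stubs (they just pass dicts around); the point is
# the order of information flow: shared features feed both branches,
# semantic edges guide depth smoothing, and depth edges feed weights
# back into the semantic-edge branch.

def shared_feature_extraction(image):
    # Extracts features shared by the depth and semantic-edge branches.
    return {"features": image}

def semantic_edge_detection(feats, edge_weights=None):
    # Outputs per-class semantic edge predictions; optionally fuses the
    # edge-enhancement weights derived from the depth branch.
    out = {"semantic_edges": feats["features"]}
    if edge_weights is not None:
        out["fused"] = edge_weights
    return out

def depth_estimation(feats, semantic_edges):
    # Predicts depth; semantic edges guide disparity smoothing during
    # training (edge-aware smoothness), per the abstract.
    return {"depth": feats["features"], "edge_guidance": semantic_edges}

def edge_enhancement_weight(depth_pred):
    # Turns depth edges of the predicted depth map into fusion weights
    # for the semantic-edge branch.
    return {"weights": depth_pred["depth"]}

def depth_edge_semantic_classification(depth_pred):
    # Classifies the semantics of depth edges.
    return {"depth_edge_classes": depth_pred["depth"]}

def forward(image):
    feats = shared_feature_extraction(image)
    edges = semantic_edge_detection(feats)            # initial edge pass
    depth = depth_estimation(feats, edges)            # edge-guided depth
    weights = edge_enhancement_weight(depth)          # depth -> edge weights
    edges = semantic_edge_detection(feats, weights)   # fused edge output
    classes = depth_edge_semantic_classification(depth)
    return depth, edges, classes
```

Note the bidirectional coupling: the semantic-edge branch informs depth smoothing, while the depth branch feeds edge-enhancement weights back, which the abstract credits for the accuracy improvement.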

Description

technical field

[0001] The present invention relates to the technical field of computer vision, and in particular to a depth estimation method combined with semantic edges.

Background technique

[0002] Depth estimation and semantic edge extraction are fundamental problems in computer vision, and their results can be deployed in practical applications such as autonomous driving, virtual reality, and robotics to help achieve better results. Depth estimation refers to parsing 3D perceptual information from an image. Semantic edge extraction combines image edge extraction and classification, simultaneously obtaining the semantic information of edges and the boundaries of objects. Currently, deep learning methods are used to handle both types of tasks.

[0003] Depth estimation is further divided into monocular depth estimation and multi-camera depth estimation. Monocular depth estimation has the advantages of fast p...
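The image-reconstruction style of depth estimation mentioned in the abstract rests on standard stereo geometry: depth is inversely proportional to disparity. As a minimal sketch of that relation (the focal length and baseline values below are illustrative assumptions, not taken from the patent):

```python
# Stereo geometry underlying reconstruction-based depth training:
# depth = focal_length * baseline / disparity.
# focal_px and baseline_m defaults are illustrative, not from the patent.

def disparity_to_depth(disparity_px, focal_px=720.0, baseline_m=0.54):
    """Convert a disparity (pixels) to metric depth; zero or negative
    disparity corresponds to a point at infinity / invalid match."""
    if disparity_px <= 0:
        return None
    return focal_px * baseline_m / disparity_px
```

Because depth and disparity are inversely related, a smoothness loss applied to disparity (the "parallax smoothing" the abstract says semantic edges guide) directly regularizes the predicted depth while the semantic edges exempt true object boundaries from being smoothed over.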

Claims


Application Information

IPC(8): G06T 7/50; G06T 7/13; G06V 10/26; G06V 10/44; G06V 10/764; G06V 10/80; G06V 10/82
CPC: G06T 7/50; G06T 7/13; G06V 10/26; G06V 10/764; G06V 10/44; G06V 10/806; G06V 10/82; G06T 2207/10028; Y02T 10/40
Inventors: 朱冬晨, 吴德明, 张广慧, 石文君, 李嘉茂, 王磊, 张晓林
Owner SHANGHAI INST OF MICROSYSTEM & INFORMATION TECH CHINESE ACAD OF SCI