
A Contour Detection Method for Indoor Scenes by Fusion of Color and Depth Information

An indoor scene contour detection technology, applied in character and pattern recognition, image analysis, instruments, etc., which addresses problems such as errors in contour detection results and achieves robust and accurate scene contours.

Publication Date: 2020-05-19 (Inactive)
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

For each pixel a posterior boundary probability between 0 and 1 is obtained, and the watershed transform algorithm is then used to convert this probability map into closed regions. The problem with this approach is that weak boundaries lying near strong boundaries are influenced by them and are also misjudged as contours, which introduces errors into the final contour detection result.
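
A minimal sketch of the pipeline criticized above, assuming scikit-image and SciPy are available; the probability map, marker threshold, and image size are illustrative placeholders rather than values from the patent:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed, find_boundaries

# Hypothetical per-pixel boundary posterior in [0, 1]; in practice this would
# come from a learned boundary classifier, not smoothed random noise.
rng = np.random.default_rng(0)
prob = ndi.gaussian_filter(rng.random((128, 128)), sigma=3)

# Markers: connected regions where the boundary probability is clearly low.
markers, _ = ndi.label(prob < prob.mean())

# The watershed floods from the markers; ridges of the probability map become
# the borders of the resulting closed regions.
labels = watershed(prob, markers)

# Region borders are read off as the detected contours. Weak ridges sitting
# next to strong ones are promoted to borders just the same, which is the
# misjudgement described above.
contours = find_boundaries(labels, mode='thick')
```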



Embodiment Construction

[0050] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below may be combined with one another as long as they do not conflict.

[0051] The overall structure of the present invention is shown in Figure 1. The indoor scene contour detection method that fuses color and depth information comprises three main parts: color image contour extraction, depth image contour extraction, and gradient pyramid fusion. The method flow is as follows:

[0052] (1) Separate the color image into three channels, perform ...
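
As a hedged illustration of step (1), the sketch below splits a color image into its channels, computes a Sobel gradient magnitude on each, and fuses them with per-channel weights; it assumes OpenCV and NumPy, and the equal weights and 3x3 kernel are illustrative choices rather than the parameters used in the invention:

```python
import cv2
import numpy as np

def color_edge_map(bgr, weights=(1/3, 1/3, 1/3)):
    """Per-channel Sobel gradient magnitude, fused by (assumed) channel weights."""
    fused = np.zeros(bgr.shape[:2], np.float32)
    for w, channel in zip(weights, cv2.split(bgr.astype(np.float32))):
        gx = cv2.Sobel(channel, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(channel, cv2.CV_32F, 0, 1, ksize=3)
        fused += w * cv2.magnitude(gx, gy)
    return fused / (fused.max() + 1e-6)   # normalise to [0, 1]

# Usage (hypothetical file name):
# edges = color_edge_map(cv2.imread('indoor_scene.png'))
```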


Abstract

The invention discloses an indoor scene contour detection method fusing color and depth information, belonging to the technical field of machine vision. The method comprises three key techniques: (1) separating the color image into R (red), G (green), and B (blue) channels, extracting edge information on each channel with an image edge detection operator, and performing weighted fusion to obtain the overall color image edge detection result; (2) extracting edge information from the depth image, correcting false edges caused by the loss of the scene's three-dimensional information when only the color image is used for detection, and strengthening the confidence of true edge detection results; (3) constructing a gradient pyramid, performing multi-scale, multi-level fusion of the color image edges and depth image edges to obtain the edge detection result, and performing edge aggregation to obtain the final contour. The method fully exploits the gradient information of the depth image; the contour detection results of the depth image correct and reinforce those of the color image, thereby yielding a more robust and precise scene contour.
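
A minimal sketch of the multi-scale fusion idea in technique (3), assuming OpenCV; a Gaussian pyramid stands in for the patent's gradient pyramid, and the blending weight alpha, the number of levels, and the max-style accumulation are illustrative assumptions rather than the claimed procedure:

```python
import cv2
import numpy as np

def pyramid(edge_map, levels=3):
    """Successively downsampled copies of an edge/gradient map."""
    pyr = [edge_map.astype(np.float32)]
    for _ in range(levels - 1):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def fuse_edges(color_edges, depth_edges, levels=3, alpha=0.5):
    """Blend color and depth edges at each scale, then collapse to full resolution."""
    cp, dp = pyramid(color_edges, levels), pyramid(depth_edges, levels)
    fused = alpha * cp[-1] + (1 - alpha) * dp[-1]        # coarsest level first
    for lvl in range(levels - 2, -1, -1):                # work back up the pyramid
        fused = cv2.pyrUp(fused, dstsize=cp[lvl].shape[1::-1])
        fused = np.maximum(fused, alpha * cp[lvl] + (1 - alpha) * dp[lvl])
    return fused / (fused.max() + 1e-6)
```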

Description

Technical Field

[0001] The invention belongs to the technical field of machine vision, and more specifically relates to an indoor scene contour detection method that fuses color and depth information.

Background Technique

[0002] In recent years, technologies and applications such as intelligent robots, virtual reality, and drones have matured and are rapidly being integrated into people's daily production and life. These applications rely on various machine vision techniques, including object detection and recognition and scene classification; a more fundamental one is extracting image edge information to form contours, which is one of the prerequisites for smart devices to understand a scene. How to obtain object contours that are more accurate and closer to human perception from the two-dimensional images acquired by sensing devices is currently a research hotspot in academia and industry. Among them, indoor robots, which are most closely related to human life, are ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/13, G06T7/50, G06T7/90, G06K9/46
Inventors: 郭红星, 潘澧, 卢涛, 夏涛, 孙伟平, 范晔斌
Owner: HUAZHONG UNIV OF SCI & TECH