
A depth imaging and information acquisition method based on binocular vision

A depth-information and binocular-vision technology, applied to image analysis, image enhancement, and image data processing, that addresses the difficulty existing algorithms have in adapting to two-dimensional parallax and achieves a wide range of applications.

Active Publication Date: 2021-07-13

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a depth imaging and information acquisition method based on binocular vision. The method overcomes the difficulty that existing binocular depth imaging algorithms have in handling two-dimensional parallax and improves the parallax computation capability of the algorithm itself: when two-dimensional parallax is present, the network graph, edge weights, and label update strategy allow a more accurate disparity value to be computed, and, combined with the camera mounting angle, mounting height, and GPS data, the physical scale of targets in the scene and their world position information can be obtained.
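As a rough illustration of how mounting geometry and a GPS fix can turn a measured depth into a world position and physical scale, the Python sketch below assumes a flat ground plane, a known downward tilt and compass heading, and a local equirectangular GPS offset. These simplifications and all parameter names are my own, not the patent's formulation; the mounting height (which would further constrain ground-plane targets via depth·sin(tilt) ≈ height) is omitted here.

```python
# Hedged sketch: slant depth + camera tilt/heading + GPS anchor -> approximate
# ground range and target coordinates. The flat-ground and small-offset
# approximations are assumptions, not the patent's actual formulas.
import math

EARTH_R = 6371000.0  # mean Earth radius in metres

def target_position(depth_m, tilt_deg, heading_deg, cam_lat, cam_lon):
    """Project a slant depth onto flat ground and offset the camera's GPS fix along its heading."""
    ground_range = depth_m * math.cos(math.radians(tilt_deg))   # horizontal distance to target
    d_north = ground_range * math.cos(math.radians(heading_deg))
    d_east = ground_range * math.sin(math.radians(heading_deg))
    lat = cam_lat + math.degrees(d_north / EARTH_R)
    lon = cam_lon + math.degrees(d_east / (EARTH_R * math.cos(math.radians(cam_lat))))
    return ground_range, lat, lon

# Example: a target 10 m away along the optical axis of a camera tilted 25 degrees
# downward and facing due east from an illustrative GPS fix.
print(target_position(10.0, 25.0, 90.0, 39.9042, 116.4074))
```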




Detailed Description of the Embodiments

[0032] The technical scheme is further described below in conjunction with the accompanying drawings and specific embodiments.

[0033] As shown in Figure 4, the flow of the binocular-vision-based depth imaging and information acquisition method is as follows. First, colour image acquisition is performed to obtain a colour image of the monitoring area, a suitable two-dimensional parallax range is set, and the maximum-flow monitoring value is initialized. A two-dimensional disparity label set is constructed according to the label-set rules; a pair of two-dimensional disparity labels is then iteratively selected to construct a network graph, and the edge weights are designed. Maximum-flow optimization is performed to compute the network's maximum flow and minimum cut, and the method judges whether the maximum flow has decreased relative to the maximum-flow monitoring value. If not, neither the maximum-flow monitoring value nor the labels are updated. If ...
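The Python sketch below mirrors the loop just described, under assumptions of my own: a simple absolute-difference matching cost, a fixed Potts-style smoothness weight, and networkx's minimum-cut solver stand in for the patent's undisclosed edge-weight design, and the disparity ranges are arbitrary. It is a minimal illustration of the source/sink label pairing and the max-flow monitoring rule, not the patented algorithm.

```python
# Hedged sketch of iterative two-dimensional disparity labelling via repeated
# source/sink (alpha, beta) min-cuts, accepting a move only when the max flow
# drops below the monitoring value, as paragraph [0033] describes.
import itertools
import numpy as np
import networkx as nx

def matching_cost(left, right, y, x, label):
    """Absolute intensity difference for one pixel under a 2-D disparity label (dy, dx)."""
    dy, dx = label
    yy, xx = y + dy, x + dx
    if 0 <= yy < right.shape[0] and 0 <= xx < right.shape[1]:
        return abs(float(left[y, x]) - float(right[yy, xx]))
    return 1e3  # large penalty when the label points outside the right image

def swap_move(left, right, labels, alpha, beta, smooth_w=5.0):
    """One source/sink label pair: build the graph, run min-cut, relabel the moved pixels."""
    h, w = left.shape
    movable = {(y, x) for y in range(h) for x in range(w) if labels[y][x] in (alpha, beta)}
    if not movable:
        return float('inf'), labels          # no pixel currently holds alpha or beta; skip
    G = nx.DiGraph()
    for (y, x) in movable:
        # t-links: cost of taking label alpha (source side) or beta (sink side)
        G.add_edge('s', (y, x), capacity=matching_cost(left, right, y, x, alpha))
        G.add_edge((y, x), 't', capacity=matching_cost(left, right, y, x, beta))
        # n-links: Potts-style smoothness between 4-neighbours that can also move
        for q in ((y + 1, x), (y, x + 1)):
            if q in movable:
                G.add_edge((y, x), q, capacity=smooth_w)
                G.add_edge(q, (y, x), capacity=smooth_w)
    flow, (src_side, _) = nx.minimum_cut(G, 's', 't')
    for (y, x) in movable:                   # the cut side determines the new label
        labels[y][x] = beta if (y, x) in src_side else alpha
    return flow, labels

def estimate_disparity(left, right, dy_range=(-1, 2), dx_range=(0, 5), sweeps=2):
    """Iterate over all label pairs, accepting a move only when the max flow decreases."""
    label_set = list(itertools.product(range(*dy_range), range(*dx_range)))
    labels = [[label_set[0]] * left.shape[1] for _ in range(left.shape[0])]
    best_flow = float('inf')                 # the "maximum-flow monitoring value"
    for _ in range(sweeps):
        for alpha, beta in itertools.combinations(label_set, 2):
            flow, new_labels = swap_move(left, right, [row[:] for row in labels], alpha, beta)
            if flow < best_flow:             # max flow decreased: update monitor and labels
                best_flow, labels = flow, new_labels
    return labels

# Usage on a tiny synthetic pair whose true horizontal shift is 2 pixels.
left = np.arange(64, dtype=float).reshape(8, 8)
right = np.roll(left, 2, axis=1)
print(estimate_disparity(left, right)[4][4])
```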



Abstract

The present invention provides a depth imaging and information acquisition method based on binocular vision. The method spatially calibrates the images captured by the binocular cameras through distortion correction and epipolar rectification, and constructs a two-dimensional parallax label set from the calibrated image pairs. In each iteration a different label combination is selected as the source and sink to construct a network graph with designed edge weights; the maximum-flow algorithm is used to obtain the minimum cut set of the network, and the two-dimensional disparity labels of the pixels are assigned according to the designed label update strategy. When the iteration terminates, a relatively dense disparity map is obtained, the depth of the scene is computed from the principle of triangulation ranging, and the result is further refined by collecting depth information at key points. Both the location and the physical scale of an object can thereby be obtained.
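For the triangulation-ranging step mentioned above, a minimal sketch is given below, assuming a rectified pair with focal length f in pixels and baseline B in metres (the constants are illustrative, not values from the patent): depth Z = f·B/d for horizontal disparity d.

```python
# Hedged sketch of disparity-to-depth conversion by triangulation, Z = f * B / d.
import numpy as np

def disparity_to_depth(disparity_px, focal_px=1200.0, baseline_m=0.12):
    """Convert a horizontal-disparity map (pixels) to a metric depth map."""
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)
    valid = d > 0                              # zero or negative disparity has no finite depth
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

# Example: a 48-pixel disparity at f = 1200 px, B = 0.12 m gives Z = 3.0 m.
print(disparity_to_depth([[48.0]]))
```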

Description

Technical Field
[0001] The invention belongs to the field of depth imaging, and in particular relates to a binocular-vision-based depth imaging and information acquisition method.
Background
[0002] Depth imaging is a technology that uses imaging equipment to extract scene depth information and express it as a depth image. Combined with target detection, target recognition, image segmentation and other techniques, it can be applied in intelligent video surveillance, driverless cars, intelligent transportation, security, robot automatic control and other fields. In practice it can be used for pedestrian detection in important places such as subways, stations, squares, shopping malls, parks and other crowded areas. The present invention proposes a binocular depth imaging method for the case of two-dimensional parallax, combined with an intelligent three-dimensional stereo monitoring camera, opt...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/80; G06T7/593; G06T7/70
CPC: G06T2207/10012; G06T2207/10028; G06T2207/20228; G06T2207/30208; G06T2207/30244; G06T7/593; G06T7/70; G06T7/85
Inventors: 魏运, 田青, 仝淑贞
Owner: 魏运