
A method and system for three-dimensional wireframe structure reconstruction fusing binocular camera and IMU positioning

A technology relating to binocular cameras and camera coordinate systems, applied in surveying and navigation, image enhancement, instruments, etc.; it addresses problems such as the inability to describe the environment effectively, the loss of key structural information, and high computational cost.

Active Publication Date: 2019-01-08
WUHAN UNIV

AI Technical Summary

Problems solved by technology

On the one hand, sparse maps built from feature points lose key structural information and therefore cannot describe the environment effectively.
Dense map construction has to process a huge amount of point cloud data, which is computationally expensive.
On the other hand, relying on a single visual sensor alone cannot meet the robustness and stability requirements of the system in some complex scenes (e.g., illumination changes, occlusion, highlights, repeated textures, and independently moving objects).
At present, there is no system capable of reconstructing environmental structure information with high efficiency and high precision in complex environments.



Examples


Detailed Description of the Embodiments

[0120] The technical solutions of the present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0121] The embodiment of the present invention provides a positioning and three-dimensional wireframe structure reconstruction system that fuses a binocular camera and an IMU, including the following modules:

[0122] The data acquisition module reads and preprocesses the low-frequency image stream captured by the binocular camera and the high-frequency data collected by the accelerometer and gyroscope in the inertial measurement unit;

[0123] The feature extraction and matching module extracts and matches feature points between the left and right images, computes the disparity, and recovers each point's three-dimensional position in the camera coordinate system; it also extracts key frames from the image stream together with the straight-line-segment features in those key frames, and then, based on the local features...
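The paragraph above is cut off in the source, but the disparity-to-depth step it names is standard rectified stereo geometry. The following is a minimal illustrative sketch of that step only, not the patented implementation; the focal length fx, principal point (cx, cy), baseline, and the example pixel values are assumed calibration numbers.

```python
import numpy as np

def triangulate_stereo(u_left, v_left, u_right, fx, cx, cy, baseline):
    """Recover a matched feature's 3D position in the left camera
    coordinate system from a rectified stereo pair.

    Assumes rectified images, so the match lies on the same image row
    and the disparity is purely horizontal: d = u_left - u_right.
    """
    disparity = u_left - u_right
    if disparity <= 0:
        return None  # no valid depth for zero or negative disparity
    z = fx * baseline / disparity      # depth from similar triangles
    x = (u_left - cx) * z / fx         # back-project pixel to metric X
    y = (v_left - cy) * z / fx         # assumes square pixels (fy ~= fx)
    return np.array([x, y, z])

# Example with made-up calibration values (fx in pixels, baseline in metres):
point = triangulate_stereo(u_left=400.0, v_left=250.0, u_right=380.0,
                           fx=700.0, cx=320.0, cy=240.0, baseline=0.12)
```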



Abstract

The invention relates to a method and system for three-dimensional wireframe structure reconstruction fusing binocular camera and IMU positioning. On the basis of binocular vision, the invention initializes and fuses inertial measurement information using a divide-and-conquer strategy, performs tracking, positioning and mapping, and can run robustly in indoor and outdoor environments and under complex motion. On the basis of accurate positioning, 3D wireframe reconstruction and iterative optimization are carried out from the poses of the key frames. Straight-line segments are matched by local features and spatial geometric constraints and back-projected into three-dimensional space. Using angle and distance constraints, the line segments are divided into different groups. Based on the grouping results, a fitting region is determined and the line segments are merged. Finally, a 3D wireframe structure is output. The invention fuses multi-source information to improve the stability and robustness of the system over traditional vision-based positioning and mapping methods. At the same time, line information is added to the key frames to sparsely express the structural characteristics of the three-dimensional environment, which improves computational efficiency.
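The abstract names angle and distance constraints for grouping the back-projected line segments but gives no formulas. As an illustrative sketch only (the thresholds, the greedy grouping strategy, and the endpoint-projection merge are assumptions, not the patent's exact procedure), grouping and merging 3D segments under such constraints could look like this:

```python
import numpy as np

def direction(seg):
    """Unit direction of a 3D segment given as a (start, end) pair of points."""
    d = seg[1] - seg[0]
    return d / np.linalg.norm(d)

def angle_deg(seg_a, seg_b):
    """Acute angle between two segment directions, in degrees."""
    c = abs(float(np.dot(direction(seg_a), direction(seg_b))))
    return float(np.degrees(np.arccos(np.clip(c, 0.0, 1.0))))

def midpoint_distance(seg_a, seg_b):
    """Distance between segment midpoints, used as a crude proximity measure."""
    mid_a = (seg_a[0] + seg_a[1]) / 2.0
    mid_b = (seg_b[0] + seg_b[1]) / 2.0
    return float(np.linalg.norm(mid_a - mid_b))

def group_segments(segments, angle_thresh_deg=5.0, dist_thresh=0.1):
    """Greedily put each segment into the first group whose representative
    satisfies both the angle and the distance constraint."""
    groups = []
    for seg in segments:
        for group in groups:
            rep = group[0]
            if (angle_deg(seg, rep) < angle_thresh_deg and
                    midpoint_distance(seg, rep) < dist_thresh):
                group.append(seg)
                break
        else:
            groups.append([seg])
    return groups

def merge_group(group):
    """Fit one segment to a group: project all endpoints onto the mean
    direction and keep the two extreme projections as the merged endpoints."""
    pts = np.vstack([p for seg in group for p in seg])
    centroid = pts.mean(axis=0)
    dirs = np.array([direction(seg) for seg in group])  # assumes consistent orientation
    mean_dir = dirs.mean(axis=0)
    mean_dir /= np.linalg.norm(mean_dir)
    t = (pts - centroid) @ mean_dir
    return centroid + t.min() * mean_dir, centroid + t.max() * mean_dir
```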

Description

Technical Field

[0001] The invention belongs to the field of computer vision, and in particular relates to a method and system for fusing binocular camera and IMU positioning and reconstructing a three-dimensional wireframe structure.

Background Technique

[0002] Vision-based simultaneous localization and mapping is an important research topic in computer vision. It means that a carrier equipped with a specific visual sensor can perceive and describe the surrounding environment and estimate its own motion without prior environmental information. A complete visual positioning and mapping system can be divided into five parts: acquisition of sensor data, front-end visual odometry, back-end nonlinear optimization, loop closure detection, and mapping. Compared with other measurement methods, visual measurement has the advantages of high precision, high efficiency, low cost, and simple system structure, and is widely used in robot navigatio...
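For orientation, the five-part decomposition mentioned in the background can be pictured as a simple processing loop. The skeleton below is only an organizational sketch; every class and method name is a hypothetical placeholder, not an interface defined by the patent.

```python
# Organizational sketch of the five classical visual SLAM stages named above;
# all components and methods here are hypothetical placeholders.
class VisualSLAMPipeline:
    def __init__(self, sensor, frontend, backend, loop_detector, mapper):
        self.sensor = sensor                # 1. acquisition of sensor data
        self.frontend = frontend            # 2. front-end visual odometry
        self.backend = backend              # 3. back-end nonlinear optimization
        self.loop_detector = loop_detector  # 4. loop closure detection
        self.mapper = mapper                # 5. mapping

    def step(self):
        frame = self.sensor.read()                    # images (and IMU) for one step
        pose, landmarks = self.frontend.track(frame)  # incremental motion estimate
        if self.loop_detector.check(frame):           # revisiting a known place?
            self.backend.add_loop_constraint(frame)
        self.backend.optimize(pose, landmarks)        # refine poses and structure
        self.mapper.update(pose, landmarks)           # extend the map
```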

Claims


Application Information

IPC(8): G06T7/73; G06T5/00; G01C21/16; G01C21/20
CPC: G06T7/73; G01C21/16; G01C21/20; G01C21/206; G06T2207/10012; G06T5/80
Inventor: 王星博, 石瑞星
Owner: WUHAN UNIV