Dense reconstruction method of UAV scene based on VI-SLAM and depth estimation network

A VI-SLAM and depth-estimation technology, applied in the field of virtual reality, that addresses the problem that large-scale scenes cannot be reconstructed quickly and densely, achieving good generalization ability, fast operating efficiency, and high accuracy.

Active Publication Date: 2022-06-07
BEIHANG UNIV


Problems solved by technology

[0006] The technology of the present invention overcomes the problem in the prior art that large-scale scenes cannot be quickly and densely reconstructed, and provides a dense reconstruction method for unmanned aerial vehicle (UAV) scenes based on VI-SLAM and a depth estimation network: by tracking the camera pose in real time and estimating scene depth from a single image, it achieves faster operating efficiency during dense reconstruction.




Embodiment Construction

[0060] The present invention is described in further detail below in conjunction with the accompanying drawings and embodiments:

[0061] The basic operation of the UAV scene reconstruction method of the present invention is to photograph the three-dimensional environment with a UAV equipped with an IMU, transmit the captured information to the back end for processing, and output a dense reconstructed point cloud rendering of the UAV scene.

[0062] As shown in Figure 1, the steps of the UAV 3D reconstruction method based on VI-SLAM and a depth estimation network of the present invention are as follows:

[0063] (1) Fix the inertial measurement unit (IMU) to the drone, and calibrate the intrinsic parameters and extrinsic parameters of the drone's monocular camera as well as the camera–IMU extrinsic parameters;

[0064] (2) Use the UAV's monocular camera and IMU to collect an image sequence and IMU measurements of the UAV scene;

[0065] (3) Use VI-SLAM to process the...
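A minimal runnable sketch of how such a pipeline can be wired together is shown below. Every function here (`run_vi_slam`, `generate_right_view`, `estimate_depth`, `backproject`, `register`) is a hypothetical stub standing in for the SLAM module and networks described in this patent, filled with dummy data, so only the control flow is meaningful:

```python
import numpy as np

# Hypothetical stand-ins for the components described in the text.
def run_vi_slam(image, imu):             # step (3): pose with metric scale
    return np.eye(4)                     # dummy camera-to-world pose

def generate_right_view(left):           # step (4a): viewpoint generation network
    return left                          # dummy: echo the left view back

def estimate_depth(left, right):         # step (4b): depth estimation network
    return np.full(left.shape[:2], 2.0)  # dummy constant-depth map

def backproject(depth, pose):            # step (5): depth + pose -> local cloud
    h, w = depth.shape                   # (pose is unused in this stub)
    pts = np.stack(list(np.meshgrid(np.arange(w), np.arange(h))) + [depth], -1)
    return pts.reshape(-1, 3)

def register(global_cloud, local_cloud): # step (6): optimize, register, fuse
    return np.vstack([global_cloud, local_cloud])

cloud = np.empty((0, 3))
frames = [(np.zeros((4, 4, 3)), None)] * 3   # dummy (image, IMU) sequence
for image, imu in frames:
    pose = run_vi_slam(image, imu)
    depth = estimate_depth(image, generate_right_view(image))
    cloud = register(cloud, backproject(depth, pose))
```

After the loop, `cloud` accumulates one local point cloud per frame; a real system would deduplicate and optimize during registration rather than simply concatenating.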



Abstract

The present invention relates to a dense reconstruction method for unmanned aerial vehicle (UAV) scenes based on VI-SLAM and depth estimation: (1) fix the IMU to the drone, and calibrate the camera intrinsic parameters, extrinsic parameters, and camera–IMU extrinsic parameters; (2) use the UAV's monocular camera and IMU to collect an image sequence and IMU measurements of the UAV scene; (3) use VI-SLAM to process the images and IMU measurements to obtain camera poses with scale information; (4) input the monocular image as the original view into a viewpoint generation network to obtain the right view, then input the original view and the right view into the depth estimation network to obtain the depth map of the image; (5) combine the camera pose obtained in step (3) with the depth map from step (4) to obtain a local point cloud; (6) after point cloud optimization and registration, fuse the SLAM tracking trajectory with the local point clouds to obtain a dense point cloud model of the UAV scene.
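Step (5) above is a standard pinhole back-projection: each pixel is lifted into the camera frame using the calibrated intrinsics and its estimated depth, then transformed into the world frame with the SLAM pose. A minimal sketch, assuming a metric depth map, an intrinsic matrix `K`, and a camera-to-world pose `T_wc` (the function name and tiny example values are illustrative, not from the patent):

```python
import numpy as np

def depth_to_world_points(depth, K, T_wc):
    """Back-project a depth map into world-frame 3D points.

    depth : (H, W) depth map in metres (from the depth estimation network)
    K     : (3, 3) camera intrinsic matrix
    T_wc  : (4, 4) camera-to-world pose (from VI-SLAM, metric scale)
    Returns an (H*W, 3) array of world-frame points.
    """
    h, w = depth.shape
    # Pixel grid in homogeneous coordinates, one column per pixel
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # (3, N)
    # Back-project: X_cam = depth * K^-1 * [u, v, 1]^T
    rays = np.linalg.inv(K) @ pix                # (3, N) unit-depth rays
    pts_cam = rays * depth.reshape(1, -1)        # (3, N) camera-frame points
    # Transform into the world frame with the SLAM pose
    pts_h = np.vstack([pts_cam, np.ones((1, pts_cam.shape[1]))])  # (4, N)
    return (T_wc @ pts_h)[:3].T                  # (N, 3)

# Tiny example: 2x2 constant-depth map, identity pose
K = np.array([[100.0, 0.0, 1.0], [0.0, 100.0, 1.0], [0.0, 0.0, 1.0]])
depth = np.full((2, 2), 5.0)
pts = depth_to_world_points(depth, K, np.eye(4))
```

With an identity pose the returned points simply sit 5 m in front of the camera; in the full pipeline each frame's `T_wc` places its local cloud consistently in the shared world frame before registration in step (6).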

Description

Technical field

[0001] The invention relates to a dense reconstruction method for unmanned aerial vehicle (UAV) scenes based on VI-SLAM (visual-inertial simultaneous localization and mapping) and a depth estimation network, and belongs to the field of virtual reality.

Background technique

[0002] 3D reconstruction refers to establishing a mathematical model of 3D objects suitable for computer representation and processing. It is the basis for processing, manipulating, and analyzing their properties in a computer environment. With the increasing demand for 3D reconstruction and the continuing popularization of UAV aerial photography, point cloud reconstruction from UAV aerial images has become a research hotspot.

[0003] Traditional dense reconstruction methods based on depth cameras or SfM require more complex hardware equipment and relatively large computing resources, which cannot well meet the needs of lightweight equipment and rapid reconstruction in lar...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T17/00; G06T7/33; G06T7/55; G06V10/46
CPC: G06T17/00; G06T7/33; G06T7/55; G06V10/464; Y02T10/40
Inventors: 周忠, 吕文官, 温佳伟, 闫飞虎, 柳晨辉
Owner: BEIHANG UNIV