Incremental remote sensing image three-dimensional reconstruction method based on space occupancy probability fusion

A space-occupancy and remote sensing image technology, applied in image analysis, image enhancement, image data processing, etc. It addresses problems such as the inability to optimize a previously reconstructed building model and the low precision of three-dimensional building structure features, and achieves the effect of saving manpower and material resources.

Active Publication Date: 2021-07-13
HARBIN ENG UNIV

AI Technical Summary

Problems solved by technology

[0005] The present invention aims to solve the problem that, when a single image is used for three-dimensional reconstruction, the building model obtained from the reconstruction cannot be optimized by subsequently adding further images of the building, which results in low accuracy of the structural features of the reconstructed three-dimensional building.



Examples


Specific Embodiment 1

[0017] Specific Embodiment 1: This embodiment is described with reference to Figure 1. The incremental oblique remote sensing image three-dimensional reconstruction method based on spatial occupancy probability feature fusion described in this embodiment includes:

[0018] Step 1. From multiple remote sensing images containing multiple buildings, select all building target images to be reconstructed, and crop each target building as a separate target to obtain remote sensing images of each single target building from multiple different viewing angles. Based on the existing model data, use modeling tools to manually model each target building to obtain a building model that corresponds one-to-one to each building, and use all building remote sensing images together with the corresponding building models as the training data. The centroid of the building should be at the center of the image, and the proportion of the building in the image should be mor...
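As an illustration of the cropping rule described in Step 1, the following is a hypothetical sketch only: the function name, the building mask, the margin, and the minimum fill fraction are assumptions, not values given by the patent.

```python
import numpy as np

def crop_centered_on_building(image, building_mask, margin=1.1, min_fill=0.3):
    """Crop a square patch centered on the building centroid in which the
    building occupies at least `min_fill` of the patch area."""
    ys, xs = np.nonzero(building_mask)
    cy, cx = int(ys.mean()), int(xs.mean())                    # building centroid
    side = int(margin * max(ys.max() - ys.min() + 1,
                            xs.max() - xs.min() + 1))          # square patch containing the building
    half = side // 2
    y0, x0 = max(cy - half, 0), max(cx - half, 0)
    patch = image[y0:y0 + side, x0:x0 + side]
    mask_patch = building_mask[y0:y0 + side, x0:x0 + side]
    fill = mask_patch.sum() / mask_patch.size                  # fraction of the patch covered by the building
    if fill < min_fill:
        raise ValueError("building occupies too small a share of the patch")
    return patch
```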

Specific Embodiment 2

[0023] Specific Embodiment 2: This embodiment differs from Embodiment 1 in that the expression for calculating the 3D-IoU in Step 4 is as follows:

[0024] 3D-IoU = |A ∩ B| / |A ∪ B|

[0025] where A and B are the space occupancy models of the building.
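A minimal sketch of this 3D-IoU computation, assuming the two space occupancy models A and B have already been discretized to boolean voxel grids of the same shape (the voxelization step and the grid resolution are not specified here and are assumed):

```python
import numpy as np

def iou_3d(occ_a: np.ndarray, occ_b: np.ndarray) -> float:
    """3D-IoU of two space occupancy models given as boolean voxel grids."""
    intersection = np.logical_and(occ_a, occ_b).sum()
    union = np.logical_or(occ_a, occ_b).sum()
    return float(intersection) / float(union) if union > 0 else 0.0
```

Two identical occupancy grids give a 3D-IoU of 1.0; disjoint grids give 0.0.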

[0026] Other steps and parameters are the same as those in Embodiment 1.

Specific Embodiment 3

[0027] Specific Embodiment 3: This embodiment differs from Embodiment 1 or 2 in that, in Step 4, the three-dimensional information of all space occupancy models is fused. During fusion, the fusion weights are set optimally: according to the viewing-angle information and position information of the building obtainable from each image, a corresponding weight μ is assigned to the space occupancy probability of each sampling point, and the fused space occupancy probability of each sampling point is calculated by the following formula:

[0028] P_b = μ_1·P_1 + μ_2·P_2 + … + μ_n·P_n

[0029] where P_b denotes the space occupancy probability after model fusion, μ_n denotes the weight corresponding to the space occupancy probability of the n-th model, and μ_1 + μ_2 + … + μ_n = 1.
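A minimal sketch of the fusion formula above, assuming each of the n single-image reconstructions provides an array of occupancy probabilities P_i evaluated at a common set of sampling points (the explicit renormalization of the weights is an added convenience, not stated in the text):

```python
import numpy as np

def fuse_occupancy(probs, weights):
    """Compute P_b = mu_1*P_1 + mu_2*P_2 + ... + mu_n*P_n with the weights summing to 1.

    probs:   list of n arrays of per-sampling-point occupancy probabilities
    weights: list of n non-negative weights mu_i
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()                  # enforce mu_1 + ... + mu_n = 1
    fused = np.zeros_like(probs[0], dtype=float)
    for mu_i, p_i in zip(weights, probs):
        fused += mu_i * p_i                            # weighted accumulation of each model
    return fused
```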

[0030] Other steps and parameters are the same as those in Embodiment 1 or Embodiment 2.



Abstract

The invention discloses an incremental remote sensing image three-dimensional reconstruction method based on space occupancy probability fusion, and belongs to the technical field of remote sensing image three-dimensional reconstruction. The method addresses the problem that, when three-dimensional reconstruction is carried out from a single image, the reconstructed building model cannot be optimized by subsequently adding further images of the building, so the obtained three-dimensional building structure features have low precision. The method comprises: obtaining the building target images to be reconstructed and a building model corresponding to each building; inputting the images and models into an Onet single-image reconstruction network for training to obtain a reconstruction network parameter model; inputting a target building image to be reconstructed into the model, obtaining the boundary of the space occupancy model, and forming the space occupancy model of the building; performing three-dimensional information fusion on all the space occupancy models to obtain a final space occupancy model; and generating a three-dimensional surface mesh to obtain a three-dimensional model of the building, thereby achieving three-dimensional reconstruction of the building. The method is used for three-dimensional building reconstruction.
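Read end to end, the abstract describes per-image occupancy prediction followed by weighted fusion and surface extraction. The following is a hypothetical sketch only: the occupancy_net callable, the unit-cube sampling grid, the weights, and the use of marching cubes from scikit-image for the surface mesh are all assumptions, not interfaces given by the patent.

```python
import numpy as np
from skimage.measure import marching_cubes   # surface extraction from the fused occupancy grid

def reconstruct_building(images, weights, occupancy_net, grid_res=64, threshold=0.5):
    """Incremental reconstruction: per-image occupancy prediction, weighted fusion, meshing."""
    # regular grid of sampling points in the unit cube (assumed query convention)
    lin = np.linspace(0.0, 1.0, grid_res)
    points = np.stack(np.meshgrid(lin, lin, lin, indexing="ij"), axis=-1).reshape(-1, 3)

    # one space occupancy model per input image
    per_image_probs = [occupancy_net(img, points).reshape(grid_res, grid_res, grid_res)
                       for img in images]

    # weighted fusion of all space occupancy models (P_b in Embodiment 3)
    weights = np.asarray(weights, dtype=float)
    fused = sum(mu * p for mu, p in zip(weights / weights.sum(), per_image_probs))

    # generate the three-dimensional surface mesh of the building
    verts, faces, normals, values = marching_cubes(fused, level=threshold)
    return verts, faces
```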

Description

technical field

[0001] The invention relates to a method for three-dimensional reconstruction of incremental oblique remote sensing images based on spatial occupancy probability feature fusion, and belongs to the technical field of three-dimensional reconstruction of remote sensing images.

Background technique

[0002] With the improvement of China's remote sensing technology and the strengthening of remote sensing data acquisition capabilities, 3D building model reconstruction based on remote sensing information has become an important part of digital military battlefield simulation. Research on building reconstruction technology, which uses high-resolution remote sensing images acquired by remote sensing platforms such as satellites, aircraft and drones to quickly, thoroughly and accurately acquire and reconstruct 3D geographic information of the battlefield, will help build a transparent, detailed and realistic digita...

Claims


Application Information

IPC(8): G06T17/00, G06T7/33, G06T7/10, G06T5/50
CPC: G06T17/00, G06T7/344, G06T7/10, G06T5/50, G06T2207/10032, G06T2207/20081, G06T2207/20221, G06T2207/20132
Inventor: 闫奕名, 曹振赫, 宿南, 冯收, 赵春晖
Owner HARBIN ENG UNIV