
Three-dimensional ground object automatic extraction and scene reconstruction method

A technology for automatic extraction and scene reconstruction, applied to scene recognition, 3D modeling, instruments, etc.; it addresses problems such as poor flexibility and difficulty of practical implementation.

Inactive Publication Date: 2017-12-29
FUDAN UNIV
Cites: 3 · Cited by: 34

AI Technical Summary

Problems solved by technology

The former approach is less flexible and difficult to implement in practical applications; for example, it is difficult to obtain the actual battlefield environment in advance. The latter, based on model simulation, offers good data flexibility, but the difference between the simulated electromagnetic environment and the real one is determined by the modeling method and its accuracy.




Embodiment Construction

[0055] In the present invention, land-cover classification uses map data together with land-cover type data. Nearest-neighbour classification assigns each pixel of the map data to a class, using map data with feature labels. The cluster centres of the nearest-neighbour algorithm are predetermined; the colour corresponding to each category in one example of the invention is given in Table 1, where the map data used is Google Maps data. Applying nearest-neighbour classification to the map data, the main targets marked in it, such as roads, buildings, water areas, and airports, can be quickly extracted, giving a preliminary extraction of land-cover types. Land-cover information is then used to finely classify areas marked in the map data as green space or open space, achieving high-precision classification of land types. Through the combination of map data and land surface ty...
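The nearest-neighbour step described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the RGB cluster centres below are hypothetical stand-ins, since the actual colours of Table 1 are not reproduced in this excerpt.

```python
import numpy as np

# Hypothetical RGB cluster centres for the land-cover classes named in the
# text (roads, buildings, water); the real Table 1 colours are not given here.
CLASS_CENTRES = {
    "road":     (255, 255, 255),
    "building": (232, 232, 230),
    "water":    (170, 218, 255),
    "green":    (200, 236, 200),
}

def classify_pixels(image):
    """Assign each RGB pixel to the nearest predetermined cluster centre.

    image: (H, W, 3) uint8 array of map-tile pixels.
    Returns the class-name list and an (H, W) array of class indices.
    """
    names = list(CLASS_CENTRES)
    centres = np.array([CLASS_CENTRES[n] for n in names], dtype=float)  # (K, 3)
    pixels = image.reshape(-1, 3).astype(float)                         # (N, 3)
    # Squared Euclidean distance from every pixel to every centre.
    d2 = ((pixels[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)  # (N, K)
    labels = d2.argmin(axis=1).reshape(image.shape[:2])
    return names, labels

# Tiny demo: a 1x2 "map" with one water-coloured and one road-coloured pixel.
demo = np.array([[[170, 218, 255], [255, 255, 255]]], dtype=np.uint8)
names, labels = classify_pixels(demo)
print([names[i] for i in labels.ravel()])  # ['water', 'road']
```

Because the centres are fixed in advance, no training pass is needed; each pixel is labelled by a single distance computation, which is what makes the preliminary extraction fast.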



Abstract

The invention belongs to the technical field of geographic information systems and remote sensing, and is specifically a method for automatic extraction of three-dimensional ground objects and scene reconstruction based on geographic information and remote sensing data. The method mainly comprises: a ground-surface coverage type detection step, extracting water areas, buildings, roads, and vegetation from map data and optical image data; a building detection and height estimation step, extracting the geographic footprint of a building and estimating its height from an optical image; a vegetation detection and extraction step, detecting vegetation and estimating its height with a neural network; and a three-dimensional ground-surface model building step, constructing a three-dimensional surface from a digital elevation map and embedding the ground objects into it. A three-dimensional scene automatically reconstructed from global geographic information and remote sensing data in this way is realistic and accurate, can be updated quickly, can support VR / AR applications (location services and virtual guides), and can also serve as a background environment model for electromagnetic environment simulation.
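The final step of the pipeline, building a surface mesh from a digital elevation map and embedding ground objects, can be sketched as below. This is an assumed, simplified illustration: the function names are hypothetical, and raising vertices inside a footprint is a crude stand-in for embedding a full extruded building model.

```python
import numpy as np

def dem_to_mesh(dem, cell_size=1.0):
    """Triangulate a regular DEM grid into a 3-D surface mesh.

    dem: (H, W) array of elevations; cell_size: ground spacing in metres.
    Returns (vertices, triangles): vertices is (H*W, 3), and triangles
    indexes into it, two triangles per grid cell.
    """
    h, w = dem.shape
    ys, xs = np.mgrid[0:h, 0:w]
    vertices = np.column_stack([xs.ravel() * cell_size,
                                ys.ravel() * cell_size,
                                dem.ravel()])
    tris = []
    for r in range(h - 1):
        for c in range(w - 1):
            i = r * w + c
            tris.append((i, i + 1, i + w))          # upper-left triangle
            tris.append((i + 1, i + w + 1, i + w))  # lower-right triangle
    return vertices, np.array(tris)

def embed_building(vertices, footprint_mask, height):
    """Raise mesh vertices inside a detected building footprint by the
    estimated building height (a placeholder for object embedding)."""
    out = vertices.copy()
    out[footprint_mask.ravel(), 2] += height
    return out

# A flat 3x3 DEM yields 9 vertices and 2x2 cells * 2 = 8 triangles.
dem = np.zeros((3, 3))
verts, tris = dem_to_mesh(dem, cell_size=10.0)
print(len(verts), len(tris))  # 9 8
```

The per-cell split into two triangles is the standard way to turn a regular elevation grid into a watertight surface; detected footprints and estimated heights from the earlier steps then drive the embedding.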

Description

Field of the invention [0001] The invention belongs to the technical field of geographic information systems and remote sensing, and in particular relates to a method for automatic extraction of three-dimensional ground objects and scene reconstruction based on geographic information and remote sensing data. Background [0002] In the era of big data, geographic information sources such as online maps, satellite images, and remote sensing data provide multi-source information on the global surface. Using these data, methods for automatic extraction of 3D ground objects and scene reconstruction can be developed. Ground-object extraction and scene reconstruction have important applications in many fields, such as virtual reality / augmented reality and electromagnetic environment simulation. A 3D scene automatically reconstructed from global geographic information and remote sensing data can be used to support virtual reality / augmented reality applications such as l...

Claims


Application Information

IPC(8): G06K 9/00; G06K 9/46; G06T 17/05
CPC: G06T 17/05; G06V 20/13; G06V 10/44
Inventors: 徐丰, 李索, 王海鹏
Owner: FUDAN UNIV