
Method for producing high-precision map based on live-action three-dimensional model

A live-action three-dimensional model, high-precision mapping technology, applied in 3D modeling, image data processing, and character and pattern recognition. It addresses the problems of low plane precision in DOM-based road networks, the inability to reflect elevation information, and high manpower consumption, achieving the effects of overcoming the difficult acquisition and low accuracy of the roll angle, improving the accuracy of image recognition and segmentation, and improving automation.

Active Publication Date: 2019-10-25
TERRA DIGITAL CREATING SCI & TECH (BEIJING) CO LTD

AI Technical Summary

Problems solved by technology

[0005] The existing road network extraction methods still have considerable shortcomings: 1. Road network extraction based on DOM has low plane accuracy and cannot reflect elevation information at all; 2. Road network extraction based on stereo image pairs requires more manpower and is less efficient.



Examples


Example I

[0077] Each pixel in the recognition result is assigned a semantic label and an instance ID;

[0078] The pixels with the same semantic label and instance ID are grouped into the same object, and the elements of the real-world 3D road network model are segmented.
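The grouping described in paragraphs [0077] and [0078] can be sketched as follows. This is a minimal illustration, not the patent's implementation: the label arrays, function name, and data shapes are all hypothetical, and only the core idea is shown — pixels sharing the same (semantic label, instance ID) pair form one segmented object.

```python
import numpy as np

# Hypothetical panoptic output: per-pixel semantic labels and instance IDs.
semantic = np.array([[1, 1, 2],
                     [1, 2, 2],
                     [3, 3, 2]])
instance = np.array([[0, 0, 1],
                     [0, 1, 1],
                     [0, 0, 1]])

def group_objects(semantic, instance):
    """Group pixels sharing the same (semantic label, instance ID) pair
    into one object; each object is a list of pixel coordinates."""
    objects = {}
    for (r, c), label in np.ndenumerate(semantic):
        key = (int(label), int(instance[r, c]))
        objects.setdefault(key, []).append((r, c))
    return objects

objs = group_objects(semantic, instance)
# Each key identifies one segmented element of the road-network model.
for key, pixels in sorted(objs.items()):
    print(key, len(pixels))
# (1, 0) 3
# (2, 1) 4
# (3, 0) 2
```

In a real panoptic pipeline the two arrays would come from the segmentation network's output head; here they are hand-written to keep the sketch self-contained.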

[0079] In this embodiment, the resource pool contains a collection of all road elements, including graphic images and their individual attributes. The elements of the segmented live-action 3D road network model are matched one by one against the existing graphic image elements in the resource pool. The matching rule is that when the overlap ratio between the segmentation window predicted from the live-action 3D road network model and the original image marker window in the resource pool is greater than 0.5, the two are matched; the matching result is obtained and retained.
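The overlap-ratio rule in [0079] can be sketched as an intersection-over-union test on axis-aligned windows. This is an assumption about what "overlap ratio" means (the patent does not define the formula); the window coordinates, element names, and the first-match strategy below are all illustrative.

```python
def overlap_ratio(a, b):
    """Intersection-over-union of two axis-aligned windows (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda w: (w[2] - w[0]) * (w[3] - w[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def match_to_pool(predicted, pool, threshold=0.5):
    """Match each predicted segmentation window to the first resource-pool
    window whose overlap ratio exceeds the threshold; else None."""
    return {
        name: next(
            (pname for pname, pwin in pool.items()
             if overlap_ratio(win, pwin) > threshold),
            None,
        )
        for name, win in predicted.items()
    }

predicted = {"lane_1": (0, 0, 10, 4)}
pool = {"lane_marking_A": (1, 0, 11, 4), "sign_B": (20, 20, 25, 25)}
print(match_to_pool(predicted, pool))  # {'lane_1': 'lane_marking_A'}
```

Here the two lane windows overlap with ratio 36/44 ≈ 0.82 > 0.5, so they match, while the distant sign window does not.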

[0080] In this embodiment, in step S8, the road elements are classified into three types, which are respectively:

[0081] Lane model; i...



Abstract

The invention provides a method for producing a high-precision map based on a live-action three-dimensional model. The method comprises the following steps: performing view-angle transformation on the live-action three-dimensional model to acquire multi-view images; carrying out road network feature identification using a multi-view image identification method; segmenting the identification result with an image panoptic segmentation method; matching the elements of the segmented live-action three-dimensional road network model against existing graphic image elements in a resource pool; vectorizing the matching result and the segmented road network model elements; and finely classifying the retained road elements to obtain the refined classification result and, finally, the high-precision map. The method avoids the problem that field collection is affected by external conditions such as weather, road conditions, and moving objects, and greatly reduces the production cost of a high-precision map. The live-action three-dimensional model serves as a visual mapping of the real world; while breaking through the existing high-precision map production mode, it makes the precision and detail expression of the map more complete.
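The pipeline the abstract describes can be summarized as a sequence of stages. The skeleton below is purely illustrative: every function is a placeholder stub invented for this sketch (none of the names or data types come from the patent), and each stage only threads a description string through so the ordering of the steps is visible.

```python
def render_multi_view(model):
    # Step 1: view-angle transformation of the 3D model into images.
    return [f"{model}/view{i}" for i in range(3)]

def recognize_road_network(images):
    # Step 2: multi-view road-network feature identification.
    return {"features": images}

def panoptic_segment(recognition):
    # Step 3: panoptic segmentation of the identification result.
    return [f"segment({f})" for f in recognition["features"]]

def match_and_vectorize(segments):
    # Steps 4-5: match segments against the resource pool, then vectorize.
    return [f"vector({s})" for s in segments]

def classify_road_elements(vectors):
    # Step 6: fine classification of the retained road elements.
    return {"hd_map": vectors}

def produce_hd_map(model):
    images = render_multi_view(model)
    recognition = recognize_road_network(images)
    segments = panoptic_segment(recognition)
    vectors = match_and_vectorize(segments)
    return classify_road_elements(vectors)

hd_map = produce_hd_map("live_action_3d_model")
print(len(hd_map["hd_map"]))  # 3
```

The point of the sketch is the data flow: each stage consumes the previous stage's output, so the map is derived entirely from the rendered model views rather than from field collection.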

Description

Technical field

[0001] The invention relates to the field of high-precision map production, in particular to a method for producing a high-precision map based on a live-action three-dimensional model.

Background art

[0002] With the continuous development of high-resolution remote sensing imagery and aerial photography acquisition methods, road network extraction from aerial photography has also made great progress. At present, the main road network extraction methods are: road network extraction based on DOM, and road network extraction based on stereo image pairs.

[0003] Road network extraction based on DOM comprises the following steps: 1. Acquire aerial images; 2. Conduct aerial triangulation; 3. Generate a DOM (digital orthophoto map) from the results of the second step; 4. Manually collect and sketch the road network according to the DOM.

[0004] Road network extraction based on stereo image pairs includes the following...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00 G06K9/34 G06K9/62 G06T17/00
CPC: G06T17/00 G06V20/588 G06V20/56 G06V10/267 G06F18/214
Inventor: 刘俊伟, 黄栋, 李嘉榆
Owner: TERRA DIGITAL CREATING SCI & TECH (BEIJING) CO LTD