
A Navigation Method Based on Real 3D Map

A navigation method based on three-dimensional map technology, applied in the field of navigation. It addresses the problems that descriptions of user preferences are difficult to make accurate and complete, that existing navigation modes cannot provide an in-depth local experience, and that current approaches cannot meet destinationless navigation needs; it achieves the effect of reducing the time required to view the video.

Active Publication Date: 2022-05-03
TERRA DIGITAL CREATING SCI & TECH (BEIJING) CO LTD

AI Technical Summary

Problems solved by technology

Current navigation methods often fail to meet destinationless navigation needs.
In addition, navigating according to travel notes or routes planned by others cannot satisfy a user's individual needs.
[0005] Even when users have preferences, and navigation map data classifies and labels various user preferences, these preference descriptions are in practice rarely accurate or complete, resulting in a poor user experience.
Moreover, navigation modes based on shortest distance or fastest time are usually insufficient for an in-depth local experience.



Examples


Embodiment 1

[0026] Embodiment 1: As shown in Figure 1, a navigation method comprises the following steps:

[0027] (101) The server sets nodes on the roads of a real-scene 3D map and, from a pedestrian's perspective, generates a first video representing the real scene between each pair of adjacent nodes, thereby forming in the server database a plurality of first videos representing the real scene between adjacent nodes. The nodes are of three kinds: important geographic coordinates, road intersections, and points at predetermined distances. Each first video is processed into a second video of lower image quality; in both the first and second videos, the image displayed in each frame corresponds to the geographic location depicted in that frame;

[0028] (102) When the user turns on the navigation device and starts navigation, the starting point is set either by taking the user's current location as the starting point or by setting th...
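Steps (101)–(102) describe a data model of geo-tagged node-to-node videos. The following is a minimal Python sketch of that model under stated assumptions; the names `Node`, `VideoSegment`, and `downscale` are illustrative inventions, not identifiers from the patent, and frame content is omitted so only the per-frame geolocation mapping is shown.

```python
from dataclasses import dataclass

@dataclass
class Node:
    """A node on a real-scene 3D map road: an important geographic
    coordinate, a road intersection, or a point at a predetermined spacing."""
    node_id: str
    lat: float
    lon: float
    kind: str  # "landmark" | "intersection" | "spacing"

@dataclass
class VideoSegment:
    """Pedestrian-perspective video between two adjacent nodes.
    Each frame index maps to the geographic location it depicts."""
    start: Node
    end: Node
    frame_locations: list  # (lat, lon) per frame, in playback order
    quality: str = "first"  # "first" = full quality, "second" = reduced

    def downscale(self):
        """Produce the reduced-quality 'second video', which keeps the
        same per-frame geolocation mapping as the first video."""
        return VideoSegment(self.start, self.end,
                            list(self.frame_locations), "second")

# Illustrative usage: one road section between two adjacent nodes.
a = Node("A", 39.900, 116.400, "intersection")
b = Node("B", 39.910, 116.400, "landmark")
seg = VideoSegment(a, b, [(39.900, 116.400),
                          (39.905, 116.400),
                          (39.910, 116.400)])
low = seg.downscale()
```

The key invariant the sketch preserves is that downscaling changes only image quality, never the frame-to-location correspondence that later steps rely on.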

Embodiment 2

[0034] Embodiment 2: As shown in Figure 2, a navigation method comprises the following steps:

[0035] (201) The server sets nodes on the roads of a street-view map and groups the street-view photos stored in the map database according to those road nodes; each street-view photo corresponds to the geographic location where it was taken. The nodes include important geographic coordinates, road intersections, and points at predetermined distances. Each group of street-view photos represents the real street view visible along the road section between two adjacent nodes, and each group is spliced together in order of geographic location to form a first video expressing a continuous street scene. The first video is processed into a second video of lower image quality; the photo displayed in each frame of the first and second videos, and the geographic location data corresponding to this ...
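Step (201)'s grouping-and-splicing can be sketched in a few lines of Python. This is an illustrative assumption, not the patent's implementation: photos are reduced to `(photo_id, position)` pairs where `position` is a distance along the road, and node positions are a sorted list, so each half-open interval between adjacent nodes defines one group whose sort order is the splicing order of the first video.

```python
def group_photos_by_segment(photos, node_positions):
    """Group street-view photos by the road section between adjacent
    nodes, sorted by position along the road -- the splicing order used
    to form a 'first video' per section.

    photos: list of (photo_id, position) pairs
    node_positions: sorted positions of the road's nodes
    """
    groups = []
    for start, end in zip(node_positions, node_positions[1:]):
        # Photos falling in [start, end), ordered by geographic position.
        section = sorted((p for p in photos if start <= p[1] < end),
                         key=lambda p: p[1])
        groups.append(section)
    return groups

# Illustrative usage: two road sections, nodes at 0 m, 200 m, 400 m.
photos = [("p3", 250.0), ("p1", 10.0), ("p2", 120.0), ("p4", 380.0)]
groups = group_photos_by_segment(photos, [0.0, 200.0, 400.0])
```

Sorting within each group rather than globally mirrors the patent's per-section splicing: each group independently yields one continuous-street-scene video.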



Abstract

A navigation method based on a real-scene 3D map, comprising: (1) a server sets nodes on the roads of a real-scene 3D map and generates, from a pedestrian's perspective, first videos representing the real scene between pairs of adjacent nodes; each first video is processed into a second video of reduced image quality; (2) the user sends location information, an expected maximum distance, and user-preference information to the server; (3) the server generates multiple routes and, for each, combines multiple segments of the second video to form a third video; (4) the user previews the real scene of each route through the third video and selects a real-scene image in the third video as a waypoint; (5) a new route is formed according to the waypoint, and navigation begins. The navigation method is convenient to operate, presents realistic scenes, and is well suited to satisfying users' individual needs in unfamiliar cities.
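Steps (3)–(5) of the abstract hinge on one mechanism: concatenating second-video segments into a route preview while keeping each preview frame traceable to a geographic location, so the frame the user picks can become a waypoint. A minimal sketch under stated assumptions follows; the function name, the `(node, node)` dictionary keying, and the `frames`/`locations` fields are all hypothetical, chosen only to illustrate the mechanism.

```python
def compose_third_video(route, segment_videos):
    """Concatenate the reduced-quality ('second') video segments along a
    candidate route into one preview ('third') video, keeping a parallel
    list mapping each preview frame back to its geographic location."""
    frames, locations = [], []
    for a, b in zip(route, route[1:]):
        seg = segment_videos[(a, b)]  # second video for this road section
        frames.extend(seg["frames"])
        locations.extend(seg["locations"])
    return frames, locations

# Illustrative usage: a route S -> N1 -> N2 built from two segments.
segment_videos = {
    ("S", "N1"): {"frames": ["f0", "f1"], "locations": [(0, 0), (0, 1)]},
    ("N1", "N2"): {"frames": ["f2"], "locations": [(0, 2)]},
}
frames, locations = compose_third_video(["S", "N1", "N2"], segment_videos)

# Step (4)-(5): the user selects a frame; its location becomes the waypoint.
waypoint = locations[frames.index("f2")]
```

Keeping `frames` and `locations` index-aligned is what lets a purely visual choice ("I want to go where this image was taken") be converted into a routable coordinate.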

Description

Technical Field

[0001] The invention relates to the technical field of navigation, and in particular to a navigation method based on a real-scene three-dimensional map.

Background

[0002] Navigation is the process of guiding people or things from one geographic location to another; generally speaking, setting a destination is taken for granted.

[0003] In the prior art, there are navigation methods in which a desired target is selected within a certain range and then set as the destination; methods in which one of multiple routes is selected as the navigation path; and methods in which, after the destination is selected, a route is planned to pass through a chosen node. In addition, existing navigation methods are usually based on modes such as shortest distance or fastest time.

[0004] Existing navigation usually seldom considers the individual needs of...

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Patent (China)
IPC (8): G01C21/36
CPC: G01C21/3673; G01C21/3647
Inventor: 刘俊伟
Owner: TERRA DIGITAL CREATING SCI & TECH (BEIJING) CO LTD