
Navigation method and navigation terminal

A technology relating to a navigation terminal and navigation method, applied in fields such as road-network navigation. It addresses the problems that existing AR-information navigation is monotonous, that navigation information cannot be displayed adequately, and that the mobile terminal cannot select AR information within a set range, with the effects of improving the interactive experience, shortening selection time, and saving time.

Inactive Publication Date: 2017-02-08
SHENZHEN VIEW CULTURE TECH CO LTD

AI Technical Summary

Problems solved by technology

[0006] The purpose of the embodiments of the present invention is to provide a navigation method that solves the following problems in existing augmented reality solutions: the mobile terminal cannot select AR information within a set range, the navigation method is monotonous, and navigation information cannot be displayed three-dimensionally, dynamically, and comprehensively during navigation.



Examples


Embodiment 1

[0039] Referring to figure 1, which is an implementation flowchart of a navigation method provided by an embodiment of the present invention; the details are as follows:

[0040] In step S101, a real-scene image is captured, and the geographic coordinates of the current location and the range value manually selected for the electronic fence are obtained;

[0041] Wherein, the real-scene image is an image of the real scene captured by the lens in real time.

[0042] When the handheld terminal is a mobile terminal, the real-scene image is an image of the real scene captured in real time by the mobile terminal's lens.

[0043] For ease of description, taking a mobile terminal as an example: after the mobile terminal turns on its camera, the camera captures the real-scene image.

[0044] The mobile terminal obtains latitude, longitude, and altitude through GPS, obtains the direction the user is currently facing through the compass sensor, and obtains the ti...
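The inputs gathered in step S101 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the `GeoFix` structure, field names, and the `build_query` helper are all hypothetical, standing in for whatever request the terminal actually sends to the AR information server.

```python
from dataclasses import dataclass

@dataclass
class GeoFix:
    """One positioning sample (hypothetical structure)."""
    latitude: float    # degrees, from GPS
    longitude: float   # degrees, from GPS
    altitude: float    # metres, from GPS
    heading: float     # compass bearing in degrees, 0 = north

def build_query(fix: GeoFix, fence_radius_m: float) -> dict:
    """Assemble the parameters the terminal would submit to the AR
    information server: its own coordinates plus the user-selected
    electronic-fence range value."""
    return {
        "lat": fix.latitude,
        "lon": fix.longitude,
        "alt": fix.altitude,
        "heading": fix.heading,
        "radius_m": fence_radius_m,
    }

query = build_query(GeoFix(22.54, 114.06, 10.0, 90.0), 500.0)
print(query["radius_m"])  # 500.0
```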

Embodiment 2

[0077] Figure 2 is the implementation flowchart of step S105 of the navigation method provided by an embodiment of the present invention; the details are as follows:

[0078] S201: When the direction of the actual travel route changes, obtain the angle of the change;

[0079] S202: Control the navigation pet to perform a head-turning motion through that angle, changing the orientation of the navigation pet in real time.

[0080] In the embodiment of the present invention, controlling the navigation pet to perform the head-turning action intuitively indicates the direction of the actual travel route, reducing the time the user needs to determine the direction and improving navigation efficiency.
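Steps S201 and S202 amount to computing the signed change between two compass bearings and applying it to the pet's orientation. A minimal sketch, under the assumption that bearings are in degrees clockwise from north; the `NavigationPet` class is hypothetical and stands in for the skeleton-driven model described in the abstract.

```python
def heading_change(prev_bearing: float, new_bearing: float) -> float:
    """Signed smallest rotation (degrees) from prev to new bearing,
    in (-180, 180]; positive means clockwise."""
    delta = (new_bearing - prev_bearing) % 360.0
    if delta > 180.0:
        delta -= 360.0
    return delta

class NavigationPet:
    """Hypothetical stand-in for the rendered pet model."""
    def __init__(self, orientation: float = 0.0):
        self.orientation = orientation % 360.0

    def turn_head(self, angle: float) -> None:
        # In a real renderer this would also drive the head-turn animation
        # on the pet's skeleton (S202).
        self.orientation = (self.orientation + angle) % 360.0

pet = NavigationPet(orientation=90.0)
pet.turn_head(heading_change(90.0, 45.0))  # route bent 45 degrees counter-clockwise
print(pet.orientation)  # 45.0
```

Normalising the delta into (-180, 180] ensures the pet always turns through the smaller arc, e.g. from 350° to 10° it turns +20° rather than -340°.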

Embodiment 3

[0082] Figure 3 is the implementation flowchart of the navigation pet performing the in-place waiting action, provided by an embodiment of the present invention. The details are as follows:

[0083] S301: Detect whether the user's position along the actual travel route is in a stationary state;

[0084] S302: When the user's position is detected to be stationary, call the in-place waiting action;

[0085] S303: Control the navigation pet to perform the in-place waiting action until the stationary state ends.

[0086] In the embodiment of the present invention, controlling the navigation pet to perform the in-place waiting action intuitively reflects the position of the user along the actual travel route, improving display efficiency.
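The S301–S303 loop can be sketched as a small state machine: compare consecutive position samples, and switch the pet between a walking and a waiting animation. The 1-metre threshold, the planar-metre coordinates, and the `PetAnimator` class are all illustrative assumptions, not values from the patent.

```python
STATIONARY_EPS_M = 1.0  # assumed threshold: movement below this counts as stationary

def is_stationary(prev_pos, cur_pos, eps=STATIONARY_EPS_M) -> bool:
    """True if the user moved less than eps between samples.
    Positions are (x, y) in planar metres (an assumption for brevity)."""
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    return (dx * dx + dy * dy) ** 0.5 < eps

class PetAnimator:
    """Hypothetical animation controller for the navigation pet."""
    def __init__(self):
        self.action = "walk"

    def update(self, prev_pos, cur_pos) -> str:
        # S301-S303: while the user's position is static, play the
        # in-place waiting action; resume walking once movement returns.
        self.action = "wait_in_place" if is_stationary(prev_pos, cur_pos) else "walk"
        return self.action

anim = PetAnimator()
print(anim.update((0.0, 0.0), (0.2, 0.1)))  # wait_in_place
print(anim.update((0.0, 0.0), (5.0, 0.0)))  # walk
```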



Abstract

The invention provides a navigation method and a navigation terminal, applicable to the technical field of augmented reality. The method comprises the following steps: capturing real-scene images; retrieving the AR information within an electronic fence from an AR information server; superimposing the AR information over the real-scene images; generating a navigation path between the geographic location of a shop selected from the AR information and the terminal's own geographic location, and generating a navigation pet corresponding to the navigation path according to preset skeletons, UV maps, and specific motions of navigation pets; changing the orientation of the navigation pet in real time when the direction of the actual travel route changes; and, when the distance traveled along the actual route changes, adjusting the on-screen distance of the navigation pet accordingly while receiving, in real time, information pushed by a merchant terminal located at the shop's geographic location. The navigation pet and the user thus act on each other bidirectionally, forming an engaging and highly efficient mode of interaction.
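The electronic-fence step described above, selecting only the AR information within a set range, can be illustrated by filtering shop points by great-circle distance. This is a sketch under stated assumptions: the coordinates are made up, and the haversine formula is one common way to measure the distance, not necessarily what the patent specifies.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def in_fence(user, poi, radius_m):
    """True if a shop's AR information point lies inside the electronic fence."""
    return haversine_m(user[0], user[1], poi[0], poi[1]) <= radius_m

user = (22.5431, 114.0579)  # hypothetical terminal position (Shenzhen area)
shops = {"A": (22.5440, 114.0585), "B": (22.5600, 114.0900)}
visible = [name for name, pos in shops.items() if in_fence(user, pos, 500.0)]
print(visible)  # ['A']
```

Only the AR labels for shops inside the fence are then superimposed on the real-scene image, which keeps the overlay uncluttered and shortens the user's selection time, the first problem the patent identifies.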

Description

Technical field

[0001] The invention belongs to the field of augmented reality technology, and in particular relates to a navigation method and a navigation terminal.

Background technique

[0002] Augmented Reality (AR) is a technology developed on the basis of virtual reality. It superimposes virtual electronic information onto the real world, enhancing or extending real-world information to help people engage in various activities.

[0003] However, in prior-art augmented reality solutions, AR information is usually superimposed and displayed directly on the same layer, which causes the following two problems:

[0004] First: the mobile terminal cannot select AR information within a set range. When there is more AR information, the user's selection time increases and operating efficiency decreases.

[0005] Second: the navigation method i...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/34
CPC: G01C21/34
Inventor: 李旭升
Owner: SHENZHEN VIEW CULTURE TECH CO LTD