
Indoor live-action navigation method and system

An indoor navigation method and system, applied in the field of indoor positioning, which addresses problems such as pedestrians finding electronic navigation maps difficult to understand, becoming confused, and being unable to confirm their own position.

Status: Inactive · Publication Date: 2016-03-02
SHENZHEN UNIV
Cites: 13 · Cited by: 79

AI Technical Summary

Problems solved by technology

[0006] In view of the deficiencies in the prior art, the purpose of the present invention is to provide an indoor real-scene navigation method and system based on an intelligent mobile terminal, aiming at overcoming the problems of the traditional indoor navigation process in which pedestrians find the electronic navigation map difficult to understand, are easily confused, and cannot confirm their own position.




Embodiment Construction

[0047] In order to make the objects, technical solutions and effects of the present invention clearer, the present invention is further described in detail below. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it.

[0048] Figure 1 shows a flow chart of a preferred embodiment of the indoor real-scene navigation method provided by the present invention. As shown in Figure 1, the method includes:

[0049] Step S100, pre-collect indoor panoramic images and their position coordinates, establish and store the topological relationships between the indoor panoramic images, and generate an indoor panoramic image database;
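
The patent does not publish source code for this step; the following is a minimal Python sketch of how such a panoramic image database, with topological links between collection points, might be organised. All names (PanoNode, PanoDatabase, add_link, nearest) and the example coordinates are hypothetical, not taken from the patent.

```python
# Minimal sketch of an indoor panoramic image database with a topological
# graph between panoramas. Assumption: each panorama has an id, an image
# file path, and indoor (x, y) coordinates in metres.
import math
from dataclasses import dataclass, field

@dataclass
class PanoNode:
    pano_id: str
    image_path: str
    x: float                                      # indoor x coordinate (m)
    y: float                                      # indoor y coordinate (m)
    neighbours: set = field(default_factory=set)  # ids of adjacent panoramas

class PanoDatabase:
    def __init__(self):
        self.nodes = {}

    def add_pano(self, pano_id, image_path, x, y):
        self.nodes[pano_id] = PanoNode(pano_id, image_path, x, y)

    def add_link(self, id_a, id_b):
        """Store the topological relationship (walkable adjacency) between two panoramas."""
        self.nodes[id_a].neighbours.add(id_b)
        self.nodes[id_b].neighbours.add(id_a)

    def nearest(self, x, y):
        """Return the panorama whose collection point is closest to (x, y)."""
        return min(self.nodes.values(),
                   key=lambda n: math.hypot(n.x - x, n.y - y))

# Example: three panoramas collected along a corridor
db = PanoDatabase()
db.add_pano("p1", "corridor_01.jpg", 0.0, 0.0)
db.add_pano("p2", "corridor_02.jpg", 5.0, 0.0)
db.add_pano("p3", "corridor_03.jpg", 10.0, 0.0)
db.add_link("p1", "p2")
db.add_link("p2", "p3")
print(db.nearest(6.2, 0.4).pano_id)   # -> "p2"
```

Storing the adjacency explicitly lets later steps restrict matching and route guidance to panoramas that are actually reachable from the current one.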

[0050] Step S200, the mobile terminal collects a real-scene image of the current location, matches it with a first panoramic image stored in the indoor panoramic image database, and obtains the initial coordinates of the cur...
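
One way to realise the matching in step S200 is local feature matching; the sketch below uses ORB features from OpenCV and simply adopts the stored coordinates of the best-matching panorama as the initial position. The patent does not state which matching algorithm is used, so the choice of ORB, the PanoDatabase structure from the previous sketch, and all thresholds are assumptions.

```python
# Illustrative sketch of step S200: estimate the initial position by matching
# the photo taken on the mobile terminal against the stored panoramic images.
# Assumes the image files referenced by the database exist on disk.
import cv2

def initial_position(query_path, pano_db, min_matches=30):
    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    _, query_des = orb.detectAndCompute(query, None)

    best_node, best_score = None, 0
    for node in pano_db.nodes.values():                 # PanoDatabase from the sketch above
        pano = cv2.imread(node.image_path, cv2.IMREAD_GRAYSCALE)
        _, pano_des = orb.detectAndCompute(pano, None)
        if query_des is None or pano_des is None:
            continue
        matches = matcher.match(query_des, pano_des)
        good = [m for m in matches if m.distance < 40]  # crude descriptor-distance threshold
        if len(good) > best_score:
            best_node, best_score = node, len(good)

    if best_node is None or best_score < min_matches:
        return None                                     # matching failed
    return (best_node.x, best_node.y)                   # initial coordinates of the terminal
```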



Abstract

The invention discloses an indoor live-action navigation method and system. The method comprises the steps that a mobile terminal collects a live-action image of the current position, the live-action image is matched with a first live-action image stored in an indoor live-action image database, and an initial coordinate of the current position is obtained; the position of a pedestrian is positioned and tracked in real time through a multi-source sensor fusion positioning algorithm, and when the mobile terminal detects that the position of the user has changed, a real-time position result and a walking trajectory are output; a second live-action image in the indoor live-action image database that is closest to the indoor position result is obtained according to the real-time indoor position result; the mobile terminal obtains a starting point and an ending point input by the user during navigation, a forward direction is obtained, a direction-guiding arrow is added to the second live-action image, and live-action navigation is conducted. By means of the indoor live-action navigation method and system, live-action navigation images can be pushed automatically according to the position of the pedestrian, the optimal viewing angle of the navigation image can be judged automatically, and route guidance information such as arrows is overlaid on the navigation images.
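
The abstract mentions a multi-source sensor fusion positioning algorithm that tracks the pedestrian and outputs a walking trajectory, but this page does not detail that algorithm. As an illustration only, the sketch below uses simple pedestrian dead reckoning (step events plus heading, as typically obtained from a smartphone accelerometer and compass) anchored at the image-matched initial fix; the function name, interface and stride length are assumptions.

```python
# Hypothetical dead-reckoning sketch: propagate the position from the
# image-matched initial fix using one heading per detected step.
import math

def dead_reckon(initial_xy, step_headings, step_length=0.7):
    """initial_xy     : (x, y) from the panorama matching step
       step_headings  : iterable of headings in radians, one per detected step
       step_length    : assumed average stride in metres"""
    x, y = initial_xy
    trajectory = [(x, y)]
    for heading in step_headings:
        x += step_length * math.sin(heading)   # east component of the step
        y += step_length * math.cos(heading)   # north component of the step
        trajectory.append((x, y))
    return trajectory

# Example: five steps walking roughly north-east from the matched position
track = dead_reckon((6.2, 0.4), [math.radians(45)] * 5)
print(track[-1])
```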

Description

Technical Field

[0001] The invention relates to the technical field of indoor positioning, in particular to an indoor real-scene navigation method and system.

Background Art

[0002] With the development of smart mobile terminals and wireless network technologies, more and more people use smart phones for positioning and navigation. Faced with a wide range of pedestrian navigation needs, mobile terminal pedestrian navigation applications have received extensive attention from the mobile phone industry and from location-based services, and have become one of the key applications in the mobile phone industry.

[0003] Pedestrian navigation refers to the technology and methods that use electronic maps, live-action images and other guiding information to help pedestrians reach their destinations on the basis of real-time positioning. Current pedestrian navigation systems still mainly rely on flat maps for path planning and path guidance. According to the starting point an...


Application Information

IPC(8): G01C21/20
CPC: G01C21/206
Inventors: 张星 (Zhang Xing), 刘涛 (Liu Tao), 李清泉 (Li Qingquan)
Owner: SHENZHEN UNIV