
Augmented reality navigation method based on indoor natural scene image deep learning

A method in the field of augmented reality navigation based on deep learning of indoor natural scene images. It addresses the problems of existing approaches — artificial markers such as QR codes are easily damaged, augmented reality navigation is difficult to popularize, and positioning accuracy is low — achieving the effects of improved planning efficiency, scene features that are difficult to interfere with, and a strong sense of realism.

Active Publication Date: 2020-05-08
LUDONG UNIVERSITY
Cites: 15 · Cited by: 10

AI Technical Summary

Problems solved by technology

However, deep learning on natural scene images and indoor augmented reality navigation on smartphones remain difficult, open problems.
[0005] The indoor positioning and navigation technologies mentioned above either locate position through various kinds of signal sensing or require artificial markers to be arranged, and they suffer from unstable signal sources, low positioning accuracy, and high failure rates. Methods such as placing specific QR codes indoors also have poor real-time performance and poor interactivity. Moreover, markers such as QR codes are easily damaged by people, which increases the difficulty of registration and navigation.
In view of these factors, indoor augmented reality navigation based on natural scene images has been difficult to popularize.




Embodiment Construction

[0082] Specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.

[0083] In this embodiment, an eight-core smartphone with 6 GB of memory is used. The camera resolution is 1920×1080, and the camera's intrinsic parameters are calibrated in advance and assumed to remain constant. Feature points of the indoor natural scene appearing in the phone's camera are identified, tracked, and registered.
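The pre-calibrated intrinsic parameters mentioned above are conventionally represented by a 3×3 pinhole camera matrix K. The sketch below is illustrative only: the focal lengths and principal point are assumed values for a 1920×1080 camera, not the patent's actual calibration results.

```python
import numpy as np

def intrinsic_matrix(fx, fy, cx, cy):
    """Build the 3x3 pinhole camera intrinsic matrix K."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

def project(K, point_cam):
    """Project a 3D point in camera coordinates to pixel coordinates."""
    p = K @ point_cam          # homogeneous image coordinates
    return p[:2] / p[2]        # perspective divide

# Illustrative intrinsics for a 1920x1080 camera (assumed values).
K = intrinsic_matrix(fx=1400.0, fy=1400.0, cx=960.0, cy=540.0)

# A point on the optical axis projects to the principal point.
u, v = project(K, np.array([0.0, 0.0, 2.0]))
print(u, v)  # -> 960.0 540.0
```

In practice the intrinsics would come from an offline calibration routine (e.g. a checkerboard calibration), which is why the embodiment can treat them as fixed.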

[0084] As shown in Figure 1, the augmented reality navigation method based on deep learning of indoor natural scene images includes the following steps:

[0085] Step 1: As shown in Figure 2, the flow chart for establishing the 3D scene feature recognition point information library of the indoor natural scene, and according to the basic principle of 3D reconstruction of indoor natural scenes from 3D scene feature recognition points, a 3D laser scanner is used to scan the indoor...
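The abstract describes binding 2D image feature recognition points to the 3D scene feature recognition points built in this step "through specific descriptors." A minimal sketch of such a binding is a nearest-descriptor lookup; the array names, descriptor values, and the L2 metric below are all assumptions for illustration, not the patent's actual descriptors.

```python
import numpy as np

# Hypothetical 3D feature library: each scene point stores a descriptor.
scene_points = np.array([[0.0, 0.0, 5.0],
                         [1.0, 2.0, 4.0],
                         [3.0, 1.0, 6.0]])
scene_desc = np.array([[0.9, 0.1],
                       [0.1, 0.9],
                       [0.5, 0.5]])

def bind_2d_to_3d(img_desc, scene_desc, scene_points):
    """Map each 2D image descriptor to the 3D point with the nearest descriptor (L2)."""
    # Pairwise distances: rows = image features, columns = scene features.
    d = np.linalg.norm(img_desc[:, None, :] - scene_desc[None, :, :], axis=2)
    return scene_points[np.argmin(d, axis=1)]

img_desc = np.array([[0.85, 0.15],   # close to the first scene descriptor
                     [0.15, 0.85]])  # close to the second
matched = bind_2d_to_3d(img_desc, scene_desc, scene_points)
print(matched)  # first image feature binds to point [0,0,5], second to [1,2,4]
```

A real system would use robust descriptors (with ratio tests and outlier rejection); this sketch only shows the shape of the 2D-to-3D mapping.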



Abstract

The invention discloses an augmented reality navigation method based on deep learning of indoor natural scene images. The method comprises the following steps: first, scanning an indoor natural scene with a three-dimensional laser scanner to extract three-dimensional scene feature recognition points; calculating the intrinsic parameter matrix of a smartphone camera; collecting indoor natural scene images with the smartphone to extract two-dimensional image feature recognition points; and establishing a topological network structure chart of the indoor natural scene from an indoor planar map. The two-dimensional image feature recognition points, the three-dimensional scene feature recognition points, and the topological network path nodes are bound and mapped through specific descriptors. Deep-learning-based image classification is applied to the indoor natural scene images acquired by the smartphone, segmenting the indoor natural scene into a number of sub-scenes. The three-dimensional scene feature recognition points are then tracked and recovered with an optical flow tracking algorithm, the three-dimensional registration matrix required for scene registration is synthesized, and finally the registration of virtual navigation objects into the real scene is completed, achieving indoor natural scene path navigation.
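The "three-dimensional registration matrix" in the abstract is, in standard AR registration, the composition of the camera intrinsics K with an extrinsic pose [R | t]. The sketch below shows that composition and how a virtual object's world coordinate lands on the image; all numeric values are illustrative assumptions, not the patent's data.

```python
import numpy as np

def registration_matrix(K, R, t):
    """3x4 projection matrix P = K [R | t] used to place virtual objects in the image."""
    return K @ np.hstack([R, t.reshape(3, 1)])

# Illustrative intrinsics (assumed) for a 1920x1080 camera.
K = np.array([[1400.0, 0.0, 960.0],
              [0.0, 1400.0, 540.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                   # identity rotation: camera axes aligned with world
t = np.array([0.0, 0.0, 2.0])   # world origin lies 2 m in front of the camera

P = registration_matrix(K, R, t)

# Project the world origin (homogeneous coordinates) into the image.
p = P @ np.array([0.0, 0.0, 0.0, 1.0])
u, v = p[:2] / p[2]
print(u, v)  # -> 960.0 540.0 (the principal point)
```

In the method described, R and t would be recovered per frame from the tracked 2D-3D correspondences (e.g. by a PnP-style pose solver), and P then anchors the virtual navigation objects to the real scene.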

Description

Technical field:

[0001] The invention relates to augmented reality and indoor navigation technology, belongs to the field combining computer vision and augmented reality, and in particular relates to an augmented reality navigation method based on deep learning of indoor natural scene images.

Background technique:

[0002] Augmented Reality (AR) has been a research hot spot in recent years and has wide application prospects. By combining virtual objects with the real environment, it can enhance people's cognition of their surroundings. The characteristics of augmented reality are the combination of the virtual and the real, real-time interaction, and tracking registration. It superimposes computer-generated information (images, models, animations, etc.) on real scenes to achieve seamless integration of the virtual and the real; unlike the fully virtual world of VR, augmented reality is a supplement to the real environment...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06T19/00, G06K9/62, G06N3/04, G06N3/08, G01B11/00, G01B11/24, G01C21/20
CPC: G06T19/006, G06N3/08, G01C21/206, G01B11/24, G01B11/002, G06V20/20, G06N3/045, G06F18/2135, G06F18/24
Inventors: 曹兴文吴孟泉陀名熠张文良刘韦韦伯英杰廖宗钰周卉林孙嘉欣张聪颖赵紫琦宁祥雨唐浩晨
Owner: LUDONG UNIVERSITY