
Indoor positioning method and device based on inertial data and visual features

An indoor positioning technology based on visual features, applicable to measuring devices, instruments, surveying and mapping, and navigation; it achieves high tracking accuracy with a structure that is simple and easy to produce.

Active Publication Date: 2017-10-17
青岛海通胜行智能科技有限公司

Problems solved by technology

[0003] To solve the above problems, the present invention proposes an indoor positioning method and device based on inertial data and visual features. Compared with purely visual camera tracking, it offers higher robustness, a wider range of application, and smoother three-dimensional motion. The system structure is simple, and for a small increase in cost the anti-interference ability of the system is greatly improved; it remains usable in scenes where purely visual tracking cannot be performed, such as regions with little texture, high-speed motion, and camera jitter.




Embodiment Construction

[0043] The technical solutions in the embodiments of the present invention are described clearly and completely below. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained from them by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0044] The core of the present invention is an indoor positioning method and device based on inertial data and visual features. Compared with purely visual camera tracking, it offers higher robustness, a wider range of application, and smoother three-dimensional motion. The system structure is simple, and for a small increase in cost the anti-interference ability of the system is greatly improved; it remains usable in scenes where purely visual tracking cannot be performed, such as regions with little texture, high-speed motion, and camera jitter.



Abstract

The invention discloses an indoor positioning technology based on inertial data and visual features, together with a corresponding device that implements its steps. The technology comprises: (1) multi-sensor data processing: camera calibration and image feature extraction, and IMU data modeling and filtering; (2) multi-sensor coordinate-system calibration: system modeling, relative attitude calibration, and joint calibration of relative position and attitude; (3) an indoor positioning and tracking technique that fuses the inertial data with the visual features. Traditional single-camera tracking relies on a simple constant-velocity motion assumption; by contrast, the IMU's inertial data provide a better motion prediction, so the search area during feature matching is smaller, matching is faster, tracking results are more accurate, and camera-tracking robustness in degraded-image and texture-poor regions is greatly improved.
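The IMU-aided prediction described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the function names, the pinhole intrinsics, the small-angle rotation update, and the motion values are all invented for the example. The idea shown is only the general one the abstract names: integrate an IMU sample to predict the next camera pose, project a known landmark with that predicted pose, and then search for the matching feature only in a small window around the projection instead of the whole frame.

```python
# Hypothetical sketch of IMU-aided feature-search prediction.
# All names and values here are illustrative, not from the patent.
import numpy as np

def integrate_imu(p, v, R, accel, gyro, dt):
    """Propagate position p, velocity v, rotation R with one IMU sample.

    Uses a small-angle approximation for the gyro increment and assumes
    gravity has already been removed from the accelerometer reading.
    """
    w = gyro * dt
    # Skew-symmetric matrix of the rotation increment w.
    W = np.array([[0.0, -w[2], w[1]],
                  [w[2], 0.0, -w[0]],
                  [-w[1], w[0], 0.0]])
    R_new = R @ (np.eye(3) + W)
    a_world = R @ accel
    v_new = v + a_world * dt
    p_new = p + v * dt + 0.5 * a_world * dt**2
    return p_new, v_new, R_new

def project(K, R, p, landmark):
    """Pinhole projection of a world-frame landmark into the camera image."""
    pc = R.T @ (landmark - p)      # world frame -> camera frame
    uvw = K @ pc
    return uvw[:2] / uvw[2]        # homogeneous -> pixel coordinates

# Example: predict where a landmark should appear after 10 ms of motion,
# then restrict feature matching to a small window around that prediction.
K = np.array([[500.0, 0.0, 320.0],    # assumed pinhole intrinsics
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
p, v, R = np.zeros(3), np.array([1.0, 0.0, 0.0]), np.eye(3)
landmark = np.array([0.0, 0.0, 5.0])  # 5 m straight ahead

p, v, R = integrate_imu(p, v, R, accel=np.zeros(3),
                        gyro=np.zeros(3), dt=0.01)
u_pred = project(K, R, p, landmark)
search_radius = 15  # pixels; far smaller than a full-frame search
```

With a good inertial prediction, `u_pred` lands near the true feature location, so the matcher only scans a `search_radius`-sized window, which is the mechanism behind the smaller search area and faster matching claimed above.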

Description

technical field

[0001] The present invention relates to an indoor positioning method and device, in particular to an indoor positioning method and device based on inertial data and visual features.

Background technique

[0002] In research on mobile-robot technology, navigation is the core capability and the key to realizing intelligent, autonomous movement. Robot navigation based on visual sensors is currently a common approach both in China and abroad. As application scenarios grow more complex and navigation-accuracy requirements rise, visual sensors alone struggle to meet the demands of processing speed and system robustness: in scenes with little texture, or when high-speed motion or shaking blurs the image, effective feature tracking cannot be performed and visual navigation performs poorly. With the emergence of low-cost...


Application Information

IPC(8): G01C21/16, G01C21/20
CPC: G01C21/165, G01C21/206
Inventors: 安洪强, 孙福斋, 辛悦吉
Owner: 青岛海通胜行智能科技有限公司