Attitude and position estimation method based on vision and inertia information

A vision-and-inertia technology, applied in the field of attitude and position estimation based on visual and inertial information. It addresses the problems of estimation drift, difficult zero-velocity detection, and slow device motion, and achieves a small computational load, low resource requirements, and good robustness.

Active Publication Date: 2015-04-08
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

Due to the limitation of sensor accuracy, the former method tends to accumulate error, causing the position estimate to drift. This is more pronounced when the device moves slowly, and zero-velocity detection in particular remains a difficult problem.
The second class of methods is based on visual information and uses multi-view geometry to recover the position of the visual terminal relative to the scene. These methods do not accumulate error, but, limited to visual information alone, they cannot recover the metric scale of the scene and can only estimate relative pose. They also require the terminal to move slowly; otherwise the lack of matching feature points between image frames causes large estimation errors or even loss of position.
The main advantage of this type of method is that the inertial sensor is used to predict the device's motion and speed up image feature matching, but this alone cannot greatly improve the final accuracy.




Embodiment Construction

[0026] To describe the present invention in more detail, its technical solution is explained below with reference to the accompanying drawings and specific embodiments.

[0027] The method of the present invention is implemented on a smartphone, using the phone's camera and sensor data to calculate position and attitude. As shown in Figure 1, the specific implementation is as follows:

[0028] A. Jointly calibrate the camera and the inertial sensor on the device to obtain the camera's intrinsic and distortion parameters, and at the same time obtain the relative pose between the sensor and the camera in the world coordinate system, so that the sensor and the camera are aligned in that coordinate system.
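To make the role of the intrinsic and distortion parameters from step A concrete, here is a minimal sketch of the standard pinhole-plus-radial-distortion projection model. All numeric values (focal lengths, principal point, distortion coefficients) are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Hypothetical intrinsics and distortion parameters, as recovered by a
# joint camera/sensor calibration (values are illustrative only).
fx, fy = 800.0, 800.0      # focal lengths in pixels
cx, cy = 320.0, 240.0      # principal point
k1, k2 = -0.12, 0.03       # radial distortion coefficients

def project(point_cam):
    """Project a 3-D point in the camera frame to pixel coordinates,
    applying the radial distortion model that calibration recovers."""
    x, y, z = point_cam
    xn, yn = x / z, y / z               # normalized image coordinates
    r2 = xn * xn + yn * yn
    d = 1.0 + k1 * r2 + k2 * r2 * r2    # radial distortion factor
    u = fx * xn * d + cx
    v = fy * yn * d + cy
    return u, v

u, v = project(np.array([0.1, -0.05, 2.0]))
```

In practice the parameters would be estimated with a checkerboard-based routine (e.g. OpenCV's `calibrateCamera`) rather than set by hand; the sketch only shows how they enter the projection.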

[0029] B. Set the origin of the world coordinate system at the initial position of the camera, set the initial state vector of the extended Kalman filter (EKF) in the initial state, set...



Abstract

The invention discloses an attitude and position estimation method based on vision and inertia information. The method fuses image and inertial information, saving and updating the motion state with an extended Kalman filter so as to compute the current attitude and position of the device, from which the device's motion trajectory over a period of time can be obtained. The method uses image and sensor information flexibly, so that the two sources complement each other to achieve good robustness and avoid tracking loss. The extended Kalman filter computes in an iterative form: compared with batch computation, it does not need to collect all observations before computing, has a relatively small computational load, suits terminal devices with limited computing resources, and can better meet real-time requirements.

Description

Technical field

[0001] The invention belongs to the technical field of device attitude and position tracking, and specifically relates to an attitude and position estimation method based on vision and inertial information.

Background technique

[0002] At present, two main positioning methods are commonly used on terminal devices: one uses inertial units, i.e., obtains the device position from gyroscope, gravity sensor, and accelerometer data; the other processes the visual information collected by the camera to obtain the device's pose. Due to the limitation of sensor accuracy, the former method tends to accumulate error and cause the position estimate to drift, which is more pronounced when the device moves slowly; zero-velocity detection in particular remains a difficult problem. The second method is based on visual information and uses the multi-view geometry method to recover the position of the visual te...
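The error accumulation of the purely inertial approach discussed in the background can be demonstrated numerically: a small constant accelerometer bias, double-integrated, produces a position drift that grows quadratically with time. The bias and noise values below are illustrative assumptions, not measurements from the patent.

```python
import numpy as np

# Illustration of inertial dead-reckoning drift: a stationary device
# with a small accelerometer bias appears to travel several meters
# after only 10 seconds of double integration.
rng = np.random.default_rng(0)
dt, n = 0.01, 1000                       # 10 s of samples at 100 Hz
bias = 0.05                              # assumed 0.05 m/s^2 sensor bias
true_accel = np.zeros(n)                 # device is actually stationary
measured = true_accel + bias + 0.01 * rng.standard_normal(n)

v = np.cumsum(measured) * dt             # integrate once -> velocity
p = np.cumsum(v) * dt                    # integrate twice -> position
# Drift grows roughly as 0.5 * bias * t^2, i.e. about 2.5 m at t = 10 s,
# even though the device never moved.
```

This quadratic growth is exactly why, in the patent's method, visual observations are fused via the EKF to bound the inertial error rather than relying on integration alone.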


Application Information

IPC(8): G01C21/20
CPC: G01C21/165; G01C21/20
Inventor: 林城, 王梁昊, 李东晓, 张明
Owner: ZHEJIANG UNIV