
A Vision/Inertial Integrated Navigation Method Based on Online Calibration of Camera Intrinsic Parameters

A visual/inertial integrated navigation technology, applied in fields such as navigation, mapping, and navigation by speed/acceleration measurement. It addresses the loss of navigation-system accuracy caused by changes in the camera intrinsic parameters.

Active Publication Date: 2021-11-30
NANJING UNIV OF AERONAUTICS & ASTRONAUTICS

AI Technical Summary

Problems solved by technology

[0004] To solve the technical problems described in the background above, the present invention proposes a visual/inertial integrated navigation method based on online calibration of camera intrinsic parameters, addressing the loss of navigation-system accuracy caused by changes in the camera intrinsic parameters.




Embodiment Construction

[0058] The technical solutions of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0059] The present invention provides a visual/inertial integrated navigation method based on online calibration of camera intrinsic parameters. As shown in Figure 1, the steps are as follows:

[0060] Step 1: Collect the visual sensor data S(k), accelerometer data, and gyroscope data at time k;

[0061] Step 2: Use the visual sensor data S(k) to perform feature detection and matching between two adjacent image frames;
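
The patent does not specify a particular detector or matcher for Step 2; a minimal sketch of frame-to-frame matching, assuming generic float descriptors and using nearest-neighbour search with Lowe's ratio test (the function name and toy data are illustrative, not from the patent):

```python
import numpy as np

def match_features(desc_a, desc_b, ratio=0.8):
    """Nearest-neighbour descriptor matching with a ratio test.

    desc_a, desc_b: (N, D) arrays of feature descriptors from two
    adjacent image frames.  Returns a list of (i, j) index pairs.
    """
    matches = []
    for i, d in enumerate(desc_a):
        # Euclidean distance to every descriptor in the second frame.
        dist = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dist)
        best, second = order[0], order[1]
        # Accept only matches clearly better than the runner-up.
        if dist[best] < ratio * dist[second]:
            matches.append((i, int(best)))
    return matches

# Toy example: frame-B descriptors are noisy copies of frame-A's.
rng = np.random.default_rng(0)
desc_a = rng.normal(size=(5, 32))
desc_b = desc_a + rng.normal(scale=0.01, size=(5, 32))
print(match_features(desc_a, desc_b))  # → [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
```

In a real system the descriptors would come from a feature extractor run on S(k-1) and S(k), and the matches would typically be filtered further with a geometric check (e.g. RANSAC).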

[0062] Step 3: Use the inertial sensor data to perform pre-integration between two adjacent image frames;
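
The idea of Step 3 can be sketched as accumulating the IMU samples that arrive between two image frames into relative rotation, velocity, and position increments. This is a bias-free, Euler-integration sketch only; the patent's actual pre-integration formulation (bias terms, noise covariances) is not reproduced here:

```python
import numpy as np

def preintegrate(gyro, accel, dt):
    """Accumulate IMU samples between two image frames.

    gyro, accel: (N, 3) angular rate [rad/s] and specific force
    [m/s^2] samples; dt: sample period [s].  Returns the
    pre-integrated rotation (as a matrix), velocity increment, and
    position increment in the frame of the first sample.
    """
    R = np.eye(3)      # delta rotation
    dv = np.zeros(3)   # delta velocity
    dp = np.zeros(3)   # delta position
    for w, a in zip(gyro, accel):
        dp += dv * dt + 0.5 * (R @ a) * dt**2
        dv += (R @ a) * dt
        # First-order rotation update from the gyro sample.
        wx = np.array([[0.0, -w[2], w[1]],
                       [w[2], 0.0, -w[0]],
                       [-w[1], w[0], 0.0]])
        R = R @ (np.eye(3) + wx * dt)
    return R, dv, dp

# 100 samples at 100 Hz, constant 1 m/s^2 along x, no rotation:
R, dv, dp = preintegrate(np.zeros((100, 3)),
                         np.tile([1.0, 0.0, 0.0], (100, 1)), 0.01)
print(dv, dp)  # dv ≈ [1, 0, 0] m/s, dp ≈ [0.5, 0, 0] m
```

Because the increments depend only on the IMU samples (not on the global pose), they can be computed once per frame pair and reused across optimization iterations, which is the point of pre-integration.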

[0063] Step 4: Combine the visual reprojection error and the inertial pre-integration error to jointly optimize the carrier navigation information and the camera intrinsic parameters;
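
The distinctive part of Step 4 is that the camera intrinsics are optimization variables rather than fixed constants. A toy stand-in, assuming a pinhole model and refining only the focal length by Gauss-Newton on the reprojection residuals with the pose held fixed (the full method would also optimize pose, velocity, and the remaining intrinsics, and add the inertial residuals):

```python
import numpy as np

def refine_focal(points_3d, pixels, f0, cx, cy, iters=10):
    """Gauss-Newton refinement of the focal length f from
    reprojection residuals.

    points_3d: (N, 3) landmarks in the camera frame (z > 0);
    pixels: (N, 2) their observed image coordinates;
    f0: initial focal length; cx, cy: fixed principal point.
    """
    f = f0
    x, y, z = points_3d.T
    for _ in range(iters):
        # Pinhole projection with the current focal length.
        proj = np.stack([f * x / z + cx, f * y / z + cy], axis=1)
        r = (proj - pixels).ravel()                   # residuals
        J = np.stack([x / z, y / z], axis=1).ravel()  # d(proj)/d(f)
        f -= (J @ r) / (J @ J)                        # 1-D GN step
    return f

# Synthetic observations generated with true f = 500.
rng = np.random.default_rng(1)
pts = rng.uniform([-1, -1, 2], [1, 1, 6], size=(50, 3))
obs = np.stack([500 * pts[:, 0] / pts[:, 2] + 320,
                500 * pts[:, 1] / pts[:, 2] + 240], axis=1)
print(round(refine_focal(pts, obs, f0=400.0, cx=320, cy=240)))  # → 500
```

Because the residual is linear in f here, a single Gauss-Newton step recovers the true value; in the joint visual/inertial problem the system is nonlinear and the inertial pre-integration error constrains the poses while the reprojection error constrains both poses and intrinsics.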

[0064] Step 5: Output the carrier navigation information and the camera intrinsic parameters, then return to Step 1.

[0065] In this embodiment, t...



Abstract

The invention discloses a visual/inertial integrated navigation method based on online calibration of camera intrinsic parameters. The steps are: collect the visual sensor data S(k), accelerometer data, and gyroscope data at time k; use the visual sensor data S(k) to perform feature detection and matching between two adjacent image frames; use the inertial sensor data to perform pre-integration between two adjacent image frames; combine the visual reprojection error and the inertial pre-integration error to jointly optimize the carrier navigation information and the camera intrinsic parameters; and output the carrier navigation information and the camera intrinsics. The invention performs camera intrinsic calibration within the visual/inertial navigation framework and effectively addresses the loss of navigation-system accuracy caused by changes in the camera intrinsic parameters.

Description

Technical field

[0001] The invention belongs to the field of robot navigation, and in particular relates to a visual/inertial integrated navigation method.

Background technique

[0002] The visual/inertial integrated navigation system has become a research hotspot in the field of autonomous robot navigation because of its good robustness. The visual sensor can suppress the drift of the inertial sensor, and the inertial sensor compensates for the visual sensor's inability to work in environments with little texture or insufficient light. The visual/inertial integrated navigation system therefore has broad development prospects.

[0003] Most current visual/inertial integrated navigation methods treat the camera intrinsic parameters as fixed during operation, obtaining them in advance through traditional offline calibration methods. In practice, however, the camera intrinsic parameters may change due to mechanical shock or ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G01C21/16; G01C21/00
CPC: G01C21/00; G01C21/165
Inventor: 杨子寒, 赖际舟, 吕品, 刘建业, 袁诚
Owner NANJING UNIV OF AERONAUTICS & ASTRONAUTICS