
Monocular vision inertial combination positioning navigation method

A monocular vision and inertial combination technology, applied in the field of positioning and navigation, which can solve problems such as uncertain scale, susceptibility to environmental conditions, and divergent accuracy.

Status: Inactive
Publication Date: 2020-01-17
北京维盛泰科科技有限公司


Problems solved by technology

Monocular cameras are low in price and power consumption and can provide rich environmental information, but a purely monocular visual navigation method suffers from scale ambiguity and is easily affected by environmental conditions. Inertial sensors provide self-motion information and are not easily affected by occlusion or lighting; however, a purely inertial navigation method is affected by its own drift and accumulated errors, and it only has good accuracy near the initialization time. As time increases, its accuracy gradually diverges.
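
For intuition about why pure inertial accuracy diverges, the short sketch below (illustrative only, not taken from the patent) integrates a small constant accelerometer bias twice; the resulting position error grows roughly as 0.5·b·t². The bias value and sample rate are assumed numbers.

    import numpy as np

    # Illustrative drift model: a constant accelerometer bias b (m/s^2),
    # integrated twice to obtain position, yields an error of about 0.5*b*t^2.
    dt = 0.005                    # 200 Hz IMU sample period (assumed)
    bias = 0.02                   # 0.02 m/s^2 residual accelerometer bias (assumed)
    t = np.arange(0.0, 60.0, dt)  # one minute of data

    velocity_error = np.cumsum(np.full_like(t, bias)) * dt  # first integration
    position_error = np.cumsum(velocity_error) * dt         # second integration

    idx_10s = int(round(10.0 / dt))
    print(f"position error after 10 s: {position_error[idx_10s]:.2f} m")  # ~1 m
    print(f"position error after 60 s: {position_error[-1]:.2f} m")       # ~36 m

Even a bias of 0.02 m/s^2 therefore produces tens of meters of position error within a minute, which is the divergence the combined method is designed to suppress.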




Detailed Description of the Embodiments

[0065] To make the objects, technical solutions and advantages of the present invention clearer, the technical solutions of the present invention are described clearly and completely below in conjunction with specific embodiments of the present invention. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0066] In a specific embodiment of the present invention, as shown in Figures 1-4, the present invention provides a monocular visual-inertial combination positioning and navigation method, including steps 1 to 6.

[0067] Step 1. Collect the video stream through the monocular camera; collect the IMU data stream through the IMU.

[0068] Step 2: Align the collected video stream and IMU data stream ...
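
A minimal sketch of how such alignment could be organized, assuming both streams carry timestamps from a common clock: each camera frame is packaged together with the IMU samples recorded since the previous frame (or the start of the stream). The data classes and function names below are illustrative placeholders, not taken from the patent.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class ImuSample:
        t: float                           # timestamp in seconds
        accel: Tuple[float, float, float]  # (ax, ay, az) in m/s^2
        gyro: Tuple[float, float, float]   # (gx, gy, gz) in rad/s

    @dataclass
    class Frame:
        t: float                           # image timestamp in seconds
        image: object                      # decoded image, e.g. a numpy array

    @dataclass
    class FramePacket:
        frame: Frame
        imu: List[ImuSample] = field(default_factory=list)

    def package_streams(frames: List[Frame], imu: List[ImuSample]) -> List[FramePacket]:
        """Bundle every frame with the IMU samples whose timestamps fall between
        the previous frame (or stream start) and this one; inputs sorted by time."""
        packets, i = [], 0
        for f in frames:
            pkt = FramePacket(frame=f)
            while i < len(imu) and imu[i].t <= f.t:
                pkt.imu.append(imu[i])
                i += 1
            packets.append(pkt)
        return packets

The per-frame IMU bundles produced this way are what a later pre-integration step can consume.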



Abstract

The invention provides a monocular visual-inertial combination positioning and navigation method. The method comprises the following steps: acquiring a video stream and an IMU data stream, and packaging and aligning the two streams; initializing the video stream and the IMU data stream, wherein the initialization comprises initializing vision, initializing the IMU, determining the conversion relationship between the vision coordinate system and the IMU world coordinate system, carrying out nonlinear optimization to determine an initial scale value, and refining the scale estimate using a lambda I EKF; acquiring inertial navigation data from the IMU data stream, obtaining the IMU pose through a combination of complementary filtering and pre-integration, tracking image features in the video stream, and obtaining the vision pose with reference to the IMU pose variation; and determining whether the vision tracking is lost, carrying out motion tracking with the IMU pose if it is lost, and fusing the vision pose and the IMU pose through an IDSF technique if it is not, to obtain the final camera pose. The method addresses the shortcomings of the prior art, improves scale precision, and achieves high precision and high robustness in positioning.
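
One building block named in the abstract is complementary filtering of the IMU data. The sketch below is a generic, textbook complementary filter for roll and pitch that blends integrated gyroscope rates with accelerometer tilt; it is shown only to illustrate the idea and is not the patent's specific implementation (the blending factor alpha and the sample period dt are assumed).

    import numpy as np

    def complementary_filter(gyro, accel, dt, alpha=0.98):
        """Estimate roll and pitch by blending integrated gyro rates (accurate
        short-term, but drifting) with accelerometer tilt (noisy, but drift-free).
        gyro: iterable of (gx, gy, gz) in rad/s; accel: iterable of (ax, ay, az) in m/s^2."""
        roll, pitch, out = 0.0, 0.0, []
        for (gx, gy, _gz), (ax, ay, az) in zip(gyro, accel):
            # Propagate with the gyro over one sample period.
            roll_g, pitch_g = roll + gx * dt, pitch + gy * dt
            # Absolute tilt from the gravity direction measured by the accelerometer.
            roll_a = np.arctan2(ay, az)
            pitch_a = np.arctan2(-ax, np.hypot(ay, az))
            # Blend: trust the gyro at high frequency, the accelerometer at low frequency.
            roll = alpha * roll_g + (1.0 - alpha) * roll_a
            pitch = alpha * pitch_g + (1.0 - alpha) * pitch_a
            out.append((roll, pitch))
        return out

    # Tiny usage example: a stationary IMU (gravity along +z, no rotation).
    gyro = [(0.0, 0.0, 0.0)] * 200    # rad/s, 200 samples at 200 Hz
    accel = [(0.0, 0.0, 9.81)] * 200  # m/s^2
    print(complementary_filter(gyro, accel, dt=0.005)[-1])  # stays near (0.0, 0.0)

In the described pipeline, this kind of orientation estimate is combined with IMU pre-integration to provide the IMU pose that guides feature tracking and serves as the fallback when visual tracking is lost.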

Description

Technical field

[0001] The invention relates to the technical field of positioning and navigation, and in particular to a positioning and navigation method based on a monocular visual-inertial combination.

Background technique

[0002] In recent years, with the development of emerging technologies such as self-driving cars, augmented reality, virtual reality, and robot and drone navigation, integrated navigation has gradually become a popular research direction in the field of positioning and navigation. Among the different combinations of navigation sensors, the monocular visual-inertial combination provides a cheap and promising solution. Monocular cameras are low in price and power consumption and can provide rich environmental information, but a purely monocular visual navigation method suffers from scale ambiguity and is easily affected by environmental conditions. Inertial sensors provide self-motion information and are not easily affected by occlusion or lighting; however, a purely inertial navigation method is affected by its own drift and accumulated errors, has good accuracy only near the initialization time, and its accuracy gradually diverges as time increases ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/16; G01C21/00
CPC: G01C21/005; G01C21/165
Inventor: 康健, 王志军, 冯辰, 于艺晗
Owner: 北京维盛泰科科技有限公司