
Scene three-dimensional data registration method and navigation system error correction method

A technology relating to three-dimensional scene data, applied in the field of navigation systems. It addresses the problems of error accumulation and growing positioning error, so as to ensure the registration success rate, improve accuracy, and eliminate the effect of data mismatch.

Publication Date: 2017-05-24 (Inactive)
苏州中德睿博智能科技有限公司

AI Technical Summary

Problems solved by technology

Navigation systems with inertial sensors can provide accurate short-term estimates of position, velocity, acceleration, attitude, and angular rate, but sensor errors accumulate, so the positioning error grows over time.
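As a rough, made-up numerical illustration of this drift (not taken from the patent), a constant accelerometer bias that is double-integrated by dead reckoning produces a position error that grows roughly quadratically with time; the bias value and rates below are assumptions chosen only for the example.

```python
import numpy as np

# Hypothetical constant accelerometer bias (m/s^2); the value is illustrative only.
bias = 0.01
dt = 0.01                          # 100 Hz integration step
t = np.arange(0.0, 60.0, dt)       # one minute of pure inertial dead reckoning

# Double integration: velocity error grows linearly, position error quadratically.
vel_err = np.cumsum(np.full_like(t, bias) * dt)
pos_err = np.cumsum(vel_err * dt)

i10 = int(round(10.0 / dt)) - 1
print(f"position error after 10 s: {pos_err[i10]:.2f} m")   # ~0.5 m
print(f"position error after 60 s: {pos_err[-1]:.2f} m")    # ~0.5 * bias * t^2 = 18 m
```

This is the growth that step 4) of the embodiment below counteracts by periodically correcting the inertial estimate with the visual sensor.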


Examples


Embodiment

[0036] Referring to Figure 1, this embodiment shows a method for scene three-dimensional data registration.

[0037] The method includes the following steps:

[0038] 1): The inertial sensor and the visual sensor are spatially calibrated and time-synchronized;

[0039] 2): The inertial sensor outputs pose information at two adjacent moments and transmits it to the visual sensor; the visual sensor performs data registration on two adjacent frames of data based on the pose information provided by the inertial sensor;

[0040] 3): The pose change of the visual sensor is calculated from the visual sensor's data registration;

[0041] 4): The inertial sensor is corrected in real time using the pose change of the visual sensor;

[0042] 5): After the data registration performed by the visual sensor, the data is registered into the 3D map. (A hedged numerical sketch of this pipeline is given after this list.)
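The following is a minimal sketch of steps 2) to 5), not the patent's own implementation: poses are represented as 4x4 homogeneous transforms, and the frame-to-frame registration, which the patent does not tie to a specific algorithm, is stood in for by a closed-form Kabsch alignment on synthetic, already-corresponded points. All function names, numeric values, and frame conventions are our assumptions.

```python
import numpy as np

def se3(R, t):
    """Pack a rotation matrix and a translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def transform(T, pts):
    """Apply a 4x4 transform to an (N, 3) array of points."""
    return pts @ T[:3, :3].T + T[:3, 3]

def kabsch(src, dst):
    """Closed-form rigid alignment of already-corresponded points; a stand-in
    for the patent's unspecified frame-to-frame registration."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return se3(R, dc - R @ sc)

rng = np.random.default_rng(0)
prev_pts = rng.uniform(-2.0, 2.0, (200, 3))            # frame k-1, in its own camera frame

# Ground-truth relative pose between the two adjacent frames (unknown to the pipeline):
# p_{k-1} = T_true @ p_k.
T_true = se3(rot_z(0.10), np.array([0.20, 0.05, 0.0]))
curr_pts = transform(np.linalg.inv(T_true), prev_pts)  # the same points seen from frame k

# Step 2): the inertial sensor supplies a rough relative-pose prior for the two moments.
T_prior = se3(rot_z(0.08), np.array([0.17, 0.08, 0.0]))

# The visual sensor registers the two frames, seeded by the prior: pre-align the
# previous frame with the prior, then estimate the residual alignment.
prev_in_curr = transform(np.linalg.inv(T_prior), prev_pts)
T_refine = kabsch(prev_in_curr, curr_pts)

# Step 3): pose change of the visual sensor between the two adjacent frames.
T_visual = T_prior @ np.linalg.inv(T_refine)

# Step 4): correct the inertial estimate in real time by replacing its drift-prone
# increment with the visually estimated pose change.
T_world_prev = np.eye(4)                   # world pose of frame k-1 in this toy example
T_world_imu = T_world_prev @ T_prior       # dead reckoning alone
T_world_corr = T_world_prev @ T_visual     # corrected estimate

# Step 5): register the current frame's points into the 3D map (world frame).
map_pts = transform(T_world_corr, curr_pts)

print("translation error of IMU prior      :", np.round(T_prior[:3, 3] - T_true[:3, 3], 4))
print("translation error after registration:", np.round(T_visual[:3, 3] - T_true[:3, 3], 4))
```

With exact correspondences the recovered T_visual matches the ground-truth relative pose, so the corrected world pose no longer carries the inertial prior's error; in practice the Kabsch stand-in would be replaced by whatever point-cloud registration the visual sensor actually performs.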

[0043] The visual sensor is composed of an RGBD camera;

[0044] In step 2), the working steps of the visual sensor are as follows (an illustrative sketch of how the calibration and the inertial prior feed this step is given after this excerpt):

[0045] 2a) The RGBD camera capt...
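The excerpt is truncated at this point. As one hedged illustration of how the spatial calibration from step 1) is commonly used in step 2), the relative pose reported by the inertial sensor can be mapped into the camera frame through the camera-IMU extrinsic transform before it seeds the registration of two adjacent RGBD frames. The extrinsic and motion values below are invented for the example and are not from the patent.

```python
import numpy as np

def se3(R, t):
    """Pack a rotation matrix and a translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

# Assumed camera-from-IMU extrinsic calibration (the result of step 1); illustrative values.
T_cam_imu = se3(rot_x(np.pi / 2), np.array([0.05, 0.00, 0.02]))

# Relative pose reported by the inertial sensor between two adjacent moments (IMU frame).
T_imu_rel = se3(rot_x(0.02), np.array([0.10, 0.00, 0.00]))

# Express the same motion in the camera frame so it can serve as the initial guess
# for registering the two adjacent RGBD frames.
T_cam_rel = T_cam_imu @ T_imu_rel @ np.linalg.inv(T_cam_imu)

np.set_printoptions(precision=3, suppress=True)
print(T_cam_rel)
```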


Abstract

The invention discloses a scene three-dimensional data registration method. The method comprises the following steps: an inertial sensor and a visual sensor perform spatial calibration and time synchronization; the inertial sensor outputs pose information at two adjacent moments and transmits it to the visual sensor, and the visual sensor performs data registration on two adjacent frames of data according to the pose information given by the inertial sensor; the pose change of the visual sensor is worked out from the visual sensor's data registration; the inertial sensor is corrected in real time using the pose change of the visual sensor; finally, the data is registered into a three-dimensional map after the data registration performed by the visual sensor. The scene three-dimensional data registration method and the navigation system error correction method use the visual sensor to correct the inertial sensor, resolve the ambiguity that arises when motion is estimated by a single visual or inertial sensor, improve moving-object detection performance, reduce the accumulation of inertial sensor error over time, and improve the precision of the inertial sensor.

Description

Technical Field

[0001] The present invention relates to navigation systems, and in particular to the technical field of navigation system error correction. Specifically, it presents a registration method based on three-dimensional scene data, and it also presents a navigation system error correction method based on that three-dimensional data registration method.

Background Technique

[0002] Inertial sensors, as motion sensors with strong autonomy, have been widely used in motion measurement, navigation and positioning. Among them, the inertial sensors used for navigation and positioning can provide all navigation parameters such as position, velocity and attitude.

[0003] In a navigation system with an inertial sensor, the performance of the inertial sensor is determined by sensor noise, bias, scale factor, and system calibration; the initialization estimation accuracy of the inertial sensor is high, when the inertial measurement da...
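As a rough, hedged illustration of the error sources listed above (noise, bias, scale factor), a single-axis gyroscope measurement can be modeled as the true rate distorted by a scale factor and a bias plus white noise; integrating such a measurement shows the heading drift that the visual correction in this invention is meant to remove. All numbers below are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n = 0.01, 6000                           # 100 Hz samples over 60 s
true_rate = np.full(n, np.deg2rad(5.0))      # constant 5 deg/s turn (illustrative)

# Simple single-axis gyro error model: scale factor, bias, and white noise.
scale, bias, noise_std = 1.002, np.deg2rad(0.05), np.deg2rad(0.02)
measured = scale * true_rate + bias + rng.normal(0.0, noise_std, n)

# Integrate both signals to compare the estimated heading with the true heading.
heading_true = np.cumsum(true_rate) * dt
heading_est = np.cumsum(measured) * dt

err_deg = np.degrees(heading_est[-1] - heading_true[-1])
print(f"heading error after 60 s: {err_deg:.2f} deg")   # grows roughly linearly with time
```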


Application Information

IPC(8): G01C21/16, G01C21/00, G01C25/00
CPC: G01C21/005, G01C21/165, G01C25/005
Inventors: 孙波, 曾雅丹, 肖军浩
Owner: 苏州中德睿博智能科技有限公司