
Inertia/visual integrated navigation method adopting iterated extended Kalman filter and neural network

A technology combining an extended Kalman filter and a neural network, applied in the field of integrated navigation in complex environments. It addresses the problems that an inertial navigation system cannot provide long-term high-precision navigation, which causes the accuracy of an inertial/visual integrated navigation system to decline, and achieves the effect of widening the scope of application.

Inactive Publication Date: 2014-08-13
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

[0006] To overcome the shortcoming that the inertial navigation system cannot provide long-term high-precision navigation when the mobile robot moves in a low-light or no-light environment, which severely degrades the accuracy of the inertial/visual integrated navigation system, the present invention proposes an inertial/visual integrated navigation method based on an iterated extended Kalman filter and a neural network.



Embodiment Construction

[0027] The present invention will be further explained below in conjunction with the accompanying drawings.

[0028] An inertial/visual integrated navigation method using an iterated extended Kalman filter and a neural network includes the following steps:

[0029] Step 1: As shown in Figure 1, when the visual signal is good, the vehicle-mounted camera is mounted pointing vertically downward to capture dynamic video of the road surface that the mobile robot passes over. The SURF algorithm is used to extract SURF feature points in two adjacent image frames of the video, the position coordinates of the feature points in the image coordinate system are recorded, and the SURF feature points of the two frames are matched by the nearest-neighbor method to determine the camera's speed in the horizontal plane, V_x and V_y;
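The nearest-neighbor matching in Step 1 can be illustrated with a minimal sketch. This is not the patent's implementation: SURF extraction requires an image-processing library, so descriptors here are plain float lists standing in for SURF vectors, and the Lowe-style ratio test is an assumed (common) way to reject ambiguous matches that the patent text does not specify.

```python
import math

def nearest_neighbor_match(desc_a, desc_b, ratio=0.8):
    """Match descriptors from frame A to frame B by nearest neighbor.

    desc_a, desc_b : lists of equal-length float vectors (stand-ins
                     for SURF descriptors). A match (i, j) is kept only
    if the best distance beats `ratio` times the second-best distance
    (Lowe-style ratio test -- an assumption, not from the patent)."""
    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

    matches = []
    for i, d in enumerate(desc_a):
        # Rank frame-B descriptors by distance to descriptor i of frame A.
        ranked = sorted(range(len(desc_b)), key=lambda j: dist(d, desc_b[j]))
        best, second = ranked[0], ranked[1]
        if dist(d, desc_b[best]) < ratio * dist(d, desc_b[second]):
            matches.append((i, best))
    return matches
```

From the matched pairs, the mean pixel displacement between the two frames gives the image-plane motion used to recover V_x and V_y.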

[0030] Let the vertical distance between the projection center of the vehicle camera and the ground be Z_R, the normalized f...
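The paragraph is truncated, but its setup (camera height Z_R above the ground, focal length) points at the standard pinhole scale-recovery relation for a downward-looking camera. A hedged sketch, assuming the usual form V = Z_R · Δpixel / (f · Δt); the exact expression in the patent may differ.

```python
def ground_velocity(pix_du, pix_dv, Z_R, f_pixels, dt):
    """Pinhole-camera scale recovery for a downward-looking camera.

    pix_du, pix_dv : mean feature displacement between frames (pixels)
    Z_R            : camera height above the ground plane (m)
    f_pixels       : focal length expressed in pixels
    dt             : frame interval (s)

    Uses V = Z_R * pixel_displacement / (f * dt), the standard pinhole
    relation; the patent text is truncated at this point, so this is
    an assumed reconstruction rather than the patent's formula.
    """
    Vx = Z_R * pix_du / (f_pixels * dt)
    Vy = Z_R * pix_dv / (f_pixels * dt)
    return Vx, Vy
```

For example, a 10-pixel shift per 0.04 s frame at 1 m height with a 500-pixel focal length corresponds to 0.5 m/s.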



Abstract

The invention relates to an inertial/visual integrated navigation method adopting an iterated extended Kalman filter and a neural network, belonging to the technical field of integrated navigation in complex environments. The method comprises the steps of: when the visual signal is valid, acquiring dynamic video with a camera carried by the mobile robot and determining the camera's speed by image feature extraction and nearest-neighbor matching; optimally estimating the speed and acceleration of the mobile robot with the iterated extended Kalman filter; establishing a navigation speed error model of the inertial navigation system with the neural network; and, when the visual signal loses lock, compensating the speed error of the navigation system with the previously trained neural-network error model. The method solves the problem that the inertial/visual integrated navigation system cannot provide lasting high-precision navigation when the visual signal loses lock, and can be applied to long-endurance, long-distance, high-accuracy navigation and positioning of mobile robots in complex environments with weak light, no light, and the like.
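The iterated extended Kalman filter named in the abstract relinearizes the measurement function about each new estimate rather than only about the prediction, which improves the update for strongly nonlinear measurements. A minimal scalar sketch, using a hypothetical measurement h(x) = x² rather than the patent's actual inertial/visual state model:

```python
def iekf_update(x_pred, P_pred, z, h, h_jac, R, n_iter=5):
    """Iterated EKF measurement update for a scalar state/measurement.

    x_pred, P_pred : predicted state and variance
    z              : measurement
    h, h_jac       : measurement function and its derivative
    R              : measurement noise variance

    Each iteration relinearizes h() about the latest estimate x,
    which is the defining difference from a plain EKF update."""
    x = x_pred
    for _ in range(n_iter):
        H = h_jac(x)
        S = H * P_pred * H + R          # innovation variance
        K = P_pred * H / S              # Kalman gain at current linearization
        x = x_pred + K * (z - h(x) - H * (x_pred - x))
    H = h_jac(x)
    P = (1.0 - K * H) * P_pred          # posterior variance
    return x, P
```

With x_pred = 1.5, z = 4.0 and small R, the iterates converge close to x = 2 (where h(x) = z), whereas a single EKF step linearized at 1.5 would land noticeably short.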

Description

technical field [0001] The invention relates to an inertial/visual integrated navigation method using an iterated extended Kalman filter and a neural network, and belongs to the technical field of integrated navigation in complex environments. Background technique [0002] In recent years, with the rapid development of computer technology, electronics, communications, advanced control and artificial intelligence, the research and application of mobile robot technology have made great progress. As complex comprehensive systems integrating environmental perception, dynamic decision-making, and real-time behavior control and execution, intelligent mobile robots have been applied ever more widely in military, civilian, scientific research, and industrial settings, performing in place of humans work that must be carried out under harsh or dangerous conditions. Positioning and navigation, as the primary prerequisite for indoor mobile robots to complete tas...


Application Information

IPC(8): G01C21/00, G01C21/16
CPC: G01C21/20, G01C21/165
Inventors: 陈熙源, 高金鹏, 李庆华, 徐元, 方琳, 赖泊能
Owner SOUTHEAST UNIV