
Visual navigation method based on SIFT (scale invariant feature transform) algorithm

A visual navigation technology, applied in the field of visual navigation based on the SIFT algorithm, addressing problems such as computer real-time processing speed that falls short of practical needs, the difficulty of applying high-precision positioning to small and medium-sized robots, and large positioning errors.

Status: Inactive | Publication Date: 2014-03-19
SHANGHAI DIANJI UNIV

AI Technical Summary

Problems solved by technology

[0006] (1) In visual navigation and positioning, image processing is computationally intensive, and the real-time processing speed of the computer cannot meet practical needs.
[0007] (2) High-precision positioning and navigation technology is expensive and difficult to apply to small and medium-sized robots, while low-cost technology has low precision and large errors, which seriously degrade the navigation result.




Embodiment Construction

[0030] To make the content of the present invention clearer and easier to understand, the present invention is described in detail below in conjunction with specific embodiments and the accompanying drawings.

[0031] The present invention overcomes the deficiencies of the above-mentioned prior-art methods. It uses the SIFT algorithm to extract and track features of natural landmarks, determines the position of the robot from the position information of those landmarks in the image sequence, and applies a dynamic extended Kalman filter to the inertial navigation system parameters so that visual navigation information corrects the error of the inertial navigation system and accurate navigation coordinates are obtained. At the same time, the method achieves miniaturization and low cost, improving its practicability for small and medium-sized robots.
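To make the feature-extraction and tracking step concrete, the sketch below detects SIFT keypoints in two consecutive navigation-camera frames and matches them to track natural landmarks. It is a minimal illustration assuming OpenCV is available; the file names and the ratio-test threshold are hypothetical values chosen for the example, not values taken from the patent.

```python
# Minimal sketch of SIFT-based natural-landmark extraction and tracking,
# assuming OpenCV (cv2) is installed; frame names and the 0.75 ratio-test
# threshold are illustrative assumptions.
import cv2

# Load two consecutive frames from the navigation camera (grayscale).
prev_frame = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr_frame = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Detect scale-invariant keypoints and compute 128-D SIFT descriptors.
sift = cv2.SIFT_create()
kp_prev, des_prev = sift.detectAndCompute(prev_frame, None)
kp_curr, des_curr = sift.detectAndCompute(curr_frame, None)

# Match descriptors between frames and keep matches that pass Lowe's ratio
# test, the usual way to reject ambiguous correspondences.
matcher = cv2.BFMatcher(cv2.NORM_L2)
candidates = matcher.knnMatch(des_prev, des_curr, k=2)
good = [m for m, n in candidates if m.distance < 0.75 * n.distance]

# Pixel positions of the tracked landmarks in the current frame; these image
# coordinates are the kind of visual measurement a navigation filter consumes.
landmarks = [kp_curr[m.trainIdx].pt for m in good]
print(f"tracked {len(landmarks)} natural-landmark features")
```

The matched keypoint positions are the visual measurements that the filtering stage described above can then fuse with the inertial navigation data.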

[0032] Specific embodiments of the present in...



Abstract

The invention provides a visual navigation method based on an SIFT (scale invariant feature transform) algorithm. The visual navigation method comprises the steps of: 1, acquiring an image through a navigation camera of a robot; 2, extracting feature points of a natural landmark from the image acquired by the navigation camera by means of the SIFT method; and 3, correcting a measurement error of an inertial navigation system of the robot by using a dynamic extended Kalman filter combined with the visual information from the extracted feature points of the natural landmark, and estimating the running state of the robot and the positions of the visual features to obtain the navigation parameters of the robot.
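To illustrate the correction in step 3, the following sketch performs one predict/correct cycle in which a position fix derived from the visual features corrects a drifting inertial estimate. The 2-D constant-velocity state, the linear measurement model, and all noise values are simplifying assumptions made for this example; this is not the dynamic extended Kalman filter defined in the patent.

```python
# Hedged sketch of a Kalman-style correction of an inertial estimate with a
# visual position measurement. State x = [px, py, vx, vy]; the model and all
# noise levels below are illustrative assumptions.
import numpy as np

dt = 0.1                                  # sample period (s), assumed
F = np.array([[1, 0, dt, 0],              # constant-velocity prediction model
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],               # vision observes position only
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)                      # process noise (inertial drift), assumed
R = 0.05 * np.eye(2)                      # visual measurement noise, assumed

def predict(x, P):
    """Propagate the state with the inertial (dead-reckoning) model."""
    return F @ x, F @ P @ F.T + Q

def correct(x, P, z_vis):
    """Correct the inertial prediction with a visual position fix z_vis."""
    y = z_vis - H @ x                     # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# One predict/correct cycle with made-up numbers.
x, P = np.zeros(4), np.eye(4)
x, P = predict(x, P)
x, P = correct(x, P, z_vis=np.array([0.12, -0.03]))
print("corrected position:", x[:2])
```

The same predict/correct structure carries over to the richer state the method estimates (robot running state plus positions of visual features), with the measurement model linearized around the current estimate, as is standard for an extended Kalman filter.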

Description

Technical field

[0001] The present invention relates to the field of robot visual navigation, and more specifically to a visual navigation method based on the SIFT algorithm.

Background technology

[0002] At present, the positioning sensors commonly used by mobile robots include odometers, laser radar, ultrasonic, infrared and microwave radar sensors, gyroscopes, compasses, speed or acceleration sensors, and tactile or proximity sensors.

[0003] The positioning techniques commonly used by robots fall into two types: absolute positioning and relative positioning. Absolute positioning relies on navigation beacons, active or passive markers, map matching or global positioning (GPS); it offers high positioning accuracy, but its cost is high and it is not well suited to small robots.

[0004] Relative positioning determines the current position of the robot by measuring the distance and direction of the robot relative t...
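As a simple illustration of relative positioning, the sketch below integrates hypothetical odometry increments (distance travelled and heading change) from a known start pose. The numbers are made up; the point is that small per-step errors accumulate over time, which is the drift that the visual correction in this invention is intended to counteract.

```python
# Illustrative dead-reckoning sketch of relative positioning: integrating
# odometry increments from an assumed start pose. All values are made up.
import math

x, y, heading = 0.0, 0.0, 0.0            # start pose (assumed known)
# (distance_m, heading_change_rad) increments reported by wheel odometry.
increments = [(0.5, 0.00), (0.5, 0.05), (0.5, 0.05), (0.5, -0.02)]

for dist, dtheta in increments:
    heading += dtheta                    # update orientation first
    x += dist * math.cos(heading)        # then advance along the new heading
    y += dist * math.sin(heading)

print(f"estimated pose: x={x:.2f} m, y={y:.2f} m, heading={heading:.3f} rad")
```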

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/00
CPC: G01C21/165; G01C21/20; G01C25/005
Inventors: 崔显龙, 朱旭红, 张哲栋, 王海军
Owner: SHANGHAI DIANJI UNIV