
Monocular natural vision landmark assisted mobile robot positioning method

A mobile robot positioning method in the fields of inertial navigation and image processing, addressing problems such as heavy computational load, low positioning accuracy, and error accumulation.

Active Publication Date: 2013-09-11
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

This type of method has the advantage of requiring no prior information about the environment, but it is computationally expensive, making real-time positioning difficult, and its positioning accuracy is low in feature-poor environments.
[0005] Another class of methods fuses vision with an IMU. (1) IMU and SLAM fusion: the IMU provides the positioning prediction, which is then corrected using the relative positions of image feature points and the robot; however, the computation is heavy and real-time performance is poor. (2) Fusion of the IMU with visual odometry (VO): VO is used to correct the accumulated IMU error, but both are local positioning methods, so the cumulative error of VO is also introduced into the system.
[0006] The above non-GPS positioning methods are inherently local, so error still accumulates as the traveled distance increases.
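The error accumulation of purely local positioning can be illustrated with a minimal dead-reckoning simulation. This is an illustrative sketch, not taken from the patent; the function name and noise model are assumptions:

```python
import numpy as np

def dead_reckon_error(n_steps, step=0.1, noise_std=0.01, seed=0):
    """Integrate noisy 1-D odometry increments and return the absolute
    position error at every step; with no absolute reference, the error
    is a random walk whose spread grows with the traveled distance."""
    rng = np.random.default_rng(seed)
    increments = step + rng.normal(0.0, noise_std, n_steps)  # noisy odometry
    estimated = np.cumsum(increments)                        # dead-reckoned track
    true = step * np.arange(1, n_steps + 1)                  # ground truth
    return np.abs(estimated - true)
```

Averaged over runs, the error after n steps scales like noise_std * sqrt(n), which is the cumulative drift that the patent's absolute landmark fixes are meant to bound.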



Examples


Embodiment

[0057] The experiment uses a Pioneer3 robot as the platform for online data collection and algorithm testing. The platform carries a Point Grey Bumblebee stereo camera, of which only one camera is used in the experiment. Two Novatel GPS receivers and an NV-IMU200 IMU are also mounted on the vehicle. The GPS runs at up to 20 Hz, the camera captures up to 10 frames per second, and the IMU runs at 100 Hz. With RTK, the GPS positioning accuracy reaches up to 2 cm. Dual GPS receivers with a 50 cm baseline measure the camera orientation during landmark collection and the initial heading of the vehicle body. The experimental environment is an outdoor grass field, and a Sokkia SRX1 total station system provides accurate vehicle positions as the ground truth. The total station (TS) locates the vehicle body by tracking an omnidirectional prism mounted on it, and the accu...
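With the GPS at 20 Hz, the camera at 10 fps, and the IMU at 100 Hz, the sensor streams must be aligned in time before fusion. The patent text does not spell out the synchronization scheme; a common approach, sketched here with hypothetical timestamp arrays, pairs each camera frame with the nearest IMU sample:

```python
import numpy as np

def nearest_indices(ref_t, query_t):
    """For each query timestamp, return the index of the nearest
    timestamp in the sorted reference array ref_t."""
    idx = np.searchsorted(ref_t, query_t)
    idx = np.clip(idx, 1, len(ref_t) - 1)
    left, right = ref_t[idx - 1], ref_t[idx]
    idx -= (query_t - left) < (right - query_t)  # step back where left is closer
    return idx

imu_t = np.arange(0.0, 1.0, 0.01)   # 100 Hz IMU timestamps (seconds)
cam_t = np.arange(0.0, 1.0, 0.1)    # 10 fps camera timestamps
pairs = nearest_indices(imu_t, cam_t)
# pairs -> every 10th IMU sample: [0, 10, 20, ..., 90]
```

The same routine can associate the 20 Hz GPS fixes with camera frames; only the reference array changes.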



Abstract

The invention discloses a monocular natural vision landmark assisted mobile robot positioning method. The method comprises the following steps: a natural visual landmark feature library is built in advance at multiple positions in the navigation environment; during positioning, the robot, aided by an inertial navigation system, matches the acquired monocular images against the visual landmarks in the library; an online fast image-matching framework is established that combines GIST global features with SURF local features, while the vehicle heading is corrected by a motion estimation algorithm based on monocular vision; finally, Kalman filtering effectively fuses the positioning information obtained from visual landmark matching with that from the inertial navigation system. The method achieves high positioning precision and robustness when the global positioning system (GPS) is unavailable, effectively corrects inertial navigation errors caused by noise, and greatly reduces the computational load by using monocular vision.
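The coarse-to-fine matching idea in the abstract (GIST global features to prune the library, SURF local features to verify the survivors) can be sketched as follows. This is a toy illustration, not the patent's implementation: the global descriptor here is simply a grid of mean intensities standing in for GIST, and `verify` is a pluggable stand-in for SURF keypoint matching:

```python
import numpy as np

def global_descriptor(img, grid=4):
    """Toy global descriptor: mean intensity over a grid x grid block
    layout (a crude stand-in for the patent's GIST descriptor)."""
    h, w = img.shape
    bh, bw = h // grid, w // grid
    img = img[:bh * grid, :bw * grid]
    return img.reshape(grid, bh, grid, bw).mean(axis=(1, 3)).ravel()

def match_landmark(query, library, top_k=3, verify=None):
    """Coarse-to-fine matching: rank the landmark library by cheap
    global-descriptor distance, then run the costlier local verification
    only on the top_k candidates (the role SURF matching plays here)."""
    q = global_descriptor(query)
    dists = [np.linalg.norm(q - global_descriptor(img)) for img in library]
    candidates = np.argsort(dists)[:top_k]
    if verify is None:                       # global ranking only
        return int(candidates[0])
    scores = [verify(query, library[i]) for i in candidates]
    return int(candidates[int(np.argmax(scores))])

# Synthetic demo: a noisy view of landmark 2 is matched back to it.
rng = np.random.default_rng(1)
library = [rng.random((32, 32)) for _ in range(10)]
query = library[2] + rng.normal(0.0, 0.01, (32, 32))
best = match_landmark(query, library,
                      verify=lambda a, b: -np.mean((a - b) ** 2))
# best == 2
```

The design point is that the expensive local-feature verification runs only on a handful of candidates, which is what makes online matching against a full landmark library feasible.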

Description

technical field [0001] The invention relates to inertial navigation and image processing methods, and in particular to a mobile robot positioning method assisted by monocular natural visual landmarks. Background technique [0002] Traditional robot positioning includes GPS, inertial navigation systems, etc. GPS is widely applicable and highly accurate but strongly affected by the environment; an inertial navigation system is fully autonomous and high-frequency but heavily affected by noise. GPS/inertial fusion is therefore one of the most commonly used integrated navigation and positioning methods in robot navigation today, as it effectively exploits the complementary advantages of the two. However, in practical applications the GPS signal is often blocked and unusable, causing the overall positioning accuracy of the system to decline rapidly. [0003] In order to solve the above problem...
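The fusion pattern described here (a high-rate dead-reckoning sensor corrected by sparse absolute fixes, whether GPS or vision landmarks) reduces to a Kalman predict/update loop. A minimal 1-D sketch, with assumed noise parameters `q` and `r` chosen only for illustration:

```python
import numpy as np

def kalman_fuse(increments, fixes, q=0.01, r=0.25):
    """Minimal 1-D Kalman filter: dead-reckon with INS increments
    (predict) and correct with sparse absolute position fixes such as
    GPS or vision-landmark matches (update).
    `fixes` maps step index -> absolute position."""
    x, p = 0.0, 1.0                      # state estimate and its variance
    track = []
    for k, dx in enumerate(increments):
        x += dx                          # predict: integrate INS increment
        p += q                           # process noise inflates uncertainty
        if k in fixes:                   # absolute fix available this step
            gain = p / (p + r)           # Kalman gain
            x += gain * (fixes[k] - x)   # pull estimate toward the fix
            p *= 1.0 - gain              # fix shrinks the uncertainty
        track.append(x)
    return np.array(track)

# Biased INS (reads 1.1 m per true 1.0 m step); one absolute fix at step 9.
track = kalman_fuse([1.1] * 10, {9: 10.0})
```

Before the fix the estimate drifts (about 0.9 m of error after nine steps); the single absolute fix pulls the estimate most of the way back, which is exactly the correction role the landmark matches play when GPS is unavailable.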


Application Information

IPC(8): G01C21/00
Inventors: Xiang Zhiyu, Lu Wei, Chen Mingya
Owner ZHEJIANG UNIV