Binocular full-view visual robot self-positioning method based on SURF algorithm

A technology combining autonomous positioning and panoramic vision, applied in instruments, photo interpretation, photogrammetry/videogrammetry, etc.; it addresses problems such as unstable edge feature points and the need to accelerate feature point extraction and matching.

Active Publication Date: 2017-06-30
HARBIN ENG UNIV

AI Technical Summary

Problems solved by technology

Aiming at the large number of visual image feature points that must be extracted and matched, and the instability of edge feature points, during mobile robot positioning, Bay proposed the accelerated and feature-robust SURF algorithm, building on the SIFT feature point extraction process.



Examples


Embodiment

[0067] 1. Set artificial landmarks as prior location information

[0068] Since mobile robots often work in unknown, unstructured environments while executing tasks, road signs (artificial landmarks) are used as prior position information to turn an unfamiliar environment into a familiar one. Based on the SURF feature point extraction algorithm, an environment image library is collected for testing, and landmarks with weak-interference matching are designed. Two or more artificial landmarks with distinct characteristics are set up at equal intervals and equal heights on the side of the robot's working environment with the least occlusion.
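As an illustration of how candidate landmark designs might be screened for distinctive, low-interference matching, the following sketch uses OpenCV's SURF implementation; the function names, thresholds, and the ratio-test screening are illustrative assumptions and are not specified in the patent.

```python
# Hypothetical sketch: screening candidate artificial landmarks for distinctive,
# low-interference SURF matching. Requires opencv-contrib-python built with the
# non-free SURF module; thresholds and the ratio test are illustrative choices.
import cv2

def surf_features(image_path, hessian_threshold=400):
    """Extract SURF keypoints and descriptors from a grayscale landmark image."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian_threshold)
    return surf.detectAndCompute(img, None)

def cross_match_ratio(desc_a, desc_b, ratio=0.7):
    """Fraction of descriptors in A with an unambiguous match in B (Lowe ratio test)."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(desc_a, desc_b, k=2)
    good = [p for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return len(good) / max(len(pairs), 1)

# Two landmark designs interfere weakly with each other when their cross-match
# ratio is low, while each still matches its own appearance in the environment
# image library strongly.
```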

[0069] 2. Improved MDGHM-SURF feature matching algorithm

[0070] The improved MDGHM-SURF algorithm is used for landmark feature matching and positioning. The panoramic image mask is sampled at equal intervals and a pixel coordinate transformation is performed; the (p, q)-order modified discrete Gaussian-Hermite moment of the panoramic image I(i, j) is de...
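A minimal sketch of the (p, q)-order discrete Gaussian-Hermite moment that MDGHM-SURF builds on is given below; the orthonormal basis, the value of sigma, and the [-1, 1] coordinate normalisation are assumptions, and the patent's "modified" mask-sampling step is not reproduced.

```python
# Minimal sketch of a (p, q)-order discrete Gaussian-Hermite moment of an image
# patch. Basis normalisation, sigma, and coordinate scaling are assumptions.
import numpy as np
from scipy.special import eval_hermite, factorial

def gh_basis(n, x, sigma):
    """Orthonormal Gaussian-Hermite function of order n evaluated at points x."""
    norm = 1.0 / np.sqrt(2.0**n * factorial(n) * sigma * np.sqrt(np.pi))
    return norm * eval_hermite(n, x / sigma) * np.exp(-x**2 / (2.0 * sigma**2))

def gh_moment(patch, p, q, sigma=0.5):
    """(p, q)-order Gaussian-Hermite moment of a 2-D image patch I(i, j)."""
    rows, cols = patch.shape
    ys = np.linspace(-1.0, 1.0, rows)   # normalised vertical coordinates (index i)
    xs = np.linspace(-1.0, 1.0, cols)   # normalised horizontal coordinates (index j)
    # sum_i sum_j I(i, j) * H_q(y_i) * H_p(x_j)
    return float(gh_basis(q, ys, sigma) @ patch @ gh_basis(p, xs, sigma))
```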

Specific embodiment

[0091] Taking landmark A as the research object, let the coordinates of the panorama center be G(x0, y0), and assume the pixel position of landmark A in the panoramic image at time t is A(ρ, θ, t). With A as the center, a circular detection window of radius r is set to perform SURF feature detection and landmark matching and positioning. Assuming the robot rotates in place with angular velocity ω(t), the distance between the robot and the landmark remains unchanged, so the trajectory of the landmark in the panoramic image is an arc centered on the panorama center G with radius ρ. Therefore, the detection area at time t′ rotates to a circular region centered on A′(ρ, θ + ω(t)Δt, t′). Considering the robot's translation speed, the direction of maximum gradient of the landmark's pixel motion in the image is the velocity direction, and when the robot runs at maximum speed, the fastest pix...
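A hedged sketch of this detection-window prediction for the pure-rotation case follows: the landmark stays on a circle of radius ρ about the panorama center G, so the next window center is the current one rotated by ω·Δt. The polar convention, variable names, and example numbers are assumptions.

```python
# Sketch of detection-window prediction under pure rotation: the next window
# center is A'(rho, theta + omega*dt) about the panorama center G(x0, y0).
import math

def predict_window_center(rho, theta, omega, dt, x0, y0):
    """Predict the pixel center of the next circular SURF detection window."""
    theta_next = theta + omega * dt
    x = x0 + rho * math.cos(theta_next)   # back to Cartesian pixel coordinates
    y = y0 + rho * math.sin(theta_next)
    return x, y, theta_next

# Example: landmark at rho = 220 px, theta = 1.0 rad, robot turning at 0.5 rad/s,
# frame interval 0.1 s, panorama center at (512, 512).
print(predict_window_center(220.0, 1.0, 0.5, 0.1, 512.0, 512.0))
```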



Abstract

The invention belongs to the technical field of mobile robot visual positioning, and particularly relates to a binocular full-view visual robot self-positioning method based on the SURF algorithm. The method comprises the following steps: (1) setting artificial road signs as prior position information; (2) matching features with the improved MDGHM-SURF algorithm; (3) eliminating mismatches with a matching-point gravity-center iterative algorithm; (4) ranging by the three-dimensional backlight-path imaging of a vertical-baseline binocular full-view system; (5) positioning by full-view triangulation; and (6) predicting the detection window to accelerate positioning. For the indoor self-positioning problem of a mobile robot, the method sets artificial road signs as prior position information, performs rapid feature point detection with the improved MDGHM-SURF algorithm, identifies and positions the road signs through feature matching, reduces matching errors with the matching-point gravity-center iterative algorithm, improves the positioning accuracy of the road sign center, predicts the road sign detection area from the robot's motion state, and improves positioning rapidity during movement.
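Purely as an outline of how the six listed steps might be chained, the sketch below uses placeholder stubs with made-up return values (none of the names or numbers come from the patent) to show the data flow; it is not an implementation of the patented algorithms.

```python
# Outline of the six-step pipeline in the abstract. Every function below is a
# placeholder stub with hypothetical return values; only the data flow is meaningful.

def set_landmarks():                                  # (1) artificial road signs as prior position info
    return {"A": (1.0, 2.0), "B": (4.0, 2.0)}         # hypothetical world coordinates (m)

def predict_window(robot_state):                      # (6) motion-based detection window prediction
    return (512, 512, 80)                             # hypothetical window center and radius (px)

def mdghm_surf_match(frame, window, landmark_db):     # (2) improved MDGHM-SURF feature matching
    return [((310, 240), "A"), ((700, 255), "B")]     # hypothetical pixel / landmark pairs

def centroid_iteration_filter(matches):               # (3) gravity-center iteration rejects mismatches
    return matches

def vertical_baseline_range(top, bottom, matches):    # (4) vertical-baseline binocular panoramic ranging
    return {"A": 3.2, "B": 4.1}                        # hypothetical distances (m)

def triangulate(distances, landmark_db):              # (5) panoramic triangulation -> robot position
    return (2.5, 0.8)

def self_localise(top_frame, bottom_frame, robot_state):
    landmark_db = set_landmarks()
    window = predict_window(robot_state)
    matches = centroid_iteration_filter(mdghm_surf_match(top_frame, window, landmark_db))
    return triangulate(vertical_baseline_range(top_frame, bottom_frame, matches), landmark_db)
```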

Description

Technical field [0001] The invention belongs to the technical field of visual positioning of mobile robots, and in particular relates to a method for autonomous positioning of a binocular panoramic vision robot based on the SURF algorithm. Background technique [0002] The autonomous positioning and navigation ability of a mobile robot is a key measure of its intelligence level, and environmental perception is the foundation of that intelligence; it has long been a research hotspot in robotics. The most mature indoor positioning method is the magnetic track, but its scope of application is limited; RFID, Bluetooth, WLAN and similar signals attenuate easily and their accuracy is poor; ultrasonic waves are easily disturbed by obstacles, and the cost of laser radar is too high; motor encoder pulse kinematic positioning accumulates errors, while the panoramic vision sensor has the advantages of a wide detection range, large a...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C11/00, G01C11/04
CPC: G01C11/00, G01C11/04
Inventor: 朱齐丹, 谢洪乐, 夏桂华, 蔡成涛, 张智, 吕晓龙
Owner: HARBIN ENG UNIV