
Visual odometer construction method in robot visual navigation

A visual odometry and robot vision technology, applied in the field of visual odometer construction in robot visual navigation, which can solve the problems of feature point mismatching and of poor robustness and accuracy that degrade positioning and mapping precision.

Pending Publication Date: 2021-01-05
南京师范大学镇江创新发展研究院 +1

AI Technical Summary

Problems solved by technology

Traditional visual odometry has poor robustness and accuracy in dynamic environments. Moving objects in the scene cause mismatching of feature points during pose estimation, which in turn degrades positioning and mapping accuracy, because the extracted feature points are strongly interfered with by moving objects in the external environment.

Method used

Feature points are extracted from the original image and YOLOv3 object detection provides the object bounding boxes; the feature points inside the bounding boxes are tracked by optical flow and checked against the epipolar constraint, feature points belonging to moving objects are discarded, and the camera pose is estimated from the remaining effective feature points.

Examples


Embodiment Construction

[0068] As shown in Figure 1, the present invention discloses a method for constructing a visual odometer in robot visual navigation, comprising the following steps:

[0069] Step 1: read the original image, extract feature points, and obtain the positions of the feature points through the double-threshold method;
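
The patent does not spell out the double-threshold extraction, so the following Python sketch is only one plausible reading: FAST corners are first detected with a high threshold, and the detector falls back to a lower threshold when too few strong corners are found. The detector choice, threshold values, and fallback rule are assumptions, not details taken from the embodiment.

```python
import cv2

def extract_features(gray, high_thresh=40, low_thresh=20, min_points=150):
    """Extract FAST corners with a two-level (double) threshold strategy.

    The high threshold keeps only strong corners; if that yields too few
    points, the detector is re-run with the lower threshold. Thresholds
    and the fallback rule are assumptions, not values from the patent.
    """
    fast = cv2.FastFeatureDetector_create(threshold=high_thresh)
    keypoints = fast.detect(gray, None)
    if len(keypoints) < min_points:
        fast.setThreshold(low_thresh)
        keypoints = fast.detect(gray, None)
    # Return the pixel positions of the detected feature points.
    return [kp.pt for kp in keypoints]
```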

[0070] Step 2: perform object detection on the original image with the YOLOv3 algorithm and obtain the positions of the object bounding boxes. Figure 3 shows the object detection result, in which the bounding boxes of tricycles, pedestrians, and cars are framed.
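
The embodiment only requires that YOLOv3 supplies object bounding boxes. A minimal sketch using OpenCV's DNN module with standard Darknet configuration and weight files might look as follows; the file names, input size, and confidence/NMS thresholds are placeholders rather than values from the patent.

```python
import cv2
import numpy as np

def detect_objects(frame, cfg="yolov3.cfg", weights="yolov3.weights",
                   conf_thresh=0.5, nms_thresh=0.4):
    """Run YOLOv3 on a BGR frame and return bounding boxes as (x, y, w, h)."""
    net = cv2.dnn.readNetFromDarknet(cfg, weights)
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())

    h, w = frame.shape[:2]
    boxes, scores = [], []
    for output in outputs:
        for det in output:
            # det = [cx, cy, bw, bh, objectness, class scores...]
            confidence = float(det[4] * det[5:].max())
            if confidence > conf_thresh:
                cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
                scores.append(confidence)
    # Non-maximum suppression to drop overlapping detections.
    keep = cv2.dnn.NMSBoxes(boxes, scores, conf_thresh, nms_thresh)
    return [boxes[i] for i in np.array(keep).flatten()]
```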

[0071] Step 3: according to the positions of the feature points and the positions of the object bounding boxes, judge whether each feature point lies inside an object bounding box, remove the feature points that are not inside an object bounding box, and obtain the feature points inside the object bounding boxes;
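
The in-box test of Step 3 is a plain geometric check. The sketch below groups the extracted feature points by the detected bounding box that contains them; the per-box grouping is an assumed convenience so that dynamic points can later be counted box by box.

```python
def point_in_box(pt, box):
    """Return True if feature point pt = (x, y) lies inside box = (x, y, w, h)."""
    x, y = pt
    bx, by, bw, bh = box
    return bx <= x <= bx + bw and by <= y <= by + bh

def split_points_by_boxes(points, boxes):
    """Group feature points by the bounding box that contains them.

    Points outside every box are returned separately; per Step 3 as written,
    only the in-box groups are carried forward to the optical-flow check.
    """
    in_box = {i: [] for i in range(len(boxes))}
    outside = []
    for pt in points:
        hit = next((i for i, box in enumerate(boxes) if point_in_box(pt, box)), None)
        if hit is None:
            outside.append(pt)
        else:
            in_box[hit].append(pt)
    return in_box, outside
```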

[0072] Step 4: perform optical flow tracking on the feature points in the bounding box of...
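
Step 4 is cut off in the source, but optical flow tracking of the in-box feature points between consecutive frames can be sketched with pyramidal Lucas-Kanade flow as below; the window size and pyramid depth are typical defaults, not values stated in the patent.

```python
import cv2
import numpy as np

def track_points(prev_gray, curr_gray, points):
    """Track feature points between consecutive frames with pyramidal LK optical flow."""
    if not points:
        return np.empty((0, 2)), np.empty((0, 2))
    prev_pts = np.float32(points).reshape(-1, 1, 2)
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, prev_pts, None,
        winSize=(21, 21), maxLevel=3)
    ok = status.reshape(-1) == 1          # keep only successfully tracked points
    return prev_pts.reshape(-1, 2)[ok], curr_pts.reshape(-1, 2)[ok]
```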



Abstract

The invention provides a visual odometer construction method in robot visual navigation. The method comprises the following steps: extracting image feature points; obtaining the positions of object bounding boxes; judging whether each extracted feature point is inside a detected object bounding box and, if not, removing it; tracking the feature points inside an object bounding box by optical flow and judging dynamic points by an optical-flow epipolar method; if the number of dynamic points in an object bounding box is greater than a threshold value, considering the object in the bounding box to be moving and removing all feature points in that bounding box; if the threshold value is not exceeded, considering the object in the bounding box to be static, in which case the feature points in the bounding box are effective feature points; and estimating the camera motion trajectory from the effective feature points, recovering the rotation and translation of the camera, estimating the camera pose, and completing the construction of the visual odometer. The invention solves the problem of inaccurate pose estimation in a dynamic environment and improves positioning precision.
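
Putting the remaining steps of the abstract together, a hedged sketch of the optical-flow epipolar check and the final pose recovery could look as follows: a fundamental matrix is fitted to the tracked correspondences, a point whose distance to its epipolar line exceeds a pixel threshold is flagged as dynamic, and the effective (static) points are then used to recover the camera rotation and translation. The 1-pixel distance threshold, the RANSAC parameters, and the use of the essential matrix for pose recovery are assumptions consistent with, but not dictated by, the abstract.

```python
import cv2
import numpy as np

def dynamic_point_mask(prev_pts, curr_pts, dist_thresh=1.0):
    """Flag tracked points that violate the epipolar constraint.

    prev_pts / curr_pts are Nx2 float32 arrays of matched points in two
    consecutive frames. A fundamental matrix is fitted with RANSAC, and a
    point is treated as dynamic when its distance to the corresponding
    epipolar line exceeds dist_thresh pixels (the value is an assumption).
    """
    F, _ = cv2.findFundamentalMat(prev_pts, curr_pts, cv2.FM_RANSAC, 1.0, 0.99)
    ones = np.ones((len(prev_pts), 1))
    p1 = np.hstack([prev_pts, ones])      # homogeneous coordinates, frame k-1
    p2 = np.hstack([curr_pts, ones])      # homogeneous coordinates, frame k
    lines = (F @ p1.T).T                  # epipolar lines l = F * p1
    num = np.abs(np.sum(lines * p2, axis=1))
    den = np.sqrt(lines[:, 0] ** 2 + lines[:, 1] ** 2)
    return (num / den) > dist_thresh      # True -> dynamic point

def estimate_pose(static_prev, static_curr, K):
    """Recover camera rotation R and translation t from the effective (static) points."""
    E, _ = cv2.findEssentialMat(static_prev, static_curr, K,
                                method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, static_prev, static_curr, K)
    return R, t
```

A per-box decision then simply counts the dynamic flags of the points inside each bounding box and discards all of that box's points when the count exceeds the threshold mentioned in the abstract.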

Description

technical field
[0001] The invention relates to the technical field of visual navigation and image processing, in particular to a method for constructing a visual odometer in robot visual navigation.
Background technique
[0002] After a robot enters an unknown environment, its positioning and navigation functions are inseparable from the visual odometer. A visual odometer processes and analyzes continuous video frame images through machine vision technology to estimate the position and attitude of the moving camera and realize navigation and positioning. Traditional visual odometry has poor robustness and accuracy in dynamic environments: moving objects in the scene cause mismatching of feature points during pose estimation, which in turn degrades positioning and mapping accuracy, because the extracted feature points are strongly interfered with by moving objects in the external environment. Conten...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/246; G06T7/73; G06T7/80; G01C22/00
CPC: G06T7/246; G06T7/73; G06T7/80; G01C22/00
Inventor: 谢非, 郭钊利, 刘益剑, 陆飞, 梅一剑, 何逸, 卢毅, 吴俊, 章悦, 汪璠
Owner: 南京师范大学镇江创新发展研究院