
Robot positioning precision evaluating method based on monocular vision

A robot positioning and accuracy-evaluation technology, applied in the field of robot vision. It solves the problems of high application cost, substantial modification of the environment, and increased complexity of use, and achieves the effects of a simple evaluation method, reduced cost of use, and improved work efficiency.

Active Publication Date: 2019-05-17
九天创新(广东)智能科技有限公司


Problems solved by technology

[0003] However, arranging motion-capture equipment and calibration-board groups has a high application cost, requires substantial modification of the environment, and increases complexity of use. Such an evaluation method is difficult to apply widely in practice.




Embodiment Construction

[0039] The present invention will be further described below in conjunction with specific embodiments:

[0040] As shown in Figures 1 and 2, the method for evaluating robot positioning accuracy based on monocular vision in this embodiment includes the following steps:

[0041] S1: Arrange a calibration board within the camera's visual range along the robot's motion path;

[0042] S2: The robot moves to a calibration board in the motion space;

[0043] S3: When the robot approaches the calibration board, record the real-time pose estimate produced by the robot's current positioning algorithm and the relative pose between the robot and the calibration board;

[0044] In step S3, the robot's real-time pose is obtained by the positioning algorithm under evaluation.

[0045] The relative pose between the robot and the calibration board is obtained by a camera calibration method. The specific calculation steps are as follows:

[0046] The image-plane coordinate system is converted to the image pixel ...
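The image-plane-to-pixel conversion that paragraph [0046] begins to describe is the standard pinhole-camera step; the patent text is truncated here, so the following is only a generic sketch. The parameter names (`dx`, `dy`, `u0`, `v0`) and the sample numbers are illustrative assumptions, not values from the patent:

```python
def image_plane_to_pixel(x, y, dx, dy, u0, v0):
    """Convert image-plane coordinates (in metres) to pixel coordinates.

    dx, dy : physical size of one pixel along the u and v axes (assumed)
    u0, v0 : pixel coordinates of the principal point (assumed)
    """
    u = x / dx + u0
    v = y / dy + v0
    return u, v

# Illustrative numbers only (4 µm square pixels, principal point at image centre):
u, v = image_plane_to_pixel(x=0.0012, y=-0.0008, dx=4e-6, dy=4e-6, u0=640, v0=360)
```

In a full calibration pipeline this conversion, together with the camera's extrinsic parameters, yields the board's pose relative to the camera (and hence the robot).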



Abstract

The invention discloses a robot positioning accuracy evaluation method based on monocular vision. A calibration plate is arranged at an arbitrary position in the robot's motion space. During motion, the robot observes the same calibration plate many times from different positions in its working environment, collecting multiple samples through this observation scheme. From all the observations gathered in a single motion run, and from their statistical characteristics, the positioning accuracy of different positioning algorithms in the same working environment can be compared quantitatively. The method requires no additional precision instruments to obtain the robot's true position and orientation, and no large number of repeated experimental tests, so it improves work efficiency while saving cost. Because the calibration plate's pose in the environment need not be known, the evaluation method is simpler and more convenient, and the robot's motion environment and motion space are not restricted. Finally, the accuracy of a robot positioning algorithm is described by an expression positively correlated with the positioning error, allowing positioning accuracy to be compared across algorithms.
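The statistical idea in the abstract can be sketched as follows: each observation combines the algorithm's robot pose estimate with the camera-measured board pose in the robot frame to reconstruct the board's position in the world frame. Since the board is static, the scatter of these reconstructions is positively correlated with the algorithm's positioning error. This is a minimal 2D sketch under assumed (x, y, heading) poses, not the patent's exact formulation:

```python
import numpy as np

def board_world_positions(robot_poses, board_in_robot):
    """Compose each estimated robot pose (x, y, theta) with the
    camera-measured board position (bx, by) in the robot frame to
    get the board's position in the world frame."""
    out = []
    for (x, y, th), (bx, by) in zip(robot_poses, board_in_robot):
        c, s = np.cos(th), np.sin(th)
        out.append((x + c * bx - s * by, y + s * bx + c * by))
    return np.array(out)

def precision_score(robot_poses, board_in_robot):
    """Scatter of the reconstructed board position across observations;
    a score positively correlated with the positioning error (lower is better)."""
    pts = board_world_positions(robot_poses, board_in_robot)
    return float(np.std(pts, axis=0).sum())
```

A perfectly consistent algorithm reconstructs the same board position from every viewpoint and scores near zero; a noisier algorithm scores higher, which allows the quantitative comparison between algorithms the abstract describes. The choice of summed standard deviation as the score is an assumption for illustration only.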

Description

Technical field

[0001] The invention relates to the technical field of robot vision, and in particular to a method for evaluating robot positioning accuracy based on monocular vision.

Background technique

[0002] Visual positioning of mobile robots is widely used in all aspects of mobile-robot visual navigation, and positioning accuracy directly affects a mobile robot's navigation capability. Evaluating the positioning accuracy of mobile robots is therefore very important. Current common evaluation methods mostly use motion-capture equipment and add position-calibrated board groups to the motion environment for comparative analysis.

[0003] However, arranging motion-capture equipment and calibration-board groups has a high application cost, requires substantial modification of the environment, and increases complexity of use. Such evaluation methods are difficult to apply widely in practice. The present invention ...


Application Information

IPC(8): B25J19/00
Inventors: 张宏, 朱蕾, 陈炜楠, 何力, 管贻生
Owner: 九天创新(广东)智能科技有限公司