
Method for estimating relative posture of robot

A relative pose estimation technology for robots, applied in the field of robot vision; it addresses the problems of poor aesthetics and large occupied area of existing marker arrangements, and achieves the effects of low cost requirements and simple implementation.

Status: Inactive. Publication Date: 2017-06-06
Applicant: SHENYANG SIASUN ROBOT & AUTOMATION (and one other)

AI Technical Summary

Problems solved by technology

Custom markers are relatively rare in daily life and are less aesthetically pleasing in indoor and other scenes.
In addition to the aesthetic problem, the use of multiple QR codes also occupies a large area.



Examples


Embodiment 1

[0020] S11: Set up the two-dimensional code marker.

[0021] S12: Perform adaptive binarization processing on the collected scene image containing the two-dimensional code marker, locate the designated positions of all detection patterns in the two-dimensional code and the centers of the correction patterns 3, and establish, according to their positional relationships, the homography matrix between the two-dimensional code plane and the camera plane.
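
Not part of the original filing: a minimal Python/OpenCV sketch of what a step like S12 could look like, assuming the key-point correspondences (designated positions of the detection patterns and centers of the correction patterns, with their known coordinates on the marker) have already been extracted; all function and variable names are illustrative, not taken from the patent.

    import cv2
    import numpy as np

    def binarize_and_estimate_homography(gray_image, marker_points, image_points):
        # Adaptive (local) binarization compensates for uneven scene illumination.
        binary = cv2.adaptiveThreshold(
            gray_image, 255,
            cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
            cv2.THRESH_BINARY, blockSize=31, C=5)

        # marker_points: key-point coordinates on the two-dimensional code plane
        # image_points:  the corresponding pixel positions located in the binarized image
        H, _mask = cv2.findHomography(
            np.asarray(marker_points, dtype=np.float32),
            np.asarray(image_points, dtype=np.float32),
            method=cv2.RANSAC)
        return binary, H

At least four non-collinear point correspondences are needed for the homography to be well defined; the three detection patterns of a QR code plus one or more correction patterns satisfy this.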

[0022] S13: Solve the external parameter matrix according to the calibrated internal parameter matrix of the camera, and transform the external parameter matrix to obtain the relative pose relationship between the two-dimensional code plane and the camera plane.
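
Not part of the original filing: a hedged sketch of the classic planar decomposition that a step like S13 describes, recovering the external parameter matrix [R | t] from the calibrated intrinsic matrix K and the plane-to-image homography H; the exact formulation used in the patent may differ.

    import numpy as np

    def pose_from_homography(K, H):
        # For a planar target (Z = 0): inv(K) @ H = lambda * [r1 r2 t]
        A = np.linalg.inv(K) @ H
        lam = 1.0 / np.linalg.norm(A[:, 0])
        if lam * A[2, 2] < 0:          # enforce positive depth: marker in front of the camera
            lam = -lam
        r1 = lam * A[:, 0]
        r2 = lam * A[:, 1]
        t = lam * A[:, 2]
        r3 = np.cross(r1, r2)          # third rotation column from orthogonality
        R = np.column_stack((r1, r2, r3))
        # Re-orthonormalize the rotation by projecting onto SO(3) with an SVD.
        U, _, Vt = np.linalg.svd(R)
        R = U @ Vt
        return R, t                    # pose of the marker plane relative to the camera

Inverting [R | t] (i.e. R.T and -R.T @ t) expresses the camera pose in the marker frame, which is the kind of transformation the step above refers to.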

Embodiment 2

[0024] The relative pose estimation method of the robot is described in detail below with reference to figure 2 of the specification. A method for estimating the relative pose of a robot comprises the following steps:

[0025] S21: Set up the two-dimensional code marker.

[0026] The size of the two-dimensional code marker should preferably be between 5 and 10 cm, and the number of correction patterns 3 should be no less than one. In the scene, the plane of the QR code should be kept as perpendicular to the horizontal plane as possible, with one side of the QR code parallel to the horizontal plane. In addition, the height of the center point of the QR code marker should match the center of the robot's monocular camera as closely as possible, with a height difference of no more than 5 cm.

[0027] S22: Use a near-infrared camera with an infrared fill light to collect an image containing the two-dimensional code marker.

[0028] When the robot moves c...


Abstract

The invention relates to the technical field of robot vision, and in particular to a method for estimating the relative pose of a robot. The method comprises the steps of: setting a two-dimensional code marker of a specified type in a scene; carrying out adaptive binarization processing of a scene image which is collected by a monocular near-infrared camera and contains the two-dimensional code marker; locating the designated positions of all detection patterns and the centers of all correction patterns in the two-dimensional code marker as key points, and building a homography matrix according to the positions and relationships of the key points; solving an external parameter matrix according to the calibrated internal parameter matrix of the camera; and finally transforming the external parameter matrix to obtain the relative pose relationship between the two-dimensional code plane and the camera plane.
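
Not part of the original filing: for readers who want to prototype the overall pipeline summarized in the abstract, the sketch below approximates it with stock OpenCV building blocks (cv2.QRCodeDetector for locating the marker corners and cv2.solvePnP for the extrinsics) rather than the patent's own detection-pattern/correction-pattern key points; the marker size, intrinsic matrix, distortion model, and corner ordering are all assumptions.

    import cv2
    import numpy as np

    MARKER_SIZE_M = 0.08                       # assumed 8 cm marker (the patent suggests 5-10 cm)
    K = np.array([[600.0, 0.0, 320.0],
                  [0.0, 600.0, 240.0],
                  [0.0,   0.0,   1.0]])        # assumed calibrated intrinsic matrix
    DIST = np.zeros(5)                         # assumed negligible lens distortion

    def relative_pose(gray_image):
        found, corners = cv2.QRCodeDetector().detect(gray_image)
        if not found:
            return None
        # Marker corners in the marker's own plane (Z = 0); assumed order:
        # top-left, top-right, bottom-right, bottom-left.
        s = MARKER_SIZE_M / 2.0
        obj = np.array([[-s, -s, 0], [ s, -s, 0],
                        [ s,  s, 0], [-s,  s, 0]], dtype=np.float32)
        ok, rvec, tvec = cv2.solvePnP(
            obj, corners.reshape(4, 1, 2).astype(np.float32), K, DIST)
        if not ok:
            return None
        R, _ = cv2.Rodrigues(rvec)
        return R, tvec                         # rotation and translation of the marker w.r.t. the camera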

Description

technical field

[0001] The invention relates to the technical field of robot vision, in particular to a method for estimating the relative pose of a robot.

Background technique

[0002] With the development of mobile robot technology, mobile robots have gained the ability to perform autonomous positioning without knowing the starting point or the map. Research on autonomous mobile robots capable of SLAM (Simultaneous Localization and Mapping) is also a major trend in technological development. At present, a popular and widely used SLAM approach applies the redundant geometric-entity position information provided by monocular vision, fuses it at the observation level with the position information detected by lidar, and uses an extended Kalman filter integration mechanism to achieve accurate positioning of the mobile robot. However, for different purposes, it is sometimes necessary to obtain higher-precision pose information. For example, to realize automat...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/207; G06T7/70
CPC: G06T7/20; G06T2207/10048
Inventors: 王宏玉, 徐方, 曲道奎, 宋健, 李邦宇, 刘晓帆
Owner: SHENYANG SIASUN ROBOT & AUTOMATION