
Robot space positioning and grabbing control method based on template matching

A spatial positioning and template matching technology, applied in the field of visual positioning, which solves the problem that precise positioning cannot be achieved when the material has a height deviation, and achieves the effect of flexible grasping.

Pending Publication Date: 2021-08-10
HANGZHOU ZHONGWEI PHOTOELECTRICITY

AI Technical Summary

Problems solved by technology

It solves the problem that existing technology cannot achieve accurate positioning when the material has a height deviation or is inclined.


Examples


Embodiment 1

[0071] As shown in Figure 1, the template-matching-based robot spatial positioning and grasping control method of the present invention includes the processes of calibration, template creation, relative pose calculation, pose measurement, and positioning and grasping. Figure 2 is a flow chart of the algorithm for estimating the pose of the special marker in the camera coordinate system in the method of the present invention.

[0072] The robot spatial positioning and grabbing control method based on template matching in the present invention specifically includes the following steps:

[0073] S01 Calibration: collect multiple calibration board images and the corresponding robot end pose information, calibrate the visual positioning control system, and obtain the camera intrinsic parameters and the hand-eye relationship matrix.

[0074] The specific steps in step S01 are:

[0075] S01-1 Install the monocular CCD camera at the end of t...
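
Step S01 amounts to a standard monocular camera calibration followed by a hand-eye calibration. The following is a minimal sketch of that step; it assumes an OpenCV implementation, an eye-in-hand (camera on the robot end) setup and a chessboard calibration board, none of which are specified in the patent text shown here, and all names and parameter values are illustrative.

    import cv2
    import numpy as np

    def calibrate_system(images, R_gripper2base, t_gripper2base,
                         pattern=(9, 6), square=0.010):
        """Sketch of step S01: camera intrinsics plus hand-eye calibration.

        images          : list of BGR calibration-board images
        R_gripper2base  : per-image 3x3 rotations of the robot end (from the controller)
        t_gripper2base  : per-image 3x1 translations of the robot end
        pattern, square : assumed chessboard geometry (inner corners, square size in m)
        """
        # 3-D coordinates of the board corners in the board's own frame
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

        obj_points, img_points = [], []
        for img in images:
            gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                obj_points.append(objp)
                img_points.append(corners)
        # (In practice, the robot poses for images where detection failed
        #  must be dropped as well, so the two lists stay aligned.)

        # Camera intrinsic matrix K, distortion, and the board pose per image
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            obj_points, img_points, gray.shape[::-1], None, None)

        # Hand-eye calibration (eye-in-hand): camera-to-gripper transform
        R_target2cam = [cv2.Rodrigues(r)[0] for r in rvecs]
        R_cam2gripper, t_cam2gripper = cv2.calibrateHandEye(
            R_gripper2base, t_gripper2base, R_target2cam, tvecs)

        return K, dist, R_cam2gripper, t_cam2gripper

The intrinsic matrix and the hand-eye transform obtained here are what a later pose-measurement step needs in order to map a marker pose from the camera coordinate system into the robot base coordinate system.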

Embodiment 2

[0128] As shown in Figure 3, the special marker designed for the template-matching-based robot spatial positioning and grasping control method of the present invention is a square surrounded by a black border, with a black pattern in its interior that can represent the material tilt information. This marker is only one example of a special marker and can be modified as needed.
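
Given such a marker, its pose in the camera coordinate system can be recovered from a single monocular image by detecting the four corners of the black border and solving a perspective-n-point problem with the known physical side length. The sketch below is one possible realisation and is not taken from the patent: the contour-based detection, the 40 mm side length and the use of OpenCV's solvePnP are illustrative assumptions.

    import cv2
    import numpy as np

    def estimate_marker_pose(gray, K, dist, side=0.040):
        """Sketch: pose of a square, black-bordered marker in the camera frame.

        gray    : grayscale image containing the marker
        K, dist : camera intrinsics and distortion from the calibration step
        side    : assumed marker side length in metres (illustrative value)
        """
        # Binarise and look for the largest quadrilateral contour (the black border)
        _, thresh = cv2.threshold(gray, 0, 255,
                                  cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for c in sorted(contours, key=cv2.contourArea, reverse=True):
            approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
            if len(approx) == 4:
                corners = approx.reshape(4, 2).astype(np.float32)
                break
        else:
            return None  # no square candidate found

        # Marker corners in the marker's own frame (z = 0 plane).  NOTE: the
        # detected image corners must be re-ordered to match this ordering
        # before solving.
        h = side / 2.0
        obj = np.array([[-h,  h, 0], [ h,  h, 0],
                        [ h, -h, 0], [-h, -h, 0]], dtype=np.float32)

        # Perspective-n-point solve: marker pose (rvec, tvec) in the camera frame
        ok, rvec, tvec = cv2.solvePnP(obj, corners, K, dist,
                                      flags=cv2.SOLVEPNP_IPPE_SQUARE)
        return (rvec, tvec) if ok else None

Because the solve returns a full 6-DOF pose, the material's inclination and rotation are recovered together with its position.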



Abstract

The invention discloses a robot spatial positioning and grasping control method based on template matching. The method comprises the following steps: multiple calibration board images and the corresponding robot end pose information are collected, and the visual positioning control system is calibrated; an image of a special marker is collected to create the template to be matched; the relative pose between the grasping position and the special marker is calculated; the robot visual positioning control system processes the collected material image; and the robot obtains the relative pose from the pose vector of the special marker in the robot coordinate system and the robot end pose recorded during taught grasping, so that spatial positioning and grasping of the material are achieved. The method enables accurate positioning and grasping when the material has a height deviation or is inclined, solves the problem of spatial uncertainty such as inclination and rotation of the material during grasping, and has the advantages of flexible calibration and a low calibration frequency.
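
One plausible reading of the last step of the abstract is that the taught grasp pose is stored relative to the special marker and then re-applied to the marker pose measured for the new material position. The sketch below illustrates that composition with 4x4 homogeneous transforms; the variable names and the formula are assumptions for illustration, since the text shown here does not state them explicitly.

    import numpy as np

    def to_homogeneous(R, t):
        """Pack a 3x3 rotation and a translation into a 4x4 homogeneous transform."""
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = np.ravel(t)
        return T

    def grasp_pose(T_base_marker_teach, T_base_end_teach, T_base_marker_now):
        """Illustrative composition of the taught grasp with the measured marker pose.

        T_base_marker_teach : marker pose in the robot base frame at teaching time
        T_base_end_teach    : robot end pose at the taught grasp
        T_base_marker_now   : marker pose measured for the new material position
        """
        # Grasp pose expressed relative to the marker, recorded once at teaching time
        T_marker_grasp = np.linalg.inv(T_base_marker_teach) @ T_base_end_teach
        # Re-apply that relative pose to the newly measured marker pose
        return T_base_marker_now @ T_marker_grasp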

Description

Technical field

[0001] The invention relates to the technical field of visual positioning, and in particular to a template-matching-based robot spatial positioning and grasping control method.

Background technique

[0002] Today, robots are widely used in industrial fields. Introducing robots not only reduces manual workload but also greatly improves factory automation and reduces costs. One of the key applications of robots is the positioning and grasping of materials. Common scenarios are based on teaching and offline programming, which require the relative position between the robot and the material to be fixed; when the position of the material changes, re-teaching or re-programming is needed, which is not flexible enough. Vision technology can improve this situation. However, existing material positioning and grasping is mostly planar positioning, which requires that the camera imaging plane is always parallel to the plane whe...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16
CPC: B25J9/1697; B25J9/1602
Inventors: 刘华赫海斌董佳颖陈双叶欣何守龙
Owner: HANGZHOU ZHONGWEI PHOTOELECTRICITY