
Combination Control Method Based on Object Image Feature Point Pixel Spatial Position

A technology relating to spatial position and combined control, applied to image analysis, image data processing, program-controlled manipulators, etc. It addresses the problems of low control-information accuracy, divergence of action errors, and loss of locking accuracy, with the effects of improved self-compensation and adjustment capability, higher adjustment accuracy, suppressed action errors, and improved reliability.

Active Publication Date: 2019-12-03
LIYUAN HYDRAULIC SYST GUIYANG

AI Technical Summary

Problems solved by technology

[0002] An existing 7-axis serial robot is generally composed of a 6-axis robot and a slide rail (gantry). The 6-axis robot has its own controller, and its control must be coordinated with the control of the slide rail (gantry) drive motor. This coordination suffers from loss of locking precision and return error, and the error accumulates and grows with time and with repeated round trips, causing the action error to diverge severely and the accuracy of the control information to drop.
Therefore, the existing 7-axis serial robot has the problems of severe divergence of action errors and low accuracy of control information.



Examples


Embodiment 1

[0031] Embodiment 1. A combination control method based on the pixel spatial position of object image feature points, as shown in Figure 1, comprising the following steps:

[0032] a. Install a 2D vision system camera so that the x-axis, y-axis, and z-axis of the camera coordinate system coincide with the front-back, left-right, and up-down coordinate axes of the robot system;

[0033] b. Use the 2D vision system camera to capture images of the features of the object grasped by the robot's end axis; the image collected at time T0 is I0, and the image collected at time Tn is In;

[0034] c. Use the ORB feature point detection method to detect the feature points in I0, obtaining their coordinate set P0;

[0035] d. With I0, P0, and In as input, obtain through the LK optical flow pyramid method the coordinate set, in In, of the feature points of I0;

[0036] e. With In and I0 as input, the feature point set P is obtained by the LK optical flow ...
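
Steps b-e map naturally onto standard computer vision routines. The following is a minimal sketch, assuming OpenCV as the vision library (the patent does not name one); the function detect_and_track and all parameter values are illustrative assumptions, not the patented implementation.

```python
# Sketch of steps b-e: ORB detects feature points in I0, and the pyramidal
# LK optical flow tracks them forward into In and back into I0, giving the
# back-tracked set P for a later consistency check.
import cv2
import numpy as np

def detect_and_track(I0, In):
    """Return P0 (ORB points in I0), Pn (tracked into In), P (tracked back into I0)."""
    g0 = cv2.cvtColor(I0, cv2.COLOR_BGR2GRAY)
    gn = cv2.cvtColor(In, cv2.COLOR_BGR2GRAY)

    # Step c: ORB feature point detection on I0 -> coordinate set P0
    orb = cv2.ORB_create(nfeatures=500)
    P0 = np.float32([kp.pt for kp in orb.detect(g0, None)]).reshape(-1, 1, 2)

    # Step d: I0, P0, In as input; the LK optical flow pyramid gives the
    # coordinates of I0's feature points in In
    lk = dict(winSize=(21, 21), maxLevel=3,
              criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
    Pn, st_fwd, _ = cv2.calcOpticalFlowPyrLK(g0, gn, P0, None, **lk)

    # Step e: In, I0 as input; track the same points back to I0 -> set P
    P, st_bwd, _ = cv2.calcOpticalFlowPyrLK(gn, g0, Pn, None, **lk)

    ok = (st_fwd.ravel() == 1) & (st_bwd.ravel() == 1)
    return P0[ok], Pn[ok], P[ok]
```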

Embodiment 2

[0052] Embodiment 2. A combination control method based on the pixel spatial position of object image feature points, as shown in Figure 1, comprising the following steps:

[0053] a. Install a 2D vision system camera so that the x-axis, y-axis, and z-axis of the camera coordinate system coincide with the front-back, left-right, and up-down coordinate axes of the robot system;

[0054] b. Use the 2D vision system camera to capture images of the features of the object grasped by the robot's end axis; the image collected at time T0 is I0, and the image collected at time Tn is In;

[0055] c. Use the ORB feature point detection method to detect the feature points in I0, obtaining their coordinate set P0;

[0056] d. With I0, P0, and In as input, obtain through the LK optical flow pyramid method the coordinate set, in In, of the feature points of I0;

[0057] e. With In and I0 as input, the feature point set P is obtained by the LK optical flow ...
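
As in Embodiment 1, the forward-backward tracking yields the original points P0 and their back-tracked positions P. The abstract then eliminates points whose geometric distance between corresponding points exceeds the mean. A minimal sketch of that rejection step follows; it is our interpretation of the abstract, and the helper name filter_by_mean_distance is hypothetical.

```python
# Sketch of the outlier rejection described in the abstract: compute the
# geometric distance between each original point in P0 and its back-tracked
# counterpart in P, then drop points whose distance exceeds the mean distance.
import numpy as np

def filter_by_mean_distance(P0, P):
    """Boolean mask keeping points whose forward-backward distance <= mean distance."""
    d = np.linalg.norm(P0.reshape(-1, 2) - P.reshape(-1, 2), axis=1)
    return d <= d.mean()

# Usage (continuing from detect_and_track above):
# keep = filter_by_mean_distance(P0, P)
# P0, Pn = P0[keep], Pn[keep]
```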



Abstract

The invention discloses a combination control method based on the pixel spatial position of object image feature points. The method comprises the following steps: a 2D vision system camera acquires images of the features of the object grasped by the robot's end axis; the ORB feature point detection method measures the coordinate set of the image feature points; the geometric distances between corresponding points of the coordinate sets are calculated, and points whose distance exceeds the mean value are eliminated; the speed of the object in the horizontal direction, and the speeds and displacements of the feature points in the image coordinate system, are calculated using the robot's spatial motion coordinate system; LK optical flow is used to calculate the angle error of the optical flow of each feature point, the feature point displacements solved from the robot motion control information are used as the ideal feature values of the optical flow, and points with larger angle errors are eliminated; the motion speed errors of the feature points in the pixel coordinate system are then calculated and used as control quantities. The method suppresses the robot's motion errors and improves the accuracy of the control information.
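
The remaining stages of this pipeline, filtering points by the angle error of their optical flow against the ideal flow derived from the robot motion command and using the pixel-coordinate velocity error as the control quantity, can be sketched as follows. This is an assumption-laden illustration: flow_ideal, dt, and angle_thresh are hypothetical names and parameters, and the abstract does not give the exact elimination threshold or the projection of the motion command into the pixel frame.

```python
# Sketch of the error terms in the abstract: angle error between measured
# and ideal optical flow per feature point, then the mean pixel-velocity
# error of the surviving points used as the control quantity.
import numpy as np

def angle_error(flow_meas, flow_ideal):
    """Angle (radians) between measured and ideal optical flow for each point."""
    dot = np.sum(flow_meas * flow_ideal, axis=1)
    norms = (np.linalg.norm(flow_meas, axis=1)
             * np.linalg.norm(flow_ideal, axis=1) + 1e-9)
    return np.arccos(np.clip(dot / norms, -1.0, 1.0))

def control_quantity(P0, Pn, flow_ideal, dt, angle_thresh=0.2):
    """Mean pixel-velocity error of retained feature points (the control quantity)."""
    flow_meas = Pn.reshape(-1, 2) - P0.reshape(-1, 2)   # measured displacement in pixels
    keep = angle_error(flow_meas, flow_ideal) <= angle_thresh
    v_err = (flow_meas[keep] - flow_ideal[keep]) / dt   # velocity error, pixels per second
    return v_err.mean(axis=0)
```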

Description

Technical field

[0001] The invention relates to a position control method for a 7-axis tandem robot (a 6-axis robot plus a slide rail (gantry)), and in particular to a combination control method based on the pixel spatial position of object image feature points.

Background technique

[0002] Existing 7-axis tandem robots are generally composed of a 6-axis robot and a slide rail (gantry). The 6-axis robot has its own controller, and its control must be coordinated with the drive motor control of the slide rail (gantry). This coordination suffers from loss of precision and return error, and the error accumulates and grows with time and with repeated round trips, resulting in severe divergence of the action error and lower accuracy of the control information. Therefore, existing 7-axis serial robot control has the problems of severe divergence of action errors and low accuracy of control information.

Summary of the invention

[0003] The purpose of the present invention ...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/00, G06T7/246, B25J9/16
CPC: B25J9/1664
Inventor: 黄志坚
Owner: LIYUAN HYDRAULIC SYST GUIYANG