
Robot Vision Guidance Method and Device Based on Integrating Global Vision and Local Vision

A robot vision and robotics technology, applied in the field of robot vision, which can solve problems such as low detection accuracy and failure to meet processing accuracy requirements.

Active Publication Date: 2020-11-06
北斗长缨(北京)科技有限公司

AI Technical Summary

Problems solved by technology

[0003] The main purpose of the present invention is to provide a robot vision guidance method and device based on the integration of global vision and local vision. It aims to solve the following technical problems of existing robot vision guidance schemes: with a large detection field of view, the detection accuracy is usually too low to meet processing accuracy requirements; to obtain high-precision spatial positioning information, the scheme must instead use a small detection field of view, so a larger processing target has to be detected block by block, which leads to very high computational complexity, a large amount of calculation, and long computation time. As a result, the overall system works inefficiently, places high demands on software and hardware performance, and can hardly achieve real-time processing, which does not meet the needs of the current high-speed industrial production process.
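To make the scale of this block-detection burden concrete, the following back-of-the-envelope sketch estimates how many blocks a small field of view implies for a large workpiece. Every number in it (part size, field of view, overlap, per-block cycle time) is an assumption for illustration and does not come from the patent.

```python
# Rough illustration (hypothetical numbers) of why a small detection field of
# view forces block-wise scanning of a large workpiece and drives up cost.
import math

part_w_mm, part_h_mm = 1200.0, 800.0   # assumed size of a large processing target
fov_w_mm, fov_h_mm = 60.0, 45.0        # assumed small high-precision field of view
overlap = 0.2                          # assumed 20% overlap between adjacent blocks

step_w = fov_w_mm * (1.0 - overlap)
step_h = fov_h_mm * (1.0 - overlap)
cols = math.ceil(part_w_mm / step_w)
rows = math.ceil(part_h_mm / step_h)
blocks = cols * rows

t_per_block_s = 0.8                    # assumed move + capture + detect time per block
print(f"{blocks} blocks, ~{blocks * t_per_block_s:.0f} s of acquisition per part")
```

Even with these modest assumptions the scan runs into hundreds of blocks per part, which is the efficiency problem the coarse-to-fine scheme is meant to relieve.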



Examples


Detailed Description of the Embodiments

[0034] It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0035] Various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "part", or "unit" used to denote elements serve only to facilitate the description of the present invention and have no specific meaning by themselves. Therefore, "module" and "component" may be used interchangeably.

[0036] Refer to Figure 1, which is a schematic flowchart of the first embodiment of the robot vision guidance method based on the integration of global vision and local vision according to the present invention. As shown in Figure 1, the robot vision guidance method based on the integration of global vision and local vision comprises the following steps:

[0037] S10. Collect processing target...
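The step listing above is truncated, but, consistent with the abstract, S10 starts by collecting the processing target with the global vision system for rough positioning. The snippet below is a minimal sketch of such coarse positioning under assumed details: an OpenCV threshold-and-contour approach that is not claimed to be the patented algorithm, with the `coarse_locate` function and the synthetic test frame introduced here purely for illustration.

```python
# Minimal sketch (not the patented algorithm) of coarse positioning: estimate
# the target's rough region in a global-view image so it can later be split
# into blocks for high-precision local inspection.
import cv2
import numpy as np

def coarse_locate(global_image_bgr):
    """Return the bounding box (x, y, w, h) of the largest foreground object."""
    gray = cv2.cvtColor(global_image_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    _, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)

# Usage with a synthetic image standing in for the global camera frame.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
cv2.rectangle(frame, (200, 150), (420, 330), (255, 255, 255), -1)
print(coarse_locate(frame))   # prints the rough bounding box of the white block
```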



Abstract

The invention discloses a robot vision guidance method and device based on the integration of global vision and local vision. The method integrates a global vision system with a local vision system: it first performs rough positioning of the processing target, divides the processing target into blocks, and carries out path planning over those blocks; it then uses a high-precision visual inspection system to accurately detect the target and guides the robot to perform high-precision, high-efficiency automatic grinding and polishing operations. The method thus meets the precision requirements for the efficient processing of large processing targets.
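As a concrete illustration of the "divide into blocks, then plan a path" part of this flow, the sketch below tiles a coarse region of interest into blocks sized for the local high-precision field of view and visits them in a serpentine order so consecutive blocks stay adjacent. The tiling scheme, the serpentine ordering, and all function names are assumptions of this sketch, not the inventors' implementation.

```python
# Self-contained sketch (assumed details): tile the coarse ROI from the global
# camera into blocks matching the local high-precision FOV, then order them so
# the robot-mounted camera travels a short, continuous path.
from typing import List, Tuple

Block = Tuple[int, int, int, int]  # x, y, width, height (pixels in the global image)

def split_into_blocks(roi: Block, block_w: int, block_h: int) -> List[List[Block]]:
    """Tile the ROI into a grid of blocks, row by row (edge blocks may be smaller)."""
    x0, y0, w, h = roi
    grid = []
    for y in range(y0, y0 + h, block_h):
        row = []
        for x in range(x0, x0 + w, block_w):
            row.append((x, y, min(block_w, x0 + w - x), min(block_h, y0 + h - y)))
        grid.append(row)
    return grid

def plan_serpentine_path(grid: List[List[Block]]) -> List[Block]:
    """Order blocks in a zigzag so consecutive blocks are adjacent (short robot moves)."""
    path = []
    for i, row in enumerate(grid):
        path.extend(row if i % 2 == 0 else reversed(row))
    return path

roi = (200, 150, 220, 180)            # coarse bounding box from the global view
grid = split_into_blocks(roi, 64, 64)
path = plan_serpentine_path(grid)
print(len(path), "blocks to inspect, first three:", path[:3])
```

Each block in the resulting path would then be handed to the local high-precision system for fine detection before the robot executes the grinding or polishing pass.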

Description

Technical Field

[0001] The invention relates to the field of robot vision, and in particular to a robot vision guidance method and device based on integrating global vision and local vision.

Background Technique

[0002] Automation equipment (robot systems), a powerful tool for building a strong manufacturing country, must move towards high speed and intelligence. An important means of making automation equipment intelligent is to equip the machine with "eyes" and a "brain" that can cooperate with those eyes. The "eye" can be a monocular camera, a binocular camera, a multi-camera rig, a 3D scanner, or an RGB-D (RGB + Depth) sensor. The core work of intelligent automation equipment includes analyzing the image data obtained by this "eye" (for example, image recognition) and then, based on the analysis results, guiding the robot system to complete specific processing or assembly operations. With the advancement of processing technology, the surface of the parts to be processed is becoming more an...
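For readers unfamiliar with how the "eye" feeds the "brain", the sketch below shows the common step of mapping a point detected in camera coordinates into the robot's base frame via a known hand-eye transform; the matrix values and the function name are assumptions for illustration and are not taken from the patent.

```python
# Illustrative only (assumed calibration values): map a point located by the
# vision system into the robot base frame so the controller can act on it.
# T_base_cam is assumed to come from a prior hand-eye calibration step.
import numpy as np

T_base_cam = np.array([                # assumed 4x4 extrinsic transform (mm)
    [0.0, -1.0, 0.0, 500.0],
    [1.0,  0.0, 0.0, -120.0],
    [0.0,  0.0, 1.0, 300.0],
    [0.0,  0.0, 0.0, 1.0],
])

def camera_to_base(p_cam_mm: np.ndarray) -> np.ndarray:
    """Transform a 3D point from camera coordinates to robot-base coordinates."""
    p_h = np.append(p_cam_mm, 1.0)     # homogeneous coordinates
    return (T_base_cam @ p_h)[:3]

# e.g. a feature located by the local vision system at (12.5, -3.0, 250.0) mm
print(camera_to_base(np.array([12.5, -3.0, 250.0])))
```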


Application Information

Patent Type & Authority: Patent (China)
IPC (8): B25J9/16
CPC: B25J9/1697; B25J9/1664
Inventors: 刁世普, 秦磊, 郑振兴
Owner: 北斗长缨(北京)科技有限公司