Target identification and capture positioning method based on deep learning

A deep learning and target recognition technology, applied in the field of machine vision, which can solve problems such as the poor generalization ability and robustness of traditional methods.

Active Publication Date: 2018-10-12
BEIJING UNIV OF TECH
Cites 3 · Cited by 70

AI Technical Summary

Problems solved by technology

The image processing in the recognition stage of a traditional manipulator uses feature extraction to process image information. The process of fe...




Embodiment Construction

[0056] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0057] In order to solve the problems of the traditional vision algorithms described above, a target recognition and grasping positioning method based on deep learning is proposed. First, a Kinect camera collects the depth and color images of the scene, and the Faster R-CNN deep learning algorithm identifies the targets in the scene. The target region to be grasped is selected according to the recognized category and used as the input of the GrabCut image segmentation algorithm, which extracts the contour of the target and thereby obtains its specific position. This position information is then used as the input of a cascade neural network to detect the optimal grasping position, finally yielding the grasping position and posture of the manipulator. The overall flow of the method is shown in figure 1; the specific im...



Abstract

The invention discloses a target identification and capture positioning method based on deep learning, belonging to the field of machine vision. The method comprises the following steps: first, a Kinect camera collects the depth and color images of a scene; then, a Faster R-CNN (Regions with Convolutional Neural Network features) deep learning algorithm identifies a scene target; according to the identified category, a capture target area is selected as the input of a GrabCut image segmentation algorithm; through image segmentation, the outline of the target is obtained, giving the specific position of the target, which serves as the input of a cascade neural network for optimal capture position detection; finally, the capture position and capture gesture of a mechanical arm are obtained. Through this method, the real-time performance, accuracy and intelligence of target identification and positioning can be improved.
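Going from the segmented outline to the "specific position of the target" can be illustrated with the standard pinhole camera model: the mask's pixel centroid, combined with the Kinect depth value at that pixel, back-projects to a 3D point in the camera frame. The intrinsics (`fx`, `fy`, `cx`, `cy`) and depth value below are illustrative placeholders, not values from the patent.

```python
import numpy as np

def mask_centroid(mask):
    """Pixel centroid (u, v) of a binary segmentation mask."""
    vs, us = np.nonzero(mask)
    return us.mean(), vs.mean()

def pixel_to_camera(u, v, z, fx, fy, cx, cy):
    """Back-project pixel (u, v) at depth z (meters) into camera coordinates
    via the pinhole model: X = (u - cx) * z / fx, Y = (v - cy) * z / fy."""
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

# Toy mask: the target occupies a square whose centroid is at pixel (90, 65).
mask = np.zeros((120, 160), np.uint8)
mask[40:91, 60:121] = 1

u, v = mask_centroid(mask)
# Hypothetical Kinect-like intrinsics and a sampled depth of 0.8 m.
point = pixel_to_camera(u, v, 0.8, fx=525.0, fy=525.0, cx=80.0, cy=60.0)
```

This 3D position is the kind of input the abstract's cascade neural network would refine into an optimal grasp position and gesture.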

Description

technical field

[0001] The invention belongs to the field of machine vision. A target recognition and grasping positioning method based on deep learning is proposed to improve the real-time performance, accuracy and intelligence of target recognition and positioning.

Background technique

[0002] With the development of industrial automation technology, the number of robots is increasing, and industrial robots have been widely used in many areas of life, such as automobile manufacturing, machining, electrical and electronics, and smart home services. The level of robot technology reflects, to a certain extent, the development level of a country's automation. With the development of the social economy, the expansion of production scale and the growing complexity of production environments, the development and production of more intelligent, information-based and high-precision automation systems has become particularly important. In the field of machine vision, target recognition ...

Claims


Application Information

IPC (8): G06T7/73, G06T7/11, G06T7/194, G06K9/62
CPC: G06T7/11, G06T7/194, G06T7/74, G06T2207/10024, G06T2207/20081, G06T2207/20084, G06T2207/30164, G06F18/23213, G06F18/2411
Inventor: 贾松敏, 鞠增跃, 张国梁, 李秀智, 张祥银
Owner: BEIJING UNIV OF TECH