
Object grabbing method of robot

A robot and robot-movement technology, applied to manipulators, manufacturing tools, and the like; it solves the problems of low positioning accuracy and unsuitability for dark environments, and achieves the effects of grasping objects accurately, improving accuracy, and expanding the scenes and scope in which the robot can be used.

Inactive Publication Date: 2017-09-05
WUHAN UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0007] An embodiment of the present invention provides a method for a robot to grab an object, which is used to solve the technical problems of low positioning accuracy and unsuitability for low-light environments in the existing method of using the robot's own vision for positioning.


Examples


Embodiment 1

[0046] This embodiment provides a method for a robot to grab an item, the method comprising:

[0047] Step S101: Obtaining the first position of the robot and the second position of the item, wherein the first position and the second position come from a Kinect infrared device;

[0048] Step S102: According to the first position and the second position, obtain a first movement trajectory from the robot to the object;

[0049] Step S103: Obtain a target position according to the first movement trajectory, so that the robot moves to the target position to grab the object.

[0050] In the above method, the Kinect infrared device obtains the first position of the robot and the second position of the article and sends both positions to the robot; a data connection is established between the Kinect infrared device and the robot so that the two can interact. The Kinect infrared device can acquire accurate location information, which can...
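The embodiment only states that a trajectory and a target position are obtained from the two positions; as a rough illustration of steps S101 to S103, a minimal Python sketch is given below. It assumes the Kinect device reports both positions as 3D points in a shared frame, that the first movement trajectory is a straight-line interpolation between them, and that the target position is placed a fixed grasping offset short of the object. The function names, the interpolation, and the offset value are illustrative assumptions, not details from the patent.

```python
import numpy as np

GRASP_OFFSET = 0.15  # assumed gripper reach in metres; not specified in the patent


def first_movement_trajectory(first_position, second_position, steps=50):
    """Step S102: straight-line trajectory from the robot (first position)
    to the object (second position), sampled as `steps` way-points.
    Linear interpolation is an assumption; the patent only says a trajectory
    is obtained from the two positions."""
    p0 = np.asarray(first_position, dtype=float)
    p1 = np.asarray(second_position, dtype=float)
    return [p0 + t * (p1 - p0) for t in np.linspace(0.0, 1.0, steps)]


def target_position(trajectory, grasp_offset=GRASP_OFFSET):
    """Step S103: pick the point on the trajectory that stops `grasp_offset`
    metres short of the object, so the gripper can still reach it."""
    p0, p1 = trajectory[0], trajectory[-1]
    direction = (p1 - p0) / np.linalg.norm(p1 - p0)
    return p1 - grasp_offset * direction


# Step S101: positions as reported by the Kinect infrared device (example values, metres).
robot_xyz = (0.0, 0.0, 0.0)
object_xyz = (1.2, 0.4, 0.0)

trajectory = first_movement_trajectory(robot_xyz, object_xyz)
print("target position:", target_position(trajectory))
```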

Embodiment 2

[0077] Based on the same inventive concept as Embodiment 1, Embodiment 2 of the present invention also provides a method for a robot to grab an item, described from the perspective of the Kinect infrared device, the method comprising:

[0078] Step S201: the Kinect infrared device acquires the first position of the robot and the second position of the item;

[0079] Step S202: Sending the first position and the second position to the robot, so that the robot obtains a target position according to the first position and the second position so as to grab the object.
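The patent does not say how the data connection between the Kinect infrared device and the robot is realized. Purely as an illustration of step S202, the sketch below pushes the two positions to the robot as one JSON message over a TCP socket; the host address, port, and message format are all assumptions.

```python
import json
import socket

ROBOT_HOST, ROBOT_PORT = "192.168.1.50", 9000  # hypothetical robot address


def send_positions(first_position, second_position,
                   host=ROBOT_HOST, port=ROBOT_PORT):
    """Step S202: send the two positions from the Kinect host to the robot
    over the data connection as a single JSON message."""
    message = json.dumps({
        "first_position": list(first_position),    # robot position (x, y, z)
        "second_position": list(second_position),  # item position (x, y, z)
    }).encode("utf-8")
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(message)


# Example call (raises a connection error unless a robot is actually
# listening at the hypothetical address above).
send_positions((0.0, 0.0, 0.0), (1.2, 0.4, 0.0))
```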

[0080] Specifically, in the method for grabbing an item provided in an embodiment of the present invention, the Kinect infrared device acquires the first position of the robot and the second position of the item as follows (an illustrative sketch follows the list):

[0081] Obtain a spatial depth image;

[0082] Obtain pixel coordinates and depth coordinates according to the spatial depth image;

[0083] Obtain the first position of the robot and the second positi...
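The listed steps go from a spatial depth image to pixel and depth coordinates and then to positions, but the patent gives no formulas. A common way to do this is to back-project each depth pixel with a pinhole camera model; the sketch below uses nominal Kinect v1 depth-camera intrinsics, which are an assumption and not values from the patent.

```python
import numpy as np

# Nominal pinhole intrinsics for a Kinect v1 depth camera (approximate values,
# used only for illustration; the patent does not give any numbers).
FX, FY = 580.0, 580.0      # focal lengths in pixels
CX, CY = 320.0, 240.0      # principal point for a 640x480 depth image


def pixel_depth_to_position(u, v, depth_m):
    """Back-project a depth pixel (u, v) with depth `depth_m` (metres)
    into a 3D point in the Kinect camera frame."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])


def positions_from_depth_image(depth_image_m, robot_pixel, object_pixel):
    """From one spatial depth image, read the pixel and depth coordinates of
    the robot and of the item, then obtain their 3D positions."""
    (ur, vr), (uo, vo) = robot_pixel, object_pixel
    first_position = pixel_depth_to_position(ur, vr, depth_image_m[vr, ur])
    second_position = pixel_depth_to_position(uo, vo, depth_image_m[vo, uo])
    return first_position, second_position


# Example with a synthetic flat depth image 2 m away and hand-picked pixels.
depth = np.full((480, 640), 2.0)
first, second = positions_from_depth_image(depth, robot_pixel=(100, 300),
                                           object_pixel=(500, 260))
print("first position (robot):", first, "second position (item):", second)
```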



Abstract

The invention discloses an object grabbing method of a robot. The method includes the steps that a first position of the robot and a second position of an object are obtained, wherein the first position and the second position come from Kinect infrared equipment; a first motion trail from the robot to the object is obtained according to the first position and the second position; and a target position is obtained according to the first motion trail so that the robot can move to the target position and grab the object. By means of the object grabbing method, object positioning accuracy is improved, so the robot grabs objects more accurately and its usable scenes and scope are enlarged, and the technical problems of the prior art, namely that positioning with the robot's own vision has low accuracy and is unsuitable for dimly lit environments, are solved.

Description

Technical field

[0001] The invention relates to the technical field of intelligent service robots, and in particular to a method for a robot to grab objects.

Background technique

[0002] With the development of artificial intelligence technology, intelligent service robots are widely used in various home services. In a smart home environment, robots can receive instructions from people to grab and deliver needed items.

[0003] In the prior art, methods for grabbing an item with a robot generally use a camera on the robot body or an external camera to locate the item and thereby realize the grab.

[0004] When the inventor of the present application worked out the technical solution of the present invention, he found that at least the following problems existed in the prior art:

[0005] The existing method of locating objects through the camera on the robot body has relatively high requirements for light, and is only suitable for daytime or places with suitable...


Application Information

IPC(8): B25J13/08, B25J11/00
CPC: B25J13/088, B25J11/008
Inventors: 王欣, 伍世虔, 韩浩, 邹谜, 王建勋, 张俊勇, 陈鹏, 杨超
Owner: WUHAN UNIV OF SCI & TECH