Robot system

A robot system, applied in the field of robotics, that solves problems such as the inability to grip an object and the inability to determine the position command for the robot arm when the object is out of place, achieving the effect of reducing the risk of crushing or dropping the gripped object.

Publication Date: 2010-06-16 (Inactive)
YASKAWA DENKI KK


Problems solved by technology

[0021] However, since the prior-art robot system of Patent Document 1 does not use image information, if the gripped object is not at its predetermined position the position command for the robot arm cannot be determined, and the object cannot be gripped.



Examples


Embodiment 1

[0070] Fig. 1 is an overall configuration diagram showing the configuration of a robot embodying the present invention.

[0071] In Fig. 1, 1 denotes a hand force sensor, 2 a hand, 3 an arm, 4 a torso force sensor, 5 a torso, 6 a camera, 7 a robot control unit, 8 an image processing unit, 10 an arm force sensor, and 11 a moving mechanism.

[0072] The overall configuration of a robot embodying the present invention will now be described with reference to Fig. 1.

[0073] The torso 5 of the robot carries two arm units 3, and a torso force sensor 4 for measuring the load on the torso 5 is arranged on it. Each arm unit 3 has a hand 2 at its front end. The hand 2 includes five fingers, each having a hand force sensor 1 at its tip. Two cameras 6 mounted on the torso 5 of the robot measure the shape of the gripped object. The robot control unit 7 controls the motion of the arm units 3. The image processing unit 8 of ...
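As a reading aid, the following is a minimal sketch (in Python, with purely illustrative class and field names) of the hardware hierarchy listed above: a torso with a torso force sensor and two cameras, two arm units each ending in a five-fingered hand, and a force sensor at each fingertip.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of the hardware layout described in [0073];
# the names below are illustrative, not taken from the patent.

@dataclass
class Finger:
    fingertip_force: float = 0.0       # reading of hand force sensor 1 at the fingertip

@dataclass
class Hand:
    # hand 2: five fingers, each carrying a force sensor at its tip
    fingers: List[Finger] = field(default_factory=lambda: [Finger() for _ in range(5)])

@dataclass
class Arm:
    # arm unit 3 with arm force sensor 10, ending in a hand 2
    hand: Hand = field(default_factory=Hand)
    arm_force: float = 0.0

@dataclass
class Robot:
    # torso 5 with torso force sensor 4, two arm units 3, and two cameras 6;
    # the robot control unit 7, image processing unit 8, and moving mechanism 11
    # are not modelled here
    arms: List[Arm] = field(default_factory=lambda: [Arm() for _ in range(2)])
    torso_force: float = 0.0
    camera_count: int = 2
```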

Embodiment 2

[0099] Figure 10 is a configuration diagram showing a robot system according to a second embodiment of the present invention.

[0100] In Fig. 10, 9 denotes a storage unit. Constituent elements labelled as in Fig. 1 are the same as in Fig. 1, and their description is omitted.

[0101] This embodiment differs from the first embodiment in that its robot system is provided with a storage unit 9 that, when an object is gripped, stores and retains as gripping object data one or more pieces of inherent attribute information related to the gripped object, such as its size, shape, mass, and color depth, together with data related to the gripping method.
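The following is a hedged sketch of such a storage unit, assuming a simple keyed record per gripped object; the attribute fields follow the enumeration above (size, shape, mass, color depth, gripping method), while the class names and types are assumptions rather than the patented implementation.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class GrippingObjectData:
    # inherent attribute information of the gripped object, per [0101]
    size: Tuple[float, float, float]        # assumed bounding dimensions [m]
    shape: str                              # e.g. "box", "cylinder"
    mass: Optional[float] = None            # [kg], measurable once the object is lifted
    color_depth: Optional[float] = None
    gripping_method: Optional[str] = None   # data related to the gripping method

class StorageUnit:
    """Illustrative stand-in for storage unit 9: stores and retains gripping object data."""

    def __init__(self) -> None:
        self._records: Dict[str, GrippingObjectData] = {}

    def store(self, object_id: str, data: GrippingObjectData) -> None:
        self._records[object_id] = data

    def lookup(self, object_id: str) -> Optional[GrippingObjectData]:
        # a known object can be re-gripped using the stored attributes and method
        return self._records.get(object_id)
```

A usage example under the same assumptions: `StorageUnit().store("bottle_01", GrippingObjectData(size=(0.07, 0.07, 0.20), shape="cylinder"))`.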

[0102] Fig. 11 is a flowchart illustrating the operation of the robot system according to the second embodiment of the present invention. In addition, the same step S...

Embodiment 3

[0111] The structure of the robot system of this embodiment is the same as that of the first embodiment shown in Fig. 1 and Fig. 2, so its description is omitted.

[0112] The robot system of this embodiment differs from the first embodiment in that the operation of determining the gripping position from the size and shape of the gripped object obtained from the images of the cameras 6, that is, step ST41, is placed between step ST11 and step ST12 of the first embodiment.

[0113] Fig. 12 is a flowchart illustrating the operation of the robot system according to the third embodiment of the present invention. Step ST numbers that are the same as in Fig. 4 of the first embodiment indicate the same processing contents as in Fig. 4.

[0114] Next, the operation in the robot system of this embodiment will be described with reference to FIG. 12 .

[0115] In step ST11, the size and shape of the gripping object are calculated using ...
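To make the reordering concrete, here is a minimal sketch of the sequence as described for this embodiment; the function names, signatures, and placeholder bodies are assumptions, not the patented processing.

```python
# Step ST11: compute the size and shape of the gripping object from the camera images.
def calculate_size_and_shape(images):
    # placeholder: in the patent this uses the two cameras 6 and image processing unit 8
    return (0.10, 0.05, 0.05), "box"

# Step ST41 (placed between ST11 and ST12 in this embodiment): determine the gripping position.
def determine_gripping_position(size, shape):
    # placeholder heuristic: grip at the geometric centre of the object
    return tuple(d / 2.0 for d in size)

def grip_sequence(images):
    size, shape = calculate_size_and_shape(images)             # ST11
    grip_position = determine_gripping_position(size, shape)   # ST41
    # ... step ST12 and the remaining steps of the first embodiment follow here
    return grip_position
```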



Abstract

The invention provides a robot system capable of safely and stably holding an object with an optimum holding method and holding force. A robot control section (7) has an object information calculation section (21) for calculating, based on image information from an image processing section (8), the size and shape of an object to be held; a holding method determination section (22) for determining, based on the object information calculated, a method for holding the object; a holding execution section (23) for executing lifting of the object by the holding method determined; a sensor information processing section (24) for processing pieces of sensor information obtained at the time of the execution, for each combination of one or more of the pieces of information, and controlling holding force; and a holding method correction section (25) for correcting, based on the result of the processing of the pieces of sensor information, the method of holding the object.
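Read as a pipeline, the abstract describes five sections executed in order. The sketch below (Python, with assumed method names and placeholder logic) only illustrates that ordering; it is not the patented implementation.

```python
class RobotControlSection:
    """Illustrative stand-in for robot control section (7) and its sub-sections (21)-(25)."""

    def hold_object(self, image_info):
        size, shape = self.calculate_object_info(image_info)        # section 21
        method = self.determine_holding_method(size, shape)         # section 22
        sensor_info = self.execute_holding(method)                  # section 23
        result = self.process_sensor_info(sensor_info)              # section 24
        return self.correct_holding_method(method, result)          # section 25

    # Placeholder bodies; the real sections work on camera images and force-sensor data.
    def calculate_object_info(self, image_info):
        return image_info.get("size"), image_info.get("shape")

    def determine_holding_method(self, size, shape):
        grip = "two-finger pinch" if shape == "thin plate" else "whole-hand grasp"
        return {"grip": grip, "force": 1.0}

    def execute_holding(self, method):
        return {"fingertip_forces": [0.2] * 5, "slip": 0.0}

    def process_sensor_info(self, sensor_info):
        return {"slipping": sensor_info["slip"] > 0.0}

    def correct_holding_method(self, method, result):
        if result["slipping"]:
            method["force"] *= 1.2      # raise holding force when slip is detected
        return method
```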

Description

Technical field

[0001] The present invention relates to a robot system including a robot arm equipped with a robot hand, and more particularly to a robot system that performs gripping based on image and sensor information.

Background technique

[0002] An existing robot system is configured as follows: the contact sensor, position sensor, force sensor, and offset sensor of the hand are used to correct the hand drive output signal, and whether gripping of the object has been achieved is judged from the output of each sensor, so that accurate gripping is realized (for example, refer to Patent Document 1).

[0003] In addition, in order to grasp the position of the gripped object, image information from multiple cameras is used to simulate the posture and position of the hand when gripping the object, the relative relationship between the gripped object and the hand is evaluated according to a certain index, and the optimal hand posture and position are selected among all re...


Application Information

Patent Type & Authority: Patent (China)
IPC (IPC8): B25J13/08, B25J5/00, B25J15/08
CPC: G05B2219/40607, B25J15/0009, G05B2219/39523, B25J5/00, G05B2219/40564, B25J13/085, B25J9/1612, B25J13/084, B25J15/08, G05B19/02
Inventor: 中元善太, 松熊研司, 半田博幸
Owner: YASKAWA DENKI KK