
A Vision-Based Perception Method for External State of Spatial Cellular Robot

A vision-based state-perception technology in robotics, applied to manipulators, program-controlled manipulators, manufacturing tools, etc. It addresses problems such as method failure, large image changes in the target area, and increased depth uncertainty with reduced accuracy, and achieves a wide range of applicability.

Active Publication Date: 2021-11-02
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0004] Although traditional infrared or ultrasonic sensors can measure distance, they cannot determine the type of the measured target. When measuring the relative rotation angle, a matching infrared sensor must be installed on the measured object and the angle is inferred from the change in the range of the received signal, so the detectable angle range is small. Distance measurement based on monocular vision generally adopts triangulation, but this method requires an initialization translation of the camera before measurement, and triangulation cannot compute distance for a monocular camera undergoing pure rotational motion. During initialization, if the translation distance is too small, the depth uncertainty becomes large and the accuracy is reduced, while a large translation causes the image of the target area to change greatly, which may invalidate the method; in addition, the relative rotation angle cannot be measured.
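As a rough illustration of the depth-uncertainty issue described above, the following sketch (not part of the patent; the rectified two-view geometry, symbols, and numeric values are assumed for illustration only) shows how depth recovered by triangulation degrades as the camera's initialization translation (baseline) shrinks:

```python
# Minimal sketch (not the patent's method): depth from triangulation with a
# translated monocular camera, and how a small baseline inflates depth error.
# Assumes a rectified two-view geometry where depth Z = f * b / d
# (f: focal length in pixels, b: baseline in metres, d: disparity in pixels).

def triangulated_depth(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from its disparity between the two camera positions."""
    return f_px * baseline_m / disparity_px

def depth_error_per_pixel(f_px: float, baseline_m: float, depth_m: float) -> float:
    """First-order depth error caused by a 1-pixel disparity error:
    |dZ/dd| = Z^2 / (f * b), so the error grows as the baseline shrinks."""
    return depth_m ** 2 / (f_px * baseline_m)

if __name__ == "__main__":
    f_px, depth = 800.0, 2.0  # hypothetical focal length and true depth
    for baseline in (0.01, 0.05, 0.20):  # metres of camera translation
        err = depth_error_per_pixel(f_px, baseline, depth)
        print(f"baseline {baseline:.2f} m -> ~{err:.3f} m depth error per pixel")
```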

Method used



Examples


Embodiment 1

[0044] Embodiment one: As shown in Figures 1-4, the vision-based external state perception method for a spatial cellular robot involved in this embodiment comprises the following specific steps:

[0045] (1) Monocular camera calibration of space cell robot

[0046] Based on the pinhole camera model, the internal and external parameter model of the camera is derived, as shown in formula (1):

[0047]  Z [u, v, 1]^T = K [R | t] [X, Y, Z, 1]^T,  where K = [[f_x, 0, c_x], [0, f_y, c_y], [0, 0, 1]]    (1)

[0048] where [X, Y, Z]^T is the coordinate of point P in the world coordinate system, [u, v]^T is the coordinate of P in the pixel coordinate system, α and β are the magnification factors from the pixel coordinate system to the image plane coordinate system on the u-axis and v-axis respectively, [c_x, c_y]^T is the translation vector from the pixel coordinate system to the image plane coordinate system, f is the focal length of the space cell robot's USB camera, and f_x = αf and f_y = βf are respectively the magnification factors from the pixel coordinate system to the world coordinate system along the x...
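The following is a minimal sketch of formula (1) in code, projecting a world point into pixel coordinates with the intrinsics f_x, f_y, c_x, c_y and the extrinsics [R | t]; all numeric values are placeholders, not the patent's calibration results:

```python
import numpy as np

# Minimal sketch of formula (1): pinhole projection of a world point P = [X, Y, Z]^T
# into pixel coordinates [u, v]^T using intrinsics K and extrinsics [R | t].
# The numeric values below are placeholders, not the patent's calibration results.

fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

R = np.eye(3)                   # camera aligned with the world frame (assumed)
t = np.array([0.0, 0.0, 0.0])   # no translation (assumed)

P_world = np.array([0.1, -0.05, 1.5])   # point in the world frame (metres)
P_cam = R @ P_world + t                 # world frame -> camera frame
u, v, w = K @ P_cam                     # camera frame -> homogeneous pixel coords
u, v = u / w, v / w                     # normalise by the depth Z
print(f"pixel coordinates: ({u:.1f}, {v:.1f})")
```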

Embodiment 2

[0073] Embodiment two: As shown in Figures 1-4, in the simulation verification of the vision-based external state perception method for a space cell robot involved in this embodiment, the USB camera of the space cell robot is calibrated by the Zhang Zhengyou calibration method using a 6×9 checkerboard calibration board. Pictures of the checkerboard calibration board are taken from different angles, and 20 pictures are selected for corner point extraction. The results of the Zhang Zhengyou calibration are shown in the following table:

[0074] Table 1 Calibration results of camera parameters

[0075]
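As a hedged illustration of the calibration procedure described above, the sketch below uses OpenCV's standard checkerboard routines; the image paths, square size, and the assumption that 6×9 denotes the grid of interior corners are placeholders, and the code is not the patent's implementation:

```python
import glob
import cv2
import numpy as np

# Minimal sketch of Zhang's checkerboard calibration along the lines of Embodiment 2.
# Assumes "6x9" refers to interior corners and that the board images live under
# "calib/*.jpg" with a 20 mm square size -- all of these are placeholders.

pattern = (9, 6)   # interior corners per row/column (assumed)
square = 0.020     # square edge length in metres (assumed)

# 3D corner positions on the board plane (Z = 0), identical for every view
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
for path in glob.glob("calib/*.jpg"):
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)

# Recover the intrinsic matrix K (f_x, f_y, c_x, c_y) and distortion coefficients
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("intrinsic matrix K:\n", K)
```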

[0076] Select 200 images of the space cell robot from different angles to establish a space cell robot image database. Carry out further feature-region labeling on the images in the database and label the parts to be identified. The data containing the feature label frames includes the image information of each pict...
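The patent does not specify an annotation format for the feature label frames; the following hypothetical sketch shows one way such a labeled image database could be represented and stored for later network training (all names and paths are illustrative):

```python
import json
from dataclasses import dataclass, asdict
from typing import List

# Hypothetical structure for the labeled image database described in [0076]:
# per picture, the image path and a bounding box (feature label frame) for each
# connection surface in pixel coordinates. Not the patent's actual format.

@dataclass
class LabelFrame:
    label: str      # e.g. "connection_surface"
    x_min: int
    y_min: int
    x_max: int
    y_max: int

@dataclass
class Sample:
    image_path: str
    frames: List[LabelFrame]

def save_database(samples: List[Sample], path: str) -> None:
    """Write the annotated database to a JSON file for later network training."""
    with open(path, "w") as fh:
        json.dump([asdict(s) for s in samples], fh, indent=2)

if __name__ == "__main__":
    db = [Sample("images/cell_robot_000.jpg",
                 [LabelFrame("connection_surface", 120, 80, 260, 210)])]
    save_database(db, "annotations.json")
```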



Abstract

The invention provides a vision-based method for sensing the external state of a space cell robot, which belongs to the field of space cell robot distance measurement. The invention establishes an image database of different types of cells of the space cell robot, labels the connection surfaces that need to be identified and autonomously connected in the database images, and performs deep-learning-based network training on the labeled images. Using the position of the connection surface in the pixel coordinate system, the conversion relationships among the pixel coordinate system, the image plane coordinate system and the world coordinate system, and the internal parameters of the camera, the relative distance and angle between the active connection surface and the passive connection surface are derived. The invention can not only measure the relative distance and relative rotation angle to a target object, but can also identify different types of target objects and obstacles according to the requirements of the space cell robot; the angle measurement range is larger, and the method is suitable for external state perception under any translational or rotational motion of the space cell robot.
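As a hedged sketch of the kind of pinhole-geometry reasoning the abstract alludes to (standard similar-triangle relations, not necessarily the patent's exact derivation), the example below estimates range and bearing to a detected connection surface from its bounding box, its known physical width, and the camera intrinsics; all values are placeholders:

```python
import math

# Minimal sketch (standard pinhole relations, not necessarily the patent's exact
# derivation): estimate range and bearing to a detected connection surface from
# its bounding box, the surface's known physical width, and camera intrinsics.

def estimate_range(fx: float, real_width_m: float, pixel_width: float) -> float:
    """Similar triangles: Z ~ f_x * W / w for a surface roughly facing the camera."""
    return fx * real_width_m / pixel_width

def estimate_bearing(fx: float, cx: float, u_center: float) -> float:
    """Horizontal angle (radians) between the optical axis and the box centre."""
    return math.atan2(u_center - cx, fx)

if __name__ == "__main__":
    fx, cx = 800.0, 320.0                # placeholder intrinsics
    box = (250.0, 180.0, 410.0, 300.0)   # hypothetical detection: x_min, y_min, x_max, y_max
    width_px = box[2] - box[0]
    u_mid = 0.5 * (box[0] + box[2])
    print("range   [m]:", estimate_range(fx, real_width_m=0.10, pixel_width=width_px))
    print("bearing [deg]:", math.degrees(estimate_bearing(fx, cx, u_mid)))
```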

Description

Technical Field

[0001] The invention relates to a vision-based external state perception method for a space cell robot, and belongs to the field of self-reconfiguration of space cell robots.

Background Technique

[0002] In order to meet the needs of different space on-orbit tasks, the space cell robot can change the configuration of the robot group through self-reconfiguration, so as to expand the functions of the unit cells and enlarge the working space of the robot. Self-reconfiguration of the space cell robot is achieved through autonomous connection between the unit cells; the key is to measure the relative distance and angle between the active and passive connection surfaces and then issue the corresponding control instructions. External state perception of the space cell robot is a method for determining the relative position and attitude between the space cell robot and the external environment, targets and obstacles, and...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): B25J9/16
CPC: B25J9/1617; B25J9/1697
Inventor: 安德孝, 蔡映凯, 田浩, 刘育强, 谢旭东
Owner: HARBIN INST OF TECH