
A multi-camera vision-inertial real-time positioning method and device for a robot

A multi-camera visual-inertial real-time positioning technology, applied in the field of robot navigation, which can solve the problems of indistinct visual features, blurred imaging, and highly repetitive feature textures.

Active Publication Date: 2021-05-28
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

When the camera's field of view is blocked by obstacles, when visual features are indistinct, or when the feature texture is highly repetitive and difficult to match, positioning loss often occurs.
In addition, when the robot moves so fast that the image is blurred, existing purely visual positioning methods also fail.


Detailed Description of the Embodiments

[0042] Below, the technical solution of the present invention will be further described in conjunction with the accompanying drawings and specific embodiments:

[0043] Figure 1 is a schematic flowchart of the multi-camera visual-inertial real-time positioning method for a robot according to the present invention. The present invention discloses a multi-camera visual-inertial real-time positioning method for a robot, comprising the following steps:

[0044] Obtain the current multi-camera images and inertial sensor data of the robot;
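As a concrete illustration of the data this step produces (a minimal sketch under my own assumptions about field names and types; the patent does not prescribe a data layout), one synchronized input packet could be represented as follows in Python:

# Minimal sketch: one timestamped image per camera of the multi-camera rig,
# plus the inertial samples collected since the previous frame.
from dataclasses import dataclass
from typing import List

import numpy as np


@dataclass
class ImuSample:
    timestamp: float        # seconds
    accel: np.ndarray       # (3,) specific force in the body frame, m/s^2
    gyro: np.ndarray        # (3,) angular velocity in the body frame, rad/s


@dataclass
class MultiCameraFrame:
    timestamp: float              # shared capture time of the synchronized shutters
    images: List[np.ndarray]      # one image per camera, each facing a different direction
    imu_window: List[ImuSample]   # IMU samples between the previous and the current frame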

[0045] Extract image feature points from the current images and estimate the current robot pose; reconstruct a 3D point cloud according to the current robot pose, and store historical and current point cloud data to maintain the visual point cloud map;
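One possible concrete reading of this step (a sketch only; OpenCV, ORB features, and a PnP solver are my assumptions, not the patent's prescribed implementation) is to match features of the current image against descriptors stored with the point cloud map and solve for the pose:

import cv2
import numpy as np


def estimate_pose_from_map(image, map_points_3d, map_descriptors, K):
    """Match current image features to the visual point cloud map and solve for the pose."""
    orb = cv2.ORB_create(nfeatures=1000)
    keypoints, descriptors = orb.detectAndCompute(image, None)

    # Match current descriptors against descriptors stored with the map points.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, map_descriptors)

    pts_2d = np.float32([keypoints[m.queryIdx].pt for m in matches])
    pts_3d = np.float32([map_points_3d[m.trainIdx] for m in matches])

    # Robust perspective-n-point estimation of the current camera pose.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(pts_3d, pts_2d, K, None)
    return rvec, tvec

Newly observed feature points with sufficient parallax can then be triangulated (for example with cv2.triangulatePoints) and appended to the map, which is one way the historical and current point cloud data could be maintained.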

[0046] Complete initialization and estimate the sensor bias values from the inertial sensor data, and pre-integrate to obtain the current speed and angle of the robot;
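A minimal sketch of the pre-integration in this step, assuming the bias values estimated during initialization are simply subtracted and basic Euler integration is used (practical systems use on-manifold pre-integration with noise propagation); it reuses the hypothetical ImuSample fields from the sketch above:

import numpy as np
from scipy.spatial.transform import Rotation as R


def preintegrate(imu_samples, gyro_bias, accel_bias):
    """Integrate bias-corrected IMU samples into relative rotation, velocity and position."""
    delta_R = R.identity()    # rotation accumulated over the interval
    delta_v = np.zeros(3)     # velocity change (gravity is accounted for at fusion time)
    delta_p = np.zeros(3)     # position change
    t_prev = imu_samples[0].timestamp

    for s in imu_samples[1:]:
        dt = s.timestamp - t_prev
        w = s.gyro - gyro_bias     # bias-corrected angular velocity
        a = s.accel - accel_bias   # bias-corrected acceleration (body frame)

        delta_p = delta_p + delta_v * dt + 0.5 * delta_R.apply(a) * dt ** 2
        delta_v = delta_v + delta_R.apply(a) * dt
        delta_R = delta_R * R.from_rotvec(w * dt)
        t_prev = s.timestamp

    return delta_R, delta_v, delta_p   # the angle and speed increments of the robot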

[0047] Optimize the current pose according to the visual point cloud map and the inertial sensor pre-integration.
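The optimization in this step can be sketched as a nonlinear least-squares problem whose residual stacks a visual term (reprojection of matched map points) and an inertial term (consistency with the IMU-predicted pose); the parameterization and weighting below are my assumptions, not the patent's formulation:

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R


def residuals(x, pts_3d, pts_2d, K, pose_pred, w_imu=1.0):
    """x = [rx, ry, rz, tx, ty, tz]: axis-angle rotation and translation of the current pose."""
    rot, t = R.from_rotvec(x[:3]), x[3:]

    # Visual term: reprojection error of matched map points into the current image.
    cam_pts = rot.apply(pts_3d) + t
    proj = (K @ cam_pts.T).T
    proj = proj[:, :2] / proj[:, 2:3]
    visual_res = (proj - pts_2d).ravel()

    # Inertial term: penalize deviation from the pose predicted by IMU pre-integration.
    imu_res = w_imu * (x - pose_pred)

    return np.concatenate([visual_res, imu_res])


# Usage: refine an initial estimate x0 (e.g. from PnP) using both terms.
# result = least_squares(residuals, x0, args=(pts_3d, pts_2d, K, pose_pred))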


Abstract

The invention discloses a multi-camera visual-inertial real-time positioning method and device for a robot. The method comprises, among other steps, generating a 3D point cloud and storing historical and current point cloud data to maintain the visual point cloud map; completing initialization and estimating the sensor bias values from the inertial sensor data; pre-integrating to obtain the current speed and angle of the robot; and optimizing the current pose according to the visual point cloud map and the inertial sensor pre-integration. The multi-camera rig used in the present invention exploits information from multiple viewing angles to provide a wider field of view. Because the cameras face different directions, it is difficult for the entire field of view to be blocked, and the visual features they provide are richer, which almost guarantees that the feature requirements for positioning can be met.
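To illustrate why differently oriented cameras can be fused into a single pose estimate (a sketch under my own assumptions; the transform names and numbers are illustrative, not from the patent), each camera's observations can be mapped into one robot body frame through its known extrinsic transform:

import numpy as np


def to_body_frame(point_cam, T_body_cam):
    """Transform a 3D point from one camera's frame into the robot body frame.

    T_body_cam: 4x4 homogeneous extrinsic of that camera with respect to the body.
    """
    p = np.append(point_cam, 1.0)      # homogeneous coordinates
    return (T_body_cam @ p)[:3]


# Example: a rear-facing camera still contributes constraints when the front
# view is blocked, because its points map into the same body frame.
T_body_rear = np.eye(4)
T_body_rear[:3, :3] = np.diag([-1.0, -1.0, 1.0])   # rotated 180 degrees about z
p_body = to_body_frame(np.array([0.5, 0.0, 2.0]), T_body_rear)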

Description

Technical Field

[0001] The invention relates to robot navigation technology, in particular to a multi-camera visual-inertial real-time positioning method and device for a robot.

Background Technique

[0002] At present, more and more types of robots appear in all aspects of production and life. In fields such as warehousing and logistics, and inspection and monitoring, the work requires robots to operate stably over long periods in a relatively fixed environment and to achieve precise self-positioning. When the camera's field of view is blocked by obstacles, when visual features are indistinct, or when the feature texture is highly repetitive and difficult to match, positioning or tracking loss often occurs. In addition, when the robot moves so fast that the image is blurred, existing purely visual positioning methods also fail. Multi-camera rigs use information from multiple perspectives (overlapping or non-overlapping) to provide a wider field of view ...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G01C21/00; G01C21/16; G01C11/04
CPC: G01C11/04; G01C21/005; G01C21/165
Inventors: 熊蓉, 傅博, 王越
Owner: ZHEJIANG UNIV