Multi-level scene reconstruction and rapid segmentation method, system and device for narrow space

A multi-level scene reconstruction technology, applied in image analysis, image data processing, instruments, etc. It addresses the problem that robot scene reconstruction and segmentation in a narrow space cannot take reconstruction accuracy and real-time computation into account at the same time, thereby speeding up reconstruction and improving real-time performance.

Active Publication Date: 2021-01-08
INST OF AUTOMATION CHINESE ACAD OF SCI +2

AI Technical Summary

Problems solved by technology

[0005] In order to solve the above-mentioned problem in the prior art, namely that the reconstruction and segmentation of a robot scene in a narrow space cannot take both reconstruction accuracy and real-time computation into account, the present invention provides a multi-level scene reconstruction and rapid segmentation method for a narrow space.


Examples


Embodiment Construction

[0059] The present application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the relevant invention, not to limit it. It should also be noted that, for convenience of description, only the parts related to the relevant invention are shown in the drawings.

[0060] It should be noted that, provided there is no conflict, the embodiments of the present application and the features in those embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings and embodiments.

[0061] The present invention provides a multi-level scene reconstruction and rapid segmentation method for a narrow space. Aiming at the environmental characteristics of a narrow space, a corresponding multi-level real-time dynamic scene construction scheme is proposed, which can ensure reconstruction accuracy and computational real-time performance at the same time.
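As a concrete illustration of the "multi-level" idea, the following is a minimal Python/NumPy sketch of one way a single-frame point cloud could be divided into scale levels, keeping near-field points at fine resolution and downsampling far-field points more aggressively. The distance bands, the voxel sizes, and the function name divide_scales are illustrative assumptions; the text visible here does not specify the patent's actual scale-division rule.

import numpy as np

def divide_scales(points,
                  bands=((0.0, 1.0, 0.01), (1.0, 3.0, 0.05), (3.0, np.inf, 0.2))):
    """Split a sensor-frame point cloud (N, 3) into distance bands and
    voxel-downsample each band at its own resolution.

    bands: (min_range, max_range, voxel_size) per level, near to far (assumed values).
    Returns a list with one (M, 3) array per level.
    """
    dist = np.linalg.norm(points, axis=1)
    levels = []
    for lo, hi, voxel in bands:
        band = points[(dist >= lo) & (dist < hi)]
        if band.size == 0:
            levels.append(band)
            continue
        # Voxel-grid downsampling: keep one representative point per voxel.
        keys = np.floor(band / voxel).astype(np.int64)
        _, idx = np.unique(keys, axis=0, return_index=True)
        levels.append(band[np.sort(idx)])
    return levels

Each level can then be fused into the scene map at its own resolution, which is one way to trade detail for computation in the far field while preserving near-field detail.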



Abstract

The invention belongs to the field of robot scene reconstruction, and particularly relates to a multi-level scene reconstruction and rapid segmentation method, system and device for a narrow space, aiming to solve the problem that robot scene reconstruction and segmentation in a narrow space cannot take reconstruction precision and computational real-time performance into account at the same time. The method comprises the following steps: acquiring a color image, a depth image, camera calibration data and robot spatial position and attitude information; converting the sensor data into a single-frame point cloud through coordinate conversion; dividing the single-frame point cloud into scales, and carrying out ray tracing and probability updating to acquire a multi-level scene map after scale fusion; and performing downsampling twice and upsampling once on the scene map, performing lossless transformation by means of scales, and establishing a plurality of sub-octree maps based on the space segmentation result, thereby realizing multi-level scene reconstruction and rapid segmentation. On the premise that necessary details of the scene are not lost, dense reconstruction and algorithm acceleration are achieved, which better facilitates application to actual engineering occasions.
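The "ray tracing and probability updating" step named in the abstract corresponds to the standard occupancy-mapping recipe: cast a ray from the sensor origin to each measured point, lower the occupancy log-odds of the voxels the ray passes through, and raise it for the voxel containing the endpoint. A minimal sketch follows, assuming a sparse dictionary keyed by integer voxel indices; the voxel size, the log-odds increments and the coarse ray traversal are illustrative choices, not the patent's actual parameters or data structure (the patent builds octree maps).

import numpy as np
from collections import defaultdict

VOXEL = 0.05                 # voxel edge length in metres (assumed)
L_HIT, L_MISS = 0.85, -0.4   # log-odds increments for hit / pass-through (assumed)

def update_map(log_odds, origin, points, voxel=VOXEL):
    """log_odds: dict voxel index -> log-odds value; origin: sensor position (3,);
    points: (N, 3) measurement endpoints already expressed in the map frame."""
    for p in points:
        direction = p - origin
        length = float(np.linalg.norm(direction))
        if length < 1e-6:
            continue
        # Coarse free-space traversal: sample the ray roughly once per voxel length.
        for t in np.linspace(0.0, 1.0, int(length / voxel), endpoint=False):
            cell = tuple(np.floor((origin + t * direction) / voxel).astype(int))
            log_odds[cell] += L_MISS
        # The endpoint voxel is observed as occupied.
        log_odds[tuple(np.floor(p / voxel).astype(int))] += L_HIT
    return log_odds

# Usage (world_points is a cloud already in the map frame):
#   grid = update_map(defaultdict(float), origin=np.zeros(3), points=world_points)
# Occupancy probability of a cell: p = 1 / (1 + np.exp(-grid[cell]))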

Description

Technical Field

[0001] The invention belongs to the field of robot scene reconstruction, and in particular relates to a multi-level scene reconstruction and rapid segmentation method, system and device for a narrow space.

Background Technique

[0002] The navigation and decision-making of a robot in an unknown environment rely on perception of the environment. Specifically, the positioning information of the robot must be combined with sensor measurements to reconstruct the three-dimensional environment of the operation scene, and the spatial composition of the operation scene must be segmented from that three-dimensional environment.

[0003] Depending on the type of map, scene reconstruction can be divided into two types: sparse reconstruction and dense reconstruction. The former generally establishes the scene only from the landmark points used for positioning, while the latter establishes the scene from all the measurement points of the sensor. In visual odometry, positioning and mapping ...
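As background for how "the positioning information of the robot" is combined with sensor measurements, the sketch below back-projects a depth image through pinhole intrinsics and applies a 4x4 camera-to-world pose to obtain a world-frame point cloud, the raw material of dense reconstruction. The pinhole model and the pose convention are standard assumptions; the patent's exact calibration format is not visible in this excerpt.

import numpy as np

def depth_to_world_cloud(depth, fx, fy, cx, cy, T_world_cam):
    """depth: (H, W) in metres; fx, fy, cx, cy: pinhole intrinsics;
    T_world_cam: 4x4 camera-to-world pose from the robot's localization."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth
    valid = z > 0                                    # drop invalid depth readings
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    cam_pts = np.stack([x[valid], y[valid], z[valid], np.ones(valid.sum())], axis=0)
    world_pts = T_world_cam @ cam_pts                # rigid transform into the map frame
    return world_pts[:3].T                           # (N, 3) world-frame point cloud

The resulting single-frame cloud is what the scale division, ray tracing and probability updating described above operate on.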


Application Information

IPC(8): G06T7/80; G06T7/10
CPC: G06T7/85; G06T7/10; G06T2207/10024
Inventors: 罗明睿, 李恩, 郭锐, 刘佳鑫, 杨国栋, 梁自泽, 谭民, 李勇, 刘海波, 李胜川, 周桂平
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI