
Indoor three-dimensional scene reconstruction method employing plane characteristics

A technology for reconstructing indoor three-dimensional scenes, applied in the field of three-dimensional scene reconstruction. It addresses the problems of inaccurate matching point sets, reduced accuracy, and deviation of the transformation matrix, and achieves the effects of high robustness, good speed, and good accuracy.

Inactive Publication Date: 2016-08-31
NORTHEASTERN UNIV
Cites: 4 · Cited by: 50

AI Technical Summary

Problems solved by technology

In 3D point cloud registration, mismatched points arising in the feature matching process lead to an inaccurate set of matching points, which in turn causes the final computed transformation matrix to deviate.
[0004] In summary, current indoor 3D scene reconstruction methods rely heavily on the initial positions of the point clouds during registration, which reduces accuracy.
Later optimization methods rely on loop-closure comparison between multi-frame point clouds, which increases the computational burden and reduces the real-time performance of 3D reconstruction.



Examples


Detailed Description of the Embodiments

[0049] The specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.

[0050] In this embodiment, a laboratory with a relatively complex environment is selected as the indoor scene to be reconstructed, and a Kinect camera with an image resolution of 640×480 is used. The method is implemented in C++ with the PCL point cloud library under the Ubuntu system, and runs on a computer with an Intel dual-core 2.93 GHz CPU. To verify the real-time performance and stability of the method, the hand-held Kinect camera is used to collect data freely in the scene.
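For orientation only, the following is a minimal sketch of how such a single-frame point cloud can be rebuilt from one Kinect RGB-D frame with PCL and OpenCV; the camera intrinsics, the millimetre depth scale, and the function name depthToCloud are illustrative assumptions, not values taken from the patent.

```cpp
// Minimal sketch (not from the patent): back-projecting one 640x480 Kinect
// depth frame into a single-frame colored point cloud with PCL and OpenCV.
// Intrinsics and the millimetre depth scale are assumed typical Kinect values.
#include <cstdint>
#include <opencv2/opencv.hpp>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

pcl::PointCloud<pcl::PointXYZRGB>::Ptr depthToCloud(const cv::Mat& rgb,
                                                    const cv::Mat& depth) {
  const float fx = 525.0f, fy = 525.0f, cx = 319.5f, cy = 239.5f;  // assumed
  const float scale = 1000.0f;  // depth stored in millimetres (assumed)
  pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud(
      new pcl::PointCloud<pcl::PointXYZRGB>);
  for (int v = 0; v < depth.rows; ++v) {
    for (int u = 0; u < depth.cols; ++u) {
      const float z = depth.at<uint16_t>(v, u) / scale;
      if (z <= 0.0f) continue;            // skip invalid depth readings
      pcl::PointXYZRGB p;
      p.z = z;
      p.x = (u - cx) * z / fx;            // pinhole back-projection
      p.y = (v - cy) * z / fy;
      const cv::Vec3b c = rgb.at<cv::Vec3b>(v, u);
      p.r = c[2]; p.g = c[1]; p.b = c[0]; // OpenCV stores BGR
      cloud->push_back(p);
    }
  }
  return cloud;
}
```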

[0051] Using the Kinect camera as the acquisition tool for color and depth images, RGB images and depth images are first obtained from the Kinect camera. Feature extraction is then performed on the color image and a preliminary rotation matrix is established. At the same time, the single-frame r...
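As an illustration of the feature-extraction step above, the sketch below matches ORB features between two adjacent RGB frames and recovers a preliminary rotation with an essential-matrix RANSAC; the choice of ORB, the brute-force matcher, and the name estimateInitialRotation are assumptions, since the excerpt does not specify the feature type.

```cpp
// Hypothetical sketch of the "feature extraction + preliminary rotation" step
// between two adjacent RGB frames. ORB and the essential-matrix RANSAC are
// assumptions; the patent excerpt does not name the detector it uses.
#include <vector>
#include <opencv2/opencv.hpp>

cv::Mat estimateInitialRotation(const cv::Mat& rgbPrev, const cv::Mat& rgbCurr,
                                const cv::Mat& K) {  // K: camera matrix (assumed known)
  cv::Ptr<cv::ORB> orb = cv::ORB::create(1000);
  std::vector<cv::KeyPoint> kpPrev, kpCurr;
  cv::Mat descPrev, descCurr;
  orb->detectAndCompute(rgbPrev, cv::noArray(), kpPrev, descPrev);
  orb->detectAndCompute(rgbCurr, cv::noArray(), kpCurr, descCurr);

  cv::BFMatcher matcher(cv::NORM_HAMMING, /*crossCheck=*/true);
  std::vector<cv::DMatch> matches;
  matcher.match(descPrev, descCurr, matches);

  std::vector<cv::Point2f> ptsPrev, ptsCurr;
  for (const cv::DMatch& m : matches) {
    ptsPrev.push_back(kpPrev[m.queryIdx].pt);
    ptsCurr.push_back(kpCurr[m.trainIdx].pt);
  }

  // RANSAC on the essential matrix rejects part of the mismatches described
  // in the "Problems solved" section; recoverPose then yields the rotation R.
  cv::Mat R, t, mask;
  cv::Mat E = cv::findEssentialMat(ptsPrev, ptsCurr, K, cv::RANSAC, 0.999, 1.0, mask);
  cv::recoverPose(E, ptsPrev, ptsCurr, K, R, t, mask);
  return R;  // preliminary rotation between the two adjacent frames
}
```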



Abstract

The invention provides an indoor three-dimensional scene reconstruction method employing plane features. The method comprises the steps of: obtaining an RGB image and a depth image of an indoor scene in real time, and completing the reconstruction of a single-frame three-dimensional point cloud; carrying out feature extraction on two adjacent RGB images, and obtaining the initial rotation matrix of the two adjacent three-dimensional point clouds; downsampling each three-dimensional point cloud, and extracting the plane features of the indoor scene from each point cloud; determining the position of each plane; calculating an error rotation matrix; correcting the initial rotation matrix, and stitching and registering every two adjacent three-dimensional point clouds; and finally achieving the reconstruction of the indoor three-dimensional scene through the stitching and registration of all three-dimensional point clouds. The method eliminates errors by using the geometric features of the point clouds, and extracts the plane features of the point clouds quickly and effectively. The plane features of the current and previous point clouds are matched with a high success rate. Based on the plane features, the method judges the type of each plane, calculates the error matrix, corrects the initial rotation matrix, and obtains a more accurate indoor three-dimensional point cloud map.
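To make the downsampling and plane-extraction steps of the abstract concrete, here is a minimal PCL sketch using a VoxelGrid filter followed by RANSAC plane segmentation; the 2 cm leaf size, the 1 cm inlier threshold, and the name extractDominantPlane are illustrative assumptions rather than the patent's actual parameters.

```cpp
// Illustrative sketch of downsampling a frame and extracting one plane with
// PCL. Leaf size and distance threshold are assumed values, not the patent's.
#include <pcl/ModelCoefficients.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/point_types.h>
#include <pcl/segmentation/sac_segmentation.h>

pcl::ModelCoefficients::Ptr extractDominantPlane(
    const pcl::PointCloud<pcl::PointXYZ>::Ptr& cloud) {
  // Downsample so that plane fitting stays fast enough for real-time use.
  pcl::VoxelGrid<pcl::PointXYZ> grid;
  grid.setInputCloud(cloud);
  grid.setLeafSize(0.02f, 0.02f, 0.02f);   // 2 cm voxels (assumed)
  pcl::PointCloud<pcl::PointXYZ>::Ptr down(new pcl::PointCloud<pcl::PointXYZ>);
  grid.filter(*down);

  // Fit the dominant plane (floor, wall, desktop) with RANSAC.
  pcl::SACSegmentation<pcl::PointXYZ> seg;
  seg.setModelType(pcl::SACMODEL_PLANE);
  seg.setMethodType(pcl::SAC_RANSAC);
  seg.setDistanceThreshold(0.01);          // 1 cm inlier threshold (assumed)
  seg.setInputCloud(down);

  pcl::ModelCoefficients::Ptr coeffs(new pcl::ModelCoefficients);
  pcl::PointIndices::Ptr inliers(new pcl::PointIndices);
  seg.segment(*inliers, *coeffs);          // plane model: ax + by + cz + d = 0
  return coeffs;
}
```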

Description

Technical field

[0001] The present invention relates to the technical field of three-dimensional scene reconstruction, and in particular to an indoor three-dimensional scene reconstruction method using plane features.

Background technique

[0002] Vision-based 3D reconstruction technology uses digital cameras as image sensors and combines image processing, visual computing and other technologies to perform non-contact 3D measurement, acquiring the 3D information of objects through computer programs. Many related engineering research tasks require analysis and calculation of the three-dimensional environment or of objects, so as to obtain useful digital information more intuitively and guide related engineering calculations. Therefore, 3D scene reconstruction is increasingly used in industry, disaster rescue, autonomous navigation of mobile robots, service systems, augmented reality and other fields...

Claims


Application Information

IPC(8): G06T17/00
CPC: G06T17/00; G06T2200/08
Inventor: 吕忠元, 刘洋, 郑佳, 吴成东
Owner: NORTHEASTERN UNIV