
Method for filling indoor light detection and ranging (LiDAR) missing data based on Kinect

A technology concerning missing data and scanning data, applied in the field of filling indoor LiDAR missing data. It solves the problems that traditional laser scanners have a limited scanning range and cannot complete data collection in small areas, and achieves the effects of low cost, preserved data integrity, and successful collection of the missing data.

Publication Date: 2013-02-20 (Inactive)
WUHAN UNIV
Cites: 2 | Cited by: 42

AI Technical Summary

Problems solved by technology

[0005] The present invention mainly solves the technical problem in the prior art that traditional laser scanners have a limited scanning range and, especially in complex indoor scenes, cannot complete data collection in narrow areas. It provides a Kinect-based method for collecting and filling indoor LiDAR missing data: the Kinect somatosensory device is inexpensive, acquires scene depth and image information simultaneously at an output rate of 30 frames per second, captures a large amount of depth information, and maintains the integrity of the collected information, so that local collection of the missing data can be realized.

Method used



Examples


Specific Embodiment

[0059] Step 1: key frame extraction during the Kinect scanning process to obtain relatively sparse scanning data. The Kinect device is used to scan the areas that the LiDAR device misses during single-point scanning. Since the Kinect collects data at 30 frames per second and adjacent frames contain many repeated areas, key frame extraction is used to obtain locally effective scanning data while ensuring the integrity of the missing scene data, which reduces the post-processing time. The present invention decides whether to add a key frame by directly calculating the angular deflection and translation of the camera.
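As an illustration of this key-frame criterion, the sketch below (Python with NumPy) adds a new key frame when either the camera's rotation angle or its translation relative to the last key frame exceeds a threshold. The threshold values and function names are assumptions for illustration; the patent does not specify them.

    import numpy as np

    def rotation_angle(R):
        # Rotation angle (radians) of a 3x3 rotation matrix, via the trace identity.
        return np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))

    def is_new_keyframe(R_rel, t_rel, angle_thresh_deg=10.0, trans_thresh_m=0.2):
        # R_rel, t_rel: rotation and translation of the current frame relative to
        # the last key frame (e.g. from the Kinect pose estimate).
        # Threshold values are illustrative assumptions, not taken from the patent.
        angle_deg = np.degrees(rotation_angle(R_rel))
        translation = np.linalg.norm(t_rel)
        return angle_deg > angle_thresh_deg or translation > trans_thresh_m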

[0060] Step 2: feature extraction from the Kinect RGB images. Since the RGB image and the point cloud data have already been registered, the feature points extracted from the RGB image can be mapped onto the point cloud data and used as features of the point cloud. Among them, the RGB image realizes the feature ext...
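A minimal sketch of this step, assuming OpenCV's SIFT implementation and a standard pin-hole model for the Kinect colour camera: keypoints detected on the RGB image are back-projected into 3D using the registered depth map, so that the 2D image features become features of the point cloud. The intrinsics, depth scale, and function name are illustrative assumptions, not taken from the patent.

    import cv2
    import numpy as np

    def extract_3d_features(rgb, depth, fx, fy, cx, cy, depth_scale=0.001):
        # Detect SIFT keypoints on the registered RGB image and back-project them
        # into 3D with a pin-hole model, so the 2D features become point-cloud features.
        # fx, fy, cx, cy: colour-camera intrinsics (assumed known from calibration);
        # depth_scale converts raw depth units to metres (assumption).
        gray = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY)
        keypoints, descriptors = cv2.SIFT_create().detectAndCompute(gray, None)
        if descriptors is None:
            return np.empty((0, 3)), np.empty((0, 128))
        points_3d, kept_desc = [], []
        for kp, desc in zip(keypoints, descriptors):
            u, v = int(round(kp.pt[0])), int(round(kp.pt[1]))
            z = depth[v, u] * depth_scale
            if z <= 0:  # no depth measurement at this pixel
                continue
            points_3d.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
            kept_desc.append(desc)
        return np.array(points_3d), np.array(kept_desc)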



Abstract

The invention relates to a method for filling indoor LiDAR missing data based on Kinect. The method comprises: extracting key frames from the Kinect scanning process to obtain sparse scanned data; using the scale-invariant feature transform (SIFT) algorithm to perform feature extraction on the RGB-D images collected by the Kinect device, and rejecting abnormal feature matching points with the random sample consensus (RANSAC) operator; merging the extracted features; extracting features of the LiDAR image and coarsely matching them with the Kinect features to obtain a transfer matrix; using an improved iterative closest point (ICP) algorithm to achieve fine matching between the LiDAR image and the Kinect RGB-D data; and fusing the missing data between the LiDAR model and the part scanned by the Kinect. The method has the advantages of low device cost and a flexible acquisition process; it can obtain scene depth and image information, and can rapidly acquire and fill partial or missing data of complex indoor scenes.
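To make the coarse-to-fine registration described above concrete, the sketch below refines the coarse transfer matrix with a generic point-to-point ICP using Open3D. The patent uses an improved ICP algorithm whose details are not reproduced here; this is only a standard ICP stand-in, and the function name, distance threshold, and library choice are assumptions.

    import open3d as o3d

    def refine_alignment(kinect_points, lidar_points, init_transform, max_dist=0.05):
        # kinect_points, lidar_points: (N, 3) arrays of 3D points.
        # init_transform: 4x4 matrix from the coarse feature-based matching step.
        # max_dist: correspondence distance threshold in metres (an assumption).
        source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(kinect_points))
        target = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(lidar_points))
        result = o3d.pipelines.registration.registration_icp(
            source, target, max_dist, init_transform,
            o3d.pipelines.registration.TransformationEstimationPointToPoint())
        return result.transformation  # refined 4x4 transform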

Description

Technical Field
[0001] The invention relates to a method for filling missing data of indoor LiDAR, in particular to a method for filling missing data of indoor LiDAR based on Kinect.
Background Technique
[0002] LiDAR (Light Detection and Ranging), also called laser radar, refers to a laser scanning and detection system. LiDAR systems are mainly divided into two categories: airborne LiDAR systems and ground LiDAR systems. The present invention is mainly aimed at ground LiDAR systems in complex indoor environments. Compared with traditional image-based 3D reconstruction, 3D reconstruction based on 3D laser scanning has the advantages of being fast, accurate, and non-contact. Because the laser scanner measures discontinuously, the point cloud data acquired at multiple scanning stations must have their coordinates transformed into a unified coordinate system to form a complete point cloud model, which requires image registration. Image registration technology is mainly divided ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/50; G06T7/00
Inventors: 呙维, 胡涛, 朱欣焰, 水淼, 樊亚新
Owner: WUHAN UNIV