
Multi-view point cloud fusion method based on projection

A multi-view point cloud fusion method, applied in image data processing, instruments, 3D modeling, etc. It addresses the problem of overlap between multi-view point clouds, achieving consistent point cloud density, faster processing, and gap-free fusion.

Active Publication Date: 2019-08-16
SOUTHEAST UNIV
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] To address the problem of partial overlap between multi-view point clouds, the present invention proposes a projection-based multi-view point cloud fusion method.




Embodiment Construction

[0051] The technical scheme of the present invention is described in further detail below in conjunction with the accompanying drawings:

[0052] Under the Windows operating system, Visual Studio Community is used as the programming tool to process the collected images and the reconstructed point clouds. In this example, an object with a complex shape is used as the measured object to demonstrate the effectiveness of the measurement method proposed in this patent. It should be understood that these examples are only intended to illustrate the present invention and not to limit its scope; after reading the present invention, modifications of various equivalent forms made by those skilled in the art fall within the scope defined by the appended claims of the present application.

[0053] A projection-based multi-view point cloud fusion method of the present invention, as shown in figure 1, specifically includes the following steps ...



Abstract

The invention discloses a multi-view point cloud fusion method based on projection. The method comprises the steps of: having adjacent projectors project pure white patterns onto the measured object separately, extracting the pixel positions of the overlapping area of the adjacent projectors in the image, and reconstructing these pixels into three-dimensional space to obtain the point cloud of the overlapping area; using a Kd-Tree to calculate the average shortest distance sigma of the point cloud in the non-overlapping region as the reference point spacing, and then, for the point cloud in the overlapping region, using a Kd-Tree to find each point's nearest neighbor in the adjacent point cloud; combining distance and normal vector criteria, judging two points to be corresponding overlapping points when their distance is less than 2 sigma and their normal vector directions are consistent, and finally calculating the distance between the corresponding overlapping points; if the distance is less than sigma, merging the points by progressive weighting, with the weighting coefficient determined by the distance from the back-projected image point of the three-dimensional point to the boundary of the overlapping region; and if the distance is greater than sigma, interpolating the midpoint of the two points by natural neighbor interpolation.
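To make the fusion rule in the abstract concrete, the following is a minimal sketch of the overlap-handling step in Python using NumPy and SciPy's cKDTree. It assumes the overlap-region points, their normals, and per-point weights derived from the distance of the back-projected pixel to the overlap boundary have already been obtained as described above; the 30-degree normal-consistency threshold and the plain midpoint used in place of natural neighbor interpolation are simplifications for illustration, not the patent's exact procedure.

```python
# Hypothetical sketch of the fusion decision described in the abstract.
# Point clouds, normals, and boundary-distance weights are placeholders;
# the projector masks and back-projection step are not reproduced here.
import numpy as np
from scipy.spatial import cKDTree

def mean_nearest_neighbor_distance(points):
    """Average shortest distance sigma over a non-overlap point cloud."""
    tree = cKDTree(points)
    # k=2: the nearest neighbor of each point is itself (distance 0),
    # so the second column holds the true nearest-neighbor distance.
    dists, _ = tree.query(points, k=2)
    return dists[:, 1].mean()

def fuse_overlap(overlap_a, normals_a, weights_a,
                 overlap_b, normals_b, sigma, normal_thresh_deg=30.0):
    """Fuse the overlap region of cloud A against adjacent cloud B.

    weights_a: per-point weight in [0, 1], assumed to come from the distance
    of the back-projected image point to the overlap boundary (hypothetical).
    Returns fused coordinates for the overlap region of A.
    """
    tree_b = cKDTree(overlap_b)
    dists, idx = tree_b.query(overlap_a, k=1)

    # Normal consistency: angle between corresponding normals must be small.
    cos_ang = np.einsum('ij,ij->i', normals_a, normals_b[idx])
    cos_ang /= (np.linalg.norm(normals_a, axis=1)
                * np.linalg.norm(normals_b[idx], axis=1) + 1e-12)
    consistent = cos_ang > np.cos(np.deg2rad(normal_thresh_deg))

    # Corresponding overlap points: closer than 2*sigma and normals agree.
    corresponding = (dists < 2.0 * sigma) & consistent

    fused = overlap_a.copy()

    # Distance below sigma: progressive weighted average of the point pair.
    close = corresponding & (dists < sigma)
    w = weights_a[close][:, None]
    fused[close] = w * overlap_a[close] + (1.0 - w) * overlap_b[idx[close]]

    # Distance between sigma and 2*sigma: the patent interpolates the midpoint
    # with natural neighbor interpolation; a plain midpoint stands in here.
    far = corresponding & (dists >= sigma)
    fused[far] = 0.5 * (overlap_a[far] + overlap_b[idx[far]])
    return fused
```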

Description

technical field

[0001] The invention relates to a projection-based multi-view point cloud fusion method, which belongs to the field of three-dimensional reconstruction in computer vision.

Background technique

[0002] The 3D measurement system based on grating projection has the advantages of simple hardware configuration, high measurement accuracy, high point density, high speed, and low cost, so it has long been an active research area. Moreover, multiple cameras can effectively solve problems that a single camera cannot, and the application scenarios are broader. There are often common parts between the point clouds reconstructed by multiple cameras. Due to calculation errors and measurement noise, registration errors exist: the points of the common parts from two different point clouds overlap instead of fitting exactly, and slight layering may even occur, making subsequent applications difficult. Modeling a large number of redundant points is ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/00; G06T5/50; G06T7/187; G06T17/00
CPC: G06T17/00; G06T5/50; G06T7/187; G06T2207/10028; G06T2207/20221; G06T5/70
Inventor: 达飞鹏, 黄林
Owner: SOUTHEAST UNIV