A system and method for complementing lidar three-dimensional point cloud targets

A 3D point cloud and lidar technology, applied in the field of lidar object detection and recognition. It addresses problems such as cameras being strongly affected by environmental conditions and image data lacking depth information, and achieves denser and more uniform point clouds, a good completion effect, and an improved ability to extract features.

Active Publication Date: 2022-04-29
NANJING LES INFORMATION TECH

AI Technical Summary

Problems solved by technology

However, the camera is greatly affected by environmental factors, such as rain, fog, and night.
In addition, image data is two-dimensional and lacks depth information, so sensors such as radar are still needed to obtain quantities such as distance and angle.

Method used



Examples


Embodiment 1

[0029] This embodiment improves upon the scheme proposed in the 2017 Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition:

[0030] C. R. Qi, H. Su, K. Mo, and L. J. Guibas. PointNet: Deep learning on point sets for 3D classification and segmentation. Proc. Computer Vision and Pattern Recognition (CVPR), IEEE, 1(2):4, 2017.

[0031] In this embodiment, a system for complementing a lidar three-dimensional point cloud target includes a first encoding layer, a second encoding layer, and a third encoding layer.

[0032] The first encoding layer includes the first shared multi-layer perceptron and the first point-wise maximum pooling layer; the second encoding layer includes the second shared multi-layer perceptron and the second point-wise maximum pooling layer; the third encoding layer includes the third shared multi-layer perceptron and the third point-wise maximum pooling layer.

[0033] In the first encoding layer, the input data includes three-dimens...
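The three-layer encoder described in [0031]–[0032] can be sketched as stacked stages, each applying one shared multi-layer perceptron to every point and then a point-wise maximum pooling over all points. This is a minimal NumPy sketch; the layer widths (64, 128, 1024) are illustrative assumptions, not values stated in the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_mlp(points, w, b):
    """Apply the same weights to every point (a 'shared' MLP), with ReLU."""
    return np.maximum(points @ w + b, 0.0)

def pointwise_max_pool(features):
    """Element-wise max over the point dimension -> one global feature vector."""
    return features.max(axis=0)

m = 1024                          # number of input points
P = rng.standard_normal((m, 3))   # m x 3 matrix of (x, y, z) coordinates

dims = [3, 64, 128, 1024]         # assumed layer widths (not from the source)
x = P
global_features = []
for d_in, d_out in zip(dims[:-1], dims[1:]):
    w = rng.standard_normal((d_in, d_out)) * 0.1
    b = np.zeros(d_out)
    x = shared_mlp(x, w, b)                        # per-point features, (m, d_out)
    global_features.append(pointwise_max_pool(x))  # global feature, (d_out,)

print([g.shape for g in global_features])  # [(64,), (128,), (1024,)]
```

Because the MLP weights are shared across points and max pooling is symmetric, the resulting global features are invariant to the ordering of the input points, which is the key property of this encoder family.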

Embodiment 2

[0041] This embodiment provides a method for complementing the lidar 3D point cloud target:

[0042] Set the first encoding layer, including the first shared multi-layer perceptron and the first point-wise maximum pooling layer;

[0043] Set the second encoding layer, including the second shared multi-layer perceptron and the second point-wise maximum pooling layer;

[0044] Set the third encoding layer, including the third shared multi-layer perceptron and the third point-wise maximum pooling layer;

[0045] In the first encoding layer, the input data includes the three-dimensional coordinates of m points, formatted as an m×3 matrix P in which each row is the three-dimensional coordinate p_k = (x, y, z) of one point. The input data first passes through the first shared multi-layer perceptron to obtain the point feature matrix Point feature 1, in which each point feature is f_1k; the point feature matrix Point feature 1 then yields the global feature matrix Global feature 1 ...
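The data flow of the first encoding layer in [0045] can be sketched directly. This is a hypothetical NumPy illustration: the feature width `d1 = 64` and the final concatenation of the global feature back onto each point are assumptions (the concatenation is a common PointNet-style step, and the source text is truncated at that point).

```python
import numpy as np

rng = np.random.default_rng(1)

m = 512
P = rng.standard_normal((m, 3))          # each row is p_k = (x, y, z)

d1 = 64                                  # assumed output width of the first MLP
W1 = rng.standard_normal((3, d1)) * 0.1
b1 = np.zeros(d1)

# First shared MLP: the same (W1, b1) is applied to every point p_k,
# producing one feature f_1k per point (row k of point_feature_1).
point_feature_1 = np.maximum(P @ W1 + b1, 0.0)     # shape (m, d1)

# First point-wise max pooling: element-wise max over all m points
# collapses the per-point features into one global feature.
global_feature_1 = point_feature_1.max(axis=0)     # shape (d1,)

# Assumed follow-up step: tile the global feature onto each point so the
# next encoding layer sees both local and global context.
augmented = np.concatenate(
    [point_feature_1, np.tile(global_feature_1, (m, 1))], axis=1)  # (m, 2*d1)
```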



Abstract

The present invention provides a system and method for complementing a lidar three-dimensional point cloud target. The global features of the lidar target are obtained through three shared multi-layer perceptron layers and three point-wise maximum pooling layers, so that the point cloud of an object obtained by lidar scanning is denser and more uniform and the details of the object's outline are more complete, thereby serving the purposes of detection, recognition, and measurement.

Description

technical field

[0001] This patent belongs to the technical field of lidar object detection and recognition.

Background technique

[0002] Since lidar is less affected by environmental factors and can output three-dimensional information, it is increasingly used in scenarios such as drones and driverless cars, and the development of lidar and its corresponding data-processing technology has attracted industry attention. Lidar outputs sparse 3D point cloud data, and the scanned objects are incomplete, which greatly hinders target detection, recognition, and target size measurement. Existing technology obtains the shape of an object by lidar scanning and usually does not complete the shape, so the measured target size is still not accurate enough. Alternatively, object image data is obtained through a camera, and detection and recognition are then realized through image algorithms. However, the camera is greatly affect...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G01S17/89
CPC: G01S17/89
Inventor: 汪明明, 严璐, 刘磊, 顾昕
Owner: NANJING LES INFORMATION TECH