
Point cloud multi-view feature fusion method and device

A feature fusion and multi-view technology, applied in the field of point cloud processing, which solves problems such as sparse rectangular-grid information, inconsistent perception scales, and a degraded fusion effect; it achieves a uniform and dense information distribution, facilitates feature extraction and fusion, and improves the feature fusion effect.

Pending Publication Date: 2021-04-06
DeepBlue Artificial Intelligence (Shenzhen) Co., Ltd. (深兰人工智能(深圳)有限公司)

AI Technical Summary

Problems solved by technology

[0005] To sum up, the existing point cloud multi-view feature fusion method fuses the grid features of the front view and the top view, but the scales of these two sets of grid features are inconsistent, so the fusion effect is poor. In addition, because the point cloud is dense near the sensor and sparse far from it, the rectangular grid information of the top-view grid is very sparse, which further degrades the fusion effect.
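The near-dense/far-sparse behavior described above can be illustrated numerically. The sketch below is not from the patent: it assumes a simplistic synthetic scan with uniform angular sampling and checks what fraction of 1 m × 1 m rectangular top-view cells contain at least one point, near versus far from the sensor.

```python
import numpy as np

# Illustrative sketch (not from the patent): a spinning lidar samples angles
# uniformly, so points per unit area fall off with range. Measure what
# fraction of 1 m x 1 m rectangular top-view cells are occupied, near vs. far.
rng = np.random.default_rng(0)
n = 10_000
azimuth = rng.uniform(0.0, 2.0 * np.pi, n)
r = rng.uniform(1.0, 50.0, n)   # uniform in r => density per area ~ 1/r
x, y = r * np.cos(azimuth), r * np.sin(azimuth)

def occupancy(r_lo, r_hi):
    """Approximate fraction of 1 m^2 cells in [r_lo, r_hi) holding a point."""
    m = (r >= r_lo) & (r < r_hi)
    cells = set(zip(np.floor(x[m]).astype(int), np.floor(y[m]).astype(int)))
    n_cells = np.pi * (r_hi ** 2 - r_lo ** 2)  # approx. cell count in annulus
    return len(cells) / n_cells

near, far = occupancy(1.0, 10.0), occupancy(40.0, 50.0)
print(f"near occupancy ~{near:.2f}, far occupancy ~{far:.2f}")
```

Distant rectangular cells are mostly empty, which is exactly the sparsity the patent's polar grid is meant to counteract: angular cells grow with range, matching the sensor's sampling pattern.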

Method used




Embodiment Construction

[0085] To make the purpose, technical solutions, and advantages of this application clearer, the technical solutions in this application are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of this application. Based on the embodiments in this application, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of this application.

[0086] The point cloud multi-view feature fusion method of the present application is described below in conjunction with Figure 1. The method includes: Step 110, constructing the three-dimensional polar coordinate space of the point cloud to be fused according to the vertical field-of-view range and the horizontal field-of-view range of the point cloud, and dividing the three-dimensional polar coordinate space into a three-...
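Step 110 can be sketched as follows. The field-of-view limits, maximum range, and grid resolution below are illustrative assumptions, not values taken from the patent, and `polar_grid_indices` is a hypothetical helper name: each point is converted to (azimuth, elevation, range) and binned into a regular three-dimensional polar grid.

```python
import numpy as np

# Sketch of step 110 (FOV limits, max range, and grid shape are illustrative
# assumptions, not values from the patent): convert points to polar
# coordinates and bin them into a regular (azimuth, elevation, range) grid.
def polar_grid_indices(points, h_fov=(-np.pi, np.pi),
                       v_fov=(np.radians(-25.0), np.radians(15.0)),
                       max_range=50.0, shape=(512, 32, 64)):
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points[:, :3], axis=1)
    azimuth = np.arctan2(y, x)                                       # horizontal angle
    elevation = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1, 1))   # vertical angle
    # Normalize each coordinate into [0, 1) over its field-of-view range,
    # then scale to integer grid indices.
    a = (azimuth - h_fov[0]) / (h_fov[1] - h_fov[0])
    e = (elevation - v_fov[0]) / (v_fov[1] - v_fov[0])
    d = r / max_range
    idx = np.stack([a, e, d], axis=1) * np.array(shape)
    valid = np.all((idx >= 0) & (idx < shape), axis=1)
    return idx[valid].astype(int), valid

pts = np.array([[10.0, 0.0, 1.0], [0.0, -5.0, -1.0]])
idx, valid = polar_grid_indices(pts)
print(idx)  # one (azimuth_bin, elevation_bin, range_bin) triple per valid point
```

Summing over the range axis of such a grid yields the front-view (azimuth × elevation) grid, and summing over the elevation axis yields the top-view (azimuth × range) grid, so both views share the same angular scale.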


Abstract

The embodiment of the invention relates to the technical field of point cloud processing and provides a point cloud multi-view feature fusion method and device. The method comprises the steps of: constructing a three-dimensional polar coordinate space of a to-be-fused point cloud according to the vertical and horizontal field-of-view ranges of the point cloud, and dividing the three-dimensional polar coordinate space into three-dimensional polar coordinate grids; mapping each point in the to-be-fused point cloud to a front-view grid and a top-view grid of the three-dimensional polar coordinate grids, and computing statistics over the points in each front-view grid and each top-view grid to obtain the statistical features of each front-view grid and each top-view grid; and taking the statistical features of the front-view and top-view grids as the input of a neural network, outputting the extracted features of the front-view and top-view grids, and fusing these extracted features to obtain a fusion result. The embodiment of the invention improves the point cloud multi-view feature fusion effect.
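The per-grid statistics step of the abstract can be sketched as follows. The specific statistics chosen here (per-cell point count, mean, and max of a scalar point feature such as height) are illustrative assumptions not fixed by this excerpt, and `grid_statistics` is a hypothetical helper name.

```python
import numpy as np

# Minimal sketch of the "statistical features" step from the abstract.
# The chosen statistics (count, mean, max per cell) are illustrative
# assumptions; the patent excerpt does not fix them.
def grid_statistics(cell_ids, values, n_cells):
    """Per-cell count, mean, and max of a scalar point feature."""
    count = np.bincount(cell_ids, minlength=n_cells).astype(float)
    total = np.bincount(cell_ids, weights=values, minlength=n_cells)
    mx = np.full(n_cells, -np.inf)
    np.maximum.at(mx, cell_ids, values)           # scatter-max into cells
    mean = np.divide(total, count, out=np.zeros(n_cells), where=count > 0)
    mx[count == 0] = 0.0                          # empty cells get zeros
    return np.stack([count, mean, mx], axis=1)    # (n_cells, 3) feature map

# Two points fall in cell 0, one in cell 2; heights are the per-point feature.
feats = grid_statistics(np.array([0, 0, 2]), np.array([1.0, 3.0, 5.0]), n_cells=4)
print(feats)
```

One such feature map per view (front view and top view) would then be fed to the neural network for feature extraction and fusion.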

Description

Technical field

[0001] The present application relates to the technical field of point cloud processing, and in particular to a point cloud multi-view feature fusion method and device.

Background technique

[0002] 3D object detection methods that take a point cloud as input generally use convolutional neural networks, which are mature in the image domain, to extract point cloud features. This requires overcoming two difficulties: the disorder of the point cloud, and the convolutional neural network's need for regular-size input. Constructing point cloud grid features as the input to a convolutional neural network is the mainstream feature construction method. However, the constructed grid features are very sparse, so the usual approach is to use multi-frame superposition or multi-view feature fusion to increase the feature information.

[0003] The MVLidarNet (Multi-View LidarNet, multi-view lidar) algorithm maps the point cloud to front-view and top-view grid features, in whic...

Claims


Application Information

IPC(8): G06T17/20; G06T7/593; G06T7/35; G06K9/62
CPC: G06T17/205; G06T7/596; G06T7/35; G06T2207/10028; G06V2201/07; G06F18/253
Inventors: Chen Haibo, Li Zhongpeng (陈海波, 李忠蓬)
Owner: DeepBlue Artificial Intelligence (Shenzhen) Co., Ltd. (深兰人工智能(深圳)有限公司)