
Target detection method based on 3D laser radar and image data

A technology combining laser radar and image data, applied in the field of measuring devices, electromagnetic wave re-radiation, and radio wave measurement systems, which addresses problems such as the low precision and poor robustness of single-sensor target detection.

Active Publication Date: 2018-12-28
CHANGAN UNIV


Problems solved by technology

[0005] The purpose of the present invention is to provide a target detection method based on 3D laser radar and image data that makes full use of the complementary advantages of 3D laser radar, which can directly obtain high-precision target depth and geometric characteristic parameters, and of image-based target classification. The method overcomes the low precision and poor robustness of target detection with a single sensor and ensures, to the greatest extent, the safe driving of unmanned vehicles under complex conditions.



Embodiment Construction

[0072] In order to make the purpose, technical solutions, and advantages of the present invention more clearly understood, they are described below in conjunction with specific embodiments. As shown in Figure 1, the method includes the following steps:

[0073] Step 1: use the 3D lidar and camera installed on the vehicle to acquire 3D point cloud data and camera images of the surrounding environment, and preprocess the 3D point cloud data.

[0074] In the embodiment of the present invention, the selected 3D laser radar (hereinafter referred to as the radar) is a Velodyne HDL-32E mounted on top of the vehicle; in this embodiment the vehicle is an unmanned vehicle and the installation height is 2.1 m. While the unmanned vehicle is driving, the radar scans the surrounding environment to obtain 3D point cloud data; Figure 2(a) shows one frame of raw point cloud data obtained by scanning the environment; the radar cons...
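A rough sketch of the point cloud preprocessing in Step 1 is given below. It assumes the lidar driver delivers each frame as an N×4 NumPy array of (x, y, z, intensity) values in the radar's own coordinate frame, and simply crops the cloud to a rectangular region around the vehicle while dropping invalid returns; the crop limits are illustrative placeholders, not values taken from the patent.

```python
import numpy as np

def preprocess_point_cloud(points,
                           x_range=(-40.0, 40.0),
                           y_range=(-20.0, 20.0),
                           z_range=(-2.5, 1.5)):
    """Crop one raw HDL-32E frame to a region of interest around the vehicle.

    points: (N, 4) array of x, y, z, intensity in the lidar frame.
    The range limits are illustrative, not the patent's actual values.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    keep = (
        (x >= x_range[0]) & (x <= x_range[1]) &
        (y >= y_range[0]) & (y <= y_range[1]) &
        (z >= z_range[0]) & (z <= z_range[1]) &
        np.isfinite(points).all(axis=1)        # drop NaN / inf returns
    )
    return points[keep]
```

The cropped cloud produced here would then be passed to the ground filtering and clustering stages described in the abstract.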



Abstract

The invention discloses a target detection method based on 3D laser radar and image data. The method comprises the following steps: acquiring 3D point cloud data and camera images of the surrounding environment using a 3D laser radar and a camera, and preprocessing the 3D point cloud data; filtering out ground points in the 3D point cloud data, performing spatial clustering on the remaining non-ground points, and extracting the 3D region of interest of the target; calibrating the external parameters between the 3D laser radar and camera coordinate systems, mapping the target's 3D region of interest onto the corresponding camera image according to the calibrated parameters, and extracting the corresponding 2D region of interest in the camera image; and performing feature extraction on the 2D region of interest using a deep convolutional network to locate and identify objects within it. The method makes full use of the complementarity between 3D laser radar and camera data, improves the accuracy and timeliness of target positioning and classification in the scene, and can be used for real-time target detection in unmanned vehicles.
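To make the lidar-to-image mapping step of the abstract concrete, the sketch below projects the points of one 3D cluster into the camera image using calibrated extrinsic parameters (rotation R, translation t) and a pinhole intrinsic matrix K, and takes the bounding box of the projected points as the 2D region of interest. The pinhole model and the function names are assumptions made for illustration; the patent's full description may use a different parameterization.

```python
import numpy as np

def project_to_image(points_lidar, K, R, t):
    """Project 3D points from the lidar frame into pixel coordinates.

    points_lidar: (N, 3) points in the lidar frame.
    K: (3, 3) camera intrinsic matrix.
    R, t: extrinsic rotation (3, 3) and translation (3,) from lidar to camera.
    """
    pts_cam = points_lidar @ R.T + t          # lidar frame -> camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]    # keep only points in front of the camera
    uv = (K @ pts_cam.T).T
    return uv[:, :2] / uv[:, 2:3]             # perspective division -> (u, v)

def roi_from_cluster(cluster_points, K, R, t, image_shape):
    """2D region of interest: the image bounding box of a projected 3D cluster."""
    uv = project_to_image(cluster_points, K, R, t)
    h, w = image_shape[:2]
    u_min, v_min = np.clip(uv.min(axis=0), 0, [w - 1, h - 1])
    u_max, v_max = np.clip(uv.max(axis=0), 0, [w - 1, h - 1])
    return int(u_min), int(v_min), int(u_max), int(v_max)
```

The image patch cut out by this box is what would be fed to the deep convolutional network for localization and classification.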

Description

Technical Field

[0001] The invention relates to multi-sensor information fusion, and in particular to a target detection method based on 3D lidar point cloud target candidate region extraction and image convolutional neural network classification. As an important part of unmanned vehicle environment perception, it improves the detection accuracy of targets around the vehicle and is of great significance for ensuring the safe driving of unmanned vehicles.

Background Technique

[0002] Self-driving cars could radically improve the safety and comfort of the driving population while reducing the car's environmental impact. To develop such a vehicle, a perception system is one of the indispensable components for analyzing and understanding the driving environment, including the location, orientation, and classification of surrounding obstacles.

[0003] 3D lidar is one of the most popular sensors for autonomous vehicle ...
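The "point cloud target candidate region extraction" named in the technical field amounts to removing ground points and spatially clustering what remains, as outlined in the abstract. Below is a minimal sketch that stands in for those two steps with a crude height threshold for the ground and DBSCAN from scikit-learn for the clustering; both choices and all parameter values are assumptions for illustration, not the specific algorithms of the patent.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def extract_candidate_clusters(points, ground_z=-1.9, eps=0.6, min_samples=10):
    """Split a preprocessed point cloud into candidate object clusters.

    points: (N, 3) array of x, y, z in the lidar frame.
    ground_z: height threshold standing in for the patent's ground-point
        filtering (the lidar sits about 2.1 m above the road in the embodiment).
    eps, min_samples: DBSCAN parameters standing in for the spatial clustering.
    """
    non_ground = points[points[:, 2] > ground_z]
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(non_ground)
    return [non_ground[labels == k] for k in np.unique(labels) if k != -1]
```

Each returned cluster corresponds to one 3D region of interest that is subsequently projected into the camera image.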


Application Information

IPC(8): G01S17/89, G01S17/02
CPC: G01S17/86, G01S17/89
Inventors: 赵祥模, 孙朋朋, 徐志刚, 王润民, 李骁驰, 闵海根, 尚旭明, 吴霞, 王召月
Owner: CHANGAN UNIV