
Camera and three-dimensional laser radar data fusion road edge detection method

A camera and three-dimensional lidar data fusion detection technology, applied in the field of intelligent traffic road environment perception. It addresses the problem that camera images, despite their high information content, cannot provide accurate three-dimensional information; the method yields accurate and reliable road information, meets real-time requirements, and improves detection accuracy.

Inactive Publication Date: 2020-09-22
安徽卡思普智能科技有限公司

AI Technical Summary

Problems solved by technology

The camera provides a high amount of information but cannot measure accurate three-dimensional positions. Lidar, although it can obtain accurate three-dimensional information, produces a discrete point cloud, and obstacles in the road environment cause many false-positive road edge points to be detected.




Detailed Description of the Embodiments

[0040] The specific technical solutions of the embodiments of the present invention will be further described below in conjunction with the accompanying drawings.

[0041] This embodiment of the present invention uses a Velodyne LiDAR Puck 16-line lidar to collect point cloud data and a grayscale camera to collect image data, and proposes a road edge detection method based on camera and three-dimensional lidar data fusion, as shown in Figure 1, comprising the following steps:

[0042] Step 1: Establish a lidar coordinate system with the 3D lidar as the origin, use the 3D lidar to collect a road point cloud data set, and convert each point in the data set from 3D space onto a 2D grid map;

[0043] Specifically, the method for converting the point cloud data from 3D space to the 2D grid map is as follows:

[0044] The Cartesian coordinate system of the three-dimensional lidar takes the center of the radar's internal rotating mirror as the origin, and the ang...
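Although the coordinate-conversion details are cut off above, a minimal sketch of how 3D lidar returns might be turned into Cartesian points and binned into a 2D grid map is shown below. The spherical-to-Cartesian convention, the grid extent, and the cell size are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def spherical_to_cartesian(r, azimuth, elevation):
    """Convert VLP-16-style measurements (range, azimuth, elevation in radians)
    to Cartesian coordinates in the lidar frame, with the origin at the
    rotating mirror center (convention assumed for illustration)."""
    x = r * np.cos(elevation) * np.sin(azimuth)
    y = r * np.cos(elevation) * np.cos(azimuth)
    z = r * np.sin(elevation)
    return np.stack([x, y, z], axis=-1)

def points_to_grid(points, cell=0.2, x_range=(-20.0, 20.0), y_range=(0.0, 40.0)):
    """Project 3D points onto a 2D grid in the x-y plane.
    Each cell stores the indices of the points falling into it, so later
    steps (e.g. height-difference tests) can look up points per cell.
    Cell size and ranges are assumed values, not the patent's."""
    cols = int((x_range[1] - x_range[0]) / cell)
    rows = int((y_range[1] - y_range[0]) / cell)
    grid = [[[] for _ in range(cols)] for _ in range(rows)]
    for idx, (x, y, z) in enumerate(points):
        if x_range[0] <= x < x_range[1] and y_range[0] <= y < y_range[1]:
            c = int((x - x_range[0]) / cell)
            r_ = int((y - y_range[0]) / cell)
            grid[r_][c].append(idx)
    return grid
```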



Abstract

The invention discloses a road edge detection method based on camera and three-dimensional lidar data fusion. The method comprises the following steps: 1) establishing a grid map with the three-dimensional lidar as the origin; 2) extracting lane line feature points with a camera using an existing method, and measuring the lateral and longitudinal distances of the feature points relative to the camera origin on the actual road; 3) acquiring road edge candidate points using a method based on the height difference between adjacent points; 4) projecting the lane line feature points onto the grid map according to the positional relationship between the camera and the lidar, and fitting them with a quadratic curve model; 5) taking road edge candidate points within a fixed distance threshold of the lane line curve as road edge points; and 6) filtering the road edge points with the RANSAC algorithm and fitting them with a quadratic curve model. The method improves the detection accuracy of the road edge, thereby providing accurate and reliable road boundary information for unmanned vehicles and reducing the occurrence rate of road traffic accidents.
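To make steps 3), 5), and 6) of the abstract more concrete, the sketch below shows one way the height-difference candidate extraction, the lane-curve distance gating, and the RANSAC quadratic fitting could be implemented. The thresholds, the function names, and the choice to model the curves as x = a·y² + b·y + c are illustrative assumptions, not the patent's actual parameters.

```python
import numpy as np

def edge_candidates(scanline, h_min=0.05, h_max=0.3):
    """Step 3 (sketch): flag points whose height jump to the next point on the
    same scan line falls in a curb-like range. `scanline` is an (N, 3) array of
    x, y, z ordered along the ring; thresholds are assumed, not the patent's."""
    dz = np.abs(np.diff(scanline[:, 2]))
    mask = (dz > h_min) & (dz < h_max)
    return scanline[:-1][mask]

def fit_quadratic(points_xy):
    """Fit the quadratic curve model x = a*y^2 + b*y + c in the grid plane."""
    y, x = points_xy[:, 1], points_xy[:, 0]
    return np.polyfit(y, x, 2)

def near_lane_curve(candidates, lane_coeffs, d_max=3.5):
    """Step 5 (sketch): keep candidates within a fixed lateral distance of the
    fitted lane-line curve; d_max is an assumed threshold."""
    pred_x = np.polyval(lane_coeffs, candidates[:, 1])
    return candidates[np.abs(candidates[:, 0] - pred_x) < d_max]

def ransac_quadratic(points_xy, n_iter=200, tol=0.1, seed=0):
    """Step 6 (sketch): RANSAC over quadratic models to reject outliers,
    then refit the quadratic on the best inlier set."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        sample = points_xy[rng.choice(len(points_xy), 3, replace=False)]
        coeffs = fit_quadratic(sample)
        resid = np.abs(points_xy[:, 0] - np.polyval(coeffs, points_xy[:, 1]))
        inliers = resid < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_quadratic(points_xy[best_inliers])
```

In this sketch the same quadratic model is reused for both the lane line of step 4) and the road edge of step 6), mirroring the abstract's use of a single quadratic curve model throughout.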

Description

Technical Field

[0001] The invention belongs to the field of intelligent traffic road environment perception, and in particular relates to a road edge detection method based on the fusion of camera and three-dimensional lidar data.

Background Technology

[0002] In recent years, the number of cars and drivers in China has increased year by year and the transportation industry has continued to develop; at the same time, the total number of traffic accidents has also risen year by year. Cars are closely linked with people's everyday lives. In order to reduce fatalities caused by traffic accidents, research on unmanned vehicles has attracted increasing attention. In intelligent transportation, unmanned vehicles can accurately perceive the environment in real time and provide drivers with richer environmental information, thereby reducing the frequency of traffic accidents.

[0003] Road boundary detection is an important part of environment perception. Road color, road surfa...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01S 17/86; G01S 17/931
CPC: G01S 17/86; G01S 17/931
Inventors: 魏振亚, 陈无畏, 张先锋, 王庆龙, 崔国良, 丁雨康, 马聪颖
Owner: 安徽卡思普智能科技有限公司