Visual and laser radar multi-level fusion-based lane detection method and system

A technology combining laser radar and lane detection, applied in the fields of radio-wave measurement systems, measurement devices, electromagnetic-wave reradiation, etc.

Active Publication Date: 2020-06-12
TSINGHUA UNIV
Cites: 12 · Cited by: 23


Problems solved by technology

[0005] The purpose of the present invention is to overcome the deficiencies of the prior art by proposing a lane detection method based on multi-level fusion of vision and laser radar. The method combines the laser radar point cloud with the camera image for lane detection: the point cloud supplies spatial information for the image, while the image compensates for the low sampling density of the point cloud. This improves the robustness of the lane detection algorithm in complex road scenes such as uphill lanes, uneven lighting, heavy fog, and night.
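The fusion described above requires projecting each lidar point into the camera image so that height and reflectance can be attached to pixels. A minimal sketch of that projection is shown below, assuming a pinhole camera model; the intrinsics `K` and extrinsics `R`, `t` are hypothetical placeholder values, as the real ones come from the calibration step.

```python
import numpy as np

# Hypothetical camera intrinsics and lidar-to-camera extrinsics (placeholders;
# actual values are obtained from calibration).
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.zeros(3)

def project_points(points_xyz):
    """Project Nx3 lidar points (lidar frame) onto the image plane.

    Returns pixel coordinates (u, v) and the depth of each point
    that lies in front of the camera.
    """
    cam = points_xyz @ R.T + t        # lidar frame -> camera frame
    in_front = cam[:, 2] > 0.1        # discard points behind the camera
    cam = cam[in_front]
    uvw = cam @ K.T                   # pinhole projection
    uv = uvw[:, :2] / uvw[:, 2:3]     # normalize by depth
    return uv, cam[:, 2]

points = np.array([[0.0, 0.0, 10.0],     # point on the optical axis
                   [1.0, -0.5, 20.0]])
uv, depth = project_points(points)
# The first point lands at the principal point (640, 360).
```

Once projected, each pixel hit by a point can carry the point's height and reflection intensity alongside its RGB value, which is the basis for the multi-level fusion used later.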


Examples


Embodiment 2

[0087] Embodiment 2 of the present invention proposes a lane detection system based on multi-level fusion of vision and lidar. The system includes: a lidar, a vehicle-mounted camera, and a lane detection module. The lane detection module includes: a semantic segmentation network 3D-LaneNet, a calibration unit, a first lane candidate area detection unit, a second lane candidate area detection unit, and a lane fusion unit;

[0088] The lidar is used to obtain point cloud data;

[0089] The on-board camera is used to obtain video images;

[0090] A calibration unit is used to calibrate the obtained point cloud data and video images;

[0091] The first lane candidate area detection unit is used to fuse the height information and reflection intensity information of the point cloud data with the RGB information of the video image to construct a point cloud clustering model, obtain the lane point cloud based on the clustering model, and perform least-squares fitting on the lane point cloud ...
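The least-squares fitting mentioned in [0091] (and, per the abstract, yielding a lane curved surface) can be sketched as fitting a low-order polynomial surface to the clustered lane points. The quadratic surface model below is an assumption for illustration; the patent does not specify the exact basis functions.

```python
import numpy as np

def fit_lane_surface(points):
    """Least-squares fit of z = a*x^2 + b*y^2 + c*x*y + d*x + e*y + f
    to an Nx3 array of lane points, returning the six coefficients."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.stack([x**2, y**2, x * y, x, y, np.ones_like(x)], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

# Synthetic lane points on a sloped plane z = 0.1*x + 0.05*y + 1.0,
# mimicking an "uphill lane" scene.
rng = np.random.default_rng(0)
xy = rng.uniform(-10, 10, size=(200, 2))
z = 0.1 * xy[:, 0] + 0.05 * xy[:, 1] + 1.0
pts = np.column_stack([xy, z])
c = fit_lane_surface(pts)
# For planar input the quadratic terms fit to ~0 and the linear
# terms recover the slope coefficients.
```

Evaluating the fitted surface over a grid then gives the first lane candidate area.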



Abstract

The invention discloses a visual and laser radar multi-level fusion-based lane detection method and system. The method is realized by installing a laser radar and a vehicle-mounted camera on a vehicle, and comprises the following steps: calibrating the obtained point cloud data and video images; fusing the height information and reflection intensity information of the point cloud data with the RGB information of the video image to construct a point cloud clustering model, obtaining a lane point cloud based on the point cloud clustering model, performing least-squares fitting on the lane point cloud to obtain a lane curved surface, and obtaining a first lane candidate area; fusing the reflection intensity information in the point cloud data with the RGB information of the video image to obtain four-channel road information; inputting the four-channel road information into a pre-trained semantic segmentation network 3D-LaneNet, and outputting an image of a second lane candidate region; and fusing the first lane candidate region and the second lane candidate region, taking the union of the two lane candidate regions as the final lane region. The method improves the accuracy of lane detection in complex road scenes.

Description

Technical Field

[0001] The invention relates to the technical field of automatic driving, in particular to a lane detection method and system based on multi-level fusion of vision and laser radar.

Background Technique

[0002] Lane detection in road scenes is a key technical link in realizing automatic driving: it ensures that vehicles drive within the lane limits and avoid collisions with targets such as pedestrians outside the lane. Subsequent detection of lane lines within the effective lane area is then faster and more accurate, so that the vehicle can safely and automatically drive in the correct lane.

[0003] It is relatively easy for humans to identify lanes on the road, but in complex scenes such as strong light, heavy fog, and night, even human lane identification capability is limited. To realize automatic driving, accurate detection of lanes in complex scenes is necessary. Most existing lane detect...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/62, G01S17/87, G01S17/93, G01S7/48, G06V10/56, G06V10/764
CPC: G01S17/87, G01S7/4802, G01S7/4808, G06V20/588, G06F18/25, G01S17/86, G01S17/931, G06V10/454, G06V10/56, G06V10/82, G06V10/764, G06V10/806, G06F18/2413, G06F18/253, G06F18/23, G06F18/214, G06F18/251
Inventors: 张新钰, 李志伟, 刘华平, 李骏, 李太帆, 周沫, 谭启凡
Owner TSINGHUA UNIV