
Multi-lane-line detection method

A lane-line detection technology, applied in instruments, computing, and character and pattern recognition. It addresses the problems that existing methods cannot make full use of the strip-like features of lane lines, cannot obtain good fitting results, and cannot detect multiple lane lines robustly, achieving high practical value, good detection results, and reduced computation time.

Active Publication Date: 2017-08-15
NANJING UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0006] However, none of the above-mentioned approaches can robustly detect multiple lane lines. The main problems are as follows: first, the entire image is processed uniformly in the preprocessing stage, so interference such as illumination changes and shadows strongly affects the result; second, if the RANSAC algorithm is used to fit the point set directly to a lane-line equation, the strip-like feature of lane lines in the detection task is not exploited, so RANSAC cannot obtain good results within a limited computation budget.
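The strip-feature critique above can be made concrete. A plain RANSAC fit treats the lane line as a zero-width curve; counting inliers inside a band whose half-width matches the painted line rewards hypotheses aligned with the whole strip. The following is a minimal sketch of that idea, not the patent's improved algorithm; the function name, the quadratic model, and all thresholds are illustrative assumptions:

```python
import random
import numpy as np

def ransac_quadratic(points, half_width=0.15, iterations=200, seed=0):
    """Fit x = a*y^2 + b*y + c by RANSAC, counting inliers inside a strip.

    points: (N, 2) array of (x, y) samples. half_width models half the
    painted lane-line width, so the inlier test matches the strip feature
    rather than an arbitrary residual threshold.
    """
    rng = random.Random(seed)
    pts = np.asarray(points, dtype=float)
    best_coeffs, best_count = None, -1
    for _ in range(iterations):
        sample = pts[rng.sample(range(len(pts)), 3)]
        # Solve the 3x3 Vandermonde system for the candidate parabola.
        A = np.vander(sample[:, 1], 3)          # columns: y^2, y, 1
        try:
            coeffs = np.linalg.solve(A, sample[:, 0])
        except np.linalg.LinAlgError:
            continue                            # degenerate sample, skip
        # Inliers: points whose horizontal distance falls inside the strip.
        pred_x = np.polyval(coeffs, pts[:, 1])
        count = int(np.sum(np.abs(pts[:, 0] - pred_x) <= half_width))
        if count > best_count:
            best_coeffs, best_count = coeffs, count
    return best_coeffs, best_count
```

With the strip half-width as the inlier criterion, a hypothesis that runs along the painted line collects inliers across its full length, which is the effect the document attributes to exploiting the strip feature.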

Method used




Embodiment Construction

[0024] The present invention is described in more detail below in conjunction with specific embodiments. The following embodiments will help those skilled in the art understand the invention, but do not limit it in any form. It should be noted that those skilled in the art can make several modifications and improvements without departing from the concept of the invention, and these remain within its scope of protection.

[0025] This embodiment provides a multi-lane-line detection method. First, the red channel of the color image collected by the camera is separated and used as the grayscale image to be processed. Then, based on the camera's mounting position on the autonomous vehicle, the camera's intrinsic parameters, and the chosen grid-map size, the original image is subjected to inverse perspective transformation to obtain a grid map. Secondly, ac...
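The inverse perspective transformation in the step above can be sketched as a homography between the image plane and the bird's-eye grid map. In practice the camera pose and intrinsics fix this mapping; in the sketch below it is estimated from four ground-plane correspondences whose coordinates are illustrative placeholders, not values from the patent:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography mapping src -> dst from four point
    correspondences via the standard direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The null vector of A (smallest singular value) is the homography.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)

def warp_point(H, pt):
    """Apply homography H to one image point, returning grid-map coords."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

# Hypothetical correspondences: a road trapezoid in the image mapped to a
# rectangle in the bird's-eye grid map (units: grid cells).
image_quad = [(300, 400), (500, 400), (700, 600), (100, 600)]
grid_rect = [(0, 0), (100, 0), (100, 200), (0, 200)]
H = homography_from_points(image_quad, grid_rect)
```

Warping every pixel of the grayscale image through `H` (or, equivalently, sampling the image through the inverse mapping) yields the grid map on which the later region-division and fitting steps operate.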



Abstract

The invention discloses a multi-lane-line detection method. Color images are obtained as continuous frames and the current frame is converted to grayscale; inverse perspective transformation is applied to the grayscale image, based on the camera parameters and a set grid-map size, to obtain a grid map of the road region. According to prior information about control points, the grid map is divided into regions using Thiessen polygons, and image binarization is performed on each region. Region grouping is then carried out based on prior information about the lane lines, and the pixel coordinates of non-zero pixels are recorded. Curve fitting is performed on each grouped point set using an improved RANSAC algorithm; the control points of the current image are predicted by combining the lane-line equations with a particle filter; and the final lane-line equations are computed and transformed back into the original image, yielding the lane-line equations in the original image. The method offers high lane-line detection precision and robustness, and can detect multiple lane lines simultaneously.
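The Thiessen-polygon (Voronoi) region division described in the abstract amounts to assigning each grid-map cell to its nearest control point, after which each region can be binarized with its own threshold so lighting differences between regions do not share one global cutoff. A numpy sketch under that reading, with control points and thresholds as illustrative assumptions:

```python
import numpy as np

def voronoi_labels(h, w, control_points):
    """Label each (row, col) cell of an h x w grid map with the index of
    its nearest control point (a discrete Thiessen-polygon partition)."""
    rows, cols = np.mgrid[0:h, 0:w]
    cells = np.stack([rows, cols], axis=-1).astype(float)      # (h, w, 2)
    cps = np.asarray(control_points, dtype=float)              # (k, 2)
    # Squared distance from every cell to every control point.
    d2 = ((cells[:, :, None, :] - cps[None, None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=-1)                                  # (h, w)

def binarize_per_region(gray, labels, thresholds):
    """Binarize each Voronoi region with its own threshold, so each
    region's lighting determines its own cutoff."""
    out = np.zeros_like(gray, dtype=np.uint8)
    for idx, t in enumerate(thresholds):
        mask = labels == idx
        out[mask] = (gray[mask] >= t).astype(np.uint8)
    return out
```

The non-zero pixels of the per-region binary map, grouped by label, then supply the point sets that the improved RANSAC step fits with curves.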

Description

Technical field

[0001] The invention belongs to the fields of computer vision and automatic driving, and in particular relates to a multi-lane-line detection method.

Background technique

[0002] In the field of automatic driving, lane-line detection is an important link and a hot research topic. However, most current lane-line detection methods based on computer vision cannot detect lane lines robustly. The main reasons are that illumination, shadows on the road, other obstacles on the road (vehicles, pedestrians, etc.), and other information such as traffic markings interfere with the detection of lane lines.

[0003] At present, there are many computer-vision-based multi-lane-line detection algorithms. Many journals and conferences related to autonomous driving, both domestic and international, have listed lane detection as a key research area, and scholars at home and abroad have also made a lot of b...


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06K9/00, G06K9/34, G06K9/46
CPC: G06V20/588, G06V10/267, G06V10/44
Inventor: 陈必科杨健宫辰钱建军
Owner: NANJING UNIV OF SCI & TECH