
Lane line classification method and system adopting cascade network

A cascade network and classification method technology, applied in the field of computer vision / lane line detection, which can solve problems such as poor real-time performance, long processing time and large memory consumption, and achieve the effect of improved real-time performance.

Active Publication Date: 2021-08-06
HEFEI INSTITUTES OF PHYSICAL SCIENCE - CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0004] The technical problem to be solved by the present invention is that prior-art lane line classification methods and systems pass lane line information to the decision-making module in a pixel-level manner, which occupies a large amount of memory and consumes considerable time, and is therefore not conducive to the real-time performance of an automatic driving system.

Method used



Examples


Embodiment 1

[0048] A method for classifying lane lines using a cascaded network, the method comprising:

[0049] Step S1: preprocess the ERFNet network to obtain the UERFNet network, train the UERFNet network, and use the trained UERFNet network as the lane line positioning network. ERFNet is selected as the benchmark network; the ERFNet network includes 23 layers, of which layers 1-16 form the encoder and layers 17-23 form the decoder. In order to restore the scale of the features extracted by the network to the size of the original image, the decoder stage of ERFNet directly uses an upsampling operation to reach the original image size. Preprocessing the ERFNet network includes:

[0050] Upsampling the output of the 16th layer of the ERFNet network and feeding the upsampling result into the 17th layer of the ERFNet network; inputting the output of the first layer of the ERFNet network into its 23rd layer; and inputting the output of the second layer of the ERFNet network ...
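The excerpt above describes routing encoder outputs into decoder layers so that ERFNet's plain upsampling decoder becomes a U-shaped network. The sketch below is a minimal, hypothetical PyTorch illustration of that idea; the stage names, channel widths and which encoder layers are bridged are assumptions for illustration, not the actual UERFNet layer mapping from the patent.

```python
# Hypothetical sketch of U-Net-style skip connections grafted onto an
# ERFNet-like encoder/decoder ("UERFNet"). Stage names, channel widths and
# the bridged layers are illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class UERFNetSketch(nn.Module):
    def __init__(self, num_classes: int = 5):
        super().__init__()
        # Stand-ins for the ERFNet encoder stages (layers 1-16 in the text).
        self.enc1 = nn.Sequential(nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv2d(16, 64, 3, stride=2, padding=1), nn.ReLU())
        self.enc3 = nn.Sequential(nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())
        # Stand-ins for the decoder stages (layers 17-23 in the text).
        self.dec1 = nn.Sequential(nn.Conv2d(128, 64, 3, padding=1), nn.ReLU())
        self.dec2 = nn.Sequential(nn.Conv2d(64 + 64, 16, 3, padding=1), nn.ReLU())
        self.dec3 = nn.Conv2d(16 + 16, num_classes, 3, padding=1)

    def forward(self, x):
        s1 = self.enc1(x)   # 1/2 resolution, 16 channels
        s2 = self.enc2(s1)  # 1/4 resolution, 64 channels
        s3 = self.enc3(s2)  # 1/8 resolution, 128 channels

        # Decoder with skip connections: each stage upsamples and fuses the
        # encoder output of matching resolution, turning the plain ERFNet
        # decoder into a U-shaped one.
        d1 = self.dec1(F.interpolate(s3, size=s2.shape[-2:],
                                     mode="bilinear", align_corners=False))
        d2 = F.interpolate(self.dec2(torch.cat([d1, s2], dim=1)),
                           size=s1.shape[-2:], mode="bilinear", align_corners=False)
        d3 = self.dec3(torch.cat([d2, s1], dim=1))  # per-pixel lane logits
        # Restore the original image size, as the decoder description requires.
        return F.interpolate(d3, size=x.shape[-2:],
                             mode="bilinear", align_corners=False)


if __name__ == "__main__":
    # Quick shape check on a dummy 360x640 RGB frame.
    out = UERFNetSketch(num_classes=5)(torch.randn(1, 3, 360, 640))
    print(out.shape)  # torch.Size([1, 5, 360, 640])
```

Fusing shallow, high-resolution encoder features into the decoder in this way is what lets the upsampled output recover fine lane boundaries at the original image size.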

Embodiment 2

[0080] Corresponding to Embodiment 1, Embodiment 2 of the present invention further provides a lane line classification system using a cascade network, the system comprising:

[0081] a lane line positioning network construction module, used to preprocess the ERFNet network to obtain the UERFNet network, train the UERFNet network, and use the trained UERFNet network as the lane line positioning network;

[0082] a position point acquisition module, used to input the image of the lane lines to be classified into the lane line positioning network to obtain the position points of each lane line;

[0083] a feature extraction module, used to extract the pixel values corresponding to the position points of each lane line in the original image to form a feature map of each lane line;

[0084] a lane classification network construction module, used to cascade multiple bottleneck layers and fully connected layers to construct a lane classification network (see the sketch following this list);

[0085] a lane category acq...
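A minimal sketch of a classification head of the kind described by the lane classification network construction module: a cascade of bottleneck layers followed by fully connected layers. The block structure, channel counts and the number of lane categories are illustrative assumptions; the excerpt above does not disclose these values.

```python
# Hypothetical sketch of the lane line classification head: cascaded
# bottleneck blocks followed by fully connected layers. All sizes and the
# number of lane categories are assumptions for illustration.
import torch
import torch.nn as nn


class Bottleneck(nn.Module):
    """1x1 -> 3x3 -> 1x1 bottleneck with a residual connection."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        mid = channels // reduction
        self.block = nn.Sequential(
            nn.Conv2d(channels, mid, 1), nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(mid, channels, 1),
        )

    def forward(self, x):
        return torch.relu(x + self.block(x))


class LaneClassifierSketch(nn.Module):
    def __init__(self, in_channels: int = 3, num_categories: int = 4):
        super().__init__()
        self.stem = nn.Conv2d(in_channels, 32, 3, stride=2, padding=1)
        # Cascade of bottleneck layers, as the module description states.
        self.bottlenecks = nn.Sequential(*[Bottleneck(32) for _ in range(3)])
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Fully connected layers mapping pooled features to lane categories
        # (e.g. solid / dashed / double -- illustrative labels only).
        self.fc = nn.Sequential(nn.Linear(32, 64), nn.ReLU(inplace=True),
                                nn.Linear(64, num_categories))

    def forward(self, lane_feature_map):
        x = self.bottlenecks(self.stem(lane_feature_map))
        return self.fc(self.pool(x).flatten(1))


if __name__ == "__main__":
    # One forward pass on a dummy per-lane feature map.
    logits = LaneClassifierSketch()(torch.randn(1, 3, 64, 256))
    print(logits.shape)  # torch.Size([1, 4])
```

Because the head classifies one compact per-lane feature map rather than a full pixel-level mask, its forward pass is cheap, which is the memory and time saving the problem statement refers to.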



Abstract

The invention discloses a lane line classification method and system adopting a cascade network. The method comprises the steps of: preprocessing an ERFNet network to obtain a UERFNet network, training the UERFNet network, and taking the trained UERFNet network as a lane line positioning network; inputting the image of the to-be-classified lane lines into the lane line positioning network to obtain the position points of each lane line; extracting the pixel values corresponding to the position points of each lane line in the original image to form a feature map of each lane line; cascading multiple bottleneck layers and fully connected layers to construct a lane classification network; and inputting the feature map of each lane line into the lane classification network and outputting the category of each lane line. The invention has the advantage that the processing time per image is very short, which guarantees the real-time performance of the automatic driving system.
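To make the abstract's pipeline concrete, the hypothetical glue function below wires the three stages together: run the positioning network, gather original-image pixels around each lane's position points into a compact per-lane feature map, and pass that map to the classification network. The patch size, point sampling strategy, function name and the reuse of the two sketch networks above are assumptions, not details taken from the patent.

```python
# Hypothetical end-to-end glue: positioning network -> per-lane feature map
# built from pixels at the lane's position points -> classification network.
import torch
import torch.nn.functional as F


def classify_lanes(image, positioning_net, classifier, patch=8, num_points=32):
    """image: (3, H, W) tensor; returns {lane_id: category_index}."""
    seg = positioning_net(image.unsqueeze(0)).argmax(dim=1)[0]  # (H, W) lane ids
    categories = {}
    for lane_id in seg.unique().tolist():
        if lane_id == 0:  # class 0 assumed to be background
            continue
        ys, xs = torch.nonzero(seg == lane_id, as_tuple=True)
        # Sub-sample a fixed number of position points along the lane.
        idx = torch.linspace(0, len(ys) - 1,
                             steps=min(num_points, len(ys))).long()
        crops = []
        for y, x in zip(ys[idx].tolist(), xs[idx].tolist()):
            y0, x0 = max(0, y - patch // 2), max(0, x - patch // 2)
            crop = image[:, y0:y0 + patch, x0:x0 + patch]
            # Pad border crops so every patch has the same size.
            crop = F.pad(crop, (0, patch - crop.shape[2],
                                0, patch - crop.shape[1]))
            crops.append(crop)
        # Stack the per-point patches side by side into one per-lane map.
        lane_map = torch.cat(crops, dim=2).unsqueeze(0)  # (1, 3, patch, N*patch)
        categories[lane_id] = classifier(lane_map).argmax(dim=1).item()
    return categories
```

Usage (under the same assumptions): `classify_lanes(frame, UERFNetSketch(), LaneClassifierSketch())` returns one category index per detected lane line, so only small per-lane feature maps, rather than a full pixel-level mask, reach the classification stage.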

Description

technical field

[0001] The present invention relates to computer vision / lane line detection, and more particularly to a lane line classification method and system using a cascaded network.

Background technique

[0002] Self-driving cars acquire environmental data through on-board vision cameras, sensors and other devices, and then use computers to process the collected data to perceive and recognize environmental information, thereby automatically controlling and adjusting the driving speed and direction of the vehicle to avoid other vehicles. Among them, the lane line is a basic marking of the road, which ensures safe and orderly driving. Correct lane line recognition enables a self-driving car to make further decisions and judgments about its location and state, thereby ensuring that the vehicle drives in a safe state. However, so far, research on lane line recognition has been limited to good weather conditions and simple road conditions. Moreover, t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V20/588, G06N3/045, G06F18/2415, Y02T10/40
Inventors: 孔斌, 张露, 王灿, 杨静
Owner: HEFEI INSTITUTES OF PHYSICAL SCIENCE - CHINESE ACAD OF SCI