
A method and apparatus for driving area detection

A driving-area detection method and equipment technology, applied in character and pattern recognition, instruments, computer components, etc. It addresses the problems that current detection is greatly affected by the environment, cannot accurately detect the driving area, and cannot obtain effective and accurate three-dimensional distance information, achieving the effect of strong usability and robustness.

Active Publication Date: 2019-05-10
深兰人工智能芯片研究院(江苏)有限公司
View PDF · 5 Cites · 9 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] The present invention provides a method and equipment for detecting a driving area, which solves the problem in the prior art that camera-sensor-based driving area detection cannot obtain effective and accurate three-dimensional distance information, is greatly affected by the environment, and therefore cannot accurately detect the driving area.

Method used


Image

  • A method and apparatus for driving area detection

Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0068] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, rather than all of them. All other embodiments obtained by persons of ordinary skill in the art, based on the embodiments of the present invention and without creative effort, fall within the protection scope of the present invention.

[0069] The following explains some terms that appear in this text:

[0070] 1. The term "and/or" in the embodiments of the present invention describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the contextual ob...


PUM

No PUM

Abstract

The invention discloses a method and equipment for detecting a driving area, relating to the technical field of automatic driving. The method solves the problem that current driving-area detection cannot obtain accurate three-dimensional distance information and therefore cannot accurately detect the driving area. The method comprises the following steps. First feature information of road surface points and second feature information of road shoulder points in a bird's-eye-view feature map are determined by a neural-network segmentation model from the average reflection intensity and height-coding features of the grids in the bird's-eye-view feature map, where the bird's-eye-view feature map is obtained by rasterizing a point cloud map. Road surface points in the point cloud map corresponding to the road surface points in the bird's-eye-view feature map are determined according to the first feature information, and road shoulder points in the point cloud map corresponding to the road shoulder points in the bird's-eye-view feature map are determined according to the second feature information. The road surface points and road shoulder points in the point cloud map are then subjected to geometric model fitting to determine the driving area. Because the driving area is detected with a deep-learning method, the accuracy is high.
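The pipeline described in the abstract begins by rasterizing the LiDAR point cloud into a bird's-eye-view (BEV) grid whose cells carry average reflection intensity and height-coding features. The patent text does not include code; the following NumPy sketch illustrates only that rasterization step. The detection ranges, cell size, choice of min/max/mean height channels, and all function and parameter names are illustrative assumptions, not the patent's specification.

```python
import numpy as np

def rasterize_to_bev(points, x_range=(0.0, 60.0), y_range=(-30.0, 30.0), cell=0.2):
    """Rasterize a point cloud into a bird's-eye-view feature map.

    `points` is an (N, 4) array of (x, y, z, reflection_intensity).
    Each occupied grid cell stores the average reflection intensity
    and a simple height encoding (max and min z). Returns an array of
    shape (3, nx, ny): channel 0 = avg intensity, 1 = max z, 2 = min z.
    """
    nx = int((x_range[1] - x_range[0]) / cell)
    ny = int((y_range[1] - y_range[0]) / cell)

    # Cell index for each point; discard points outside the grid.
    ix = ((points[:, 0] - x_range[0]) / cell).astype(int)
    iy = ((points[:, 1] - y_range[0]) / cell).astype(int)
    keep = (ix >= 0) & (ix < nx) & (iy >= 0) & (iy < ny)
    ix, iy, pts = ix[keep], iy[keep], points[keep]

    flat = ix * ny + iy  # flattened 2-D cell index
    count = np.zeros(nx * ny)
    inten_sum = np.zeros(nx * ny)
    z_max = np.full(nx * ny, -np.inf)
    z_min = np.full(nx * ny, np.inf)

    # Unbuffered scatter-accumulation into cells (handles duplicates).
    np.add.at(count, flat, 1.0)
    np.add.at(inten_sum, flat, pts[:, 3])
    np.maximum.at(z_max, flat, pts[:, 2])
    np.minimum.at(z_min, flat, pts[:, 2])

    occupied = count > 0
    avg_intensity = np.zeros(nx * ny)
    avg_intensity[occupied] = inten_sum[occupied] / count[occupied]
    z_max[~occupied] = 0.0  # neutral value for empty cells
    z_min[~occupied] = 0.0

    return np.stack([avg_intensity, z_max, z_min]).reshape(3, nx, ny)
```

The resulting (3, nx, ny) tensor is the kind of input a segmentation network could consume to label cells as road surface or road shoulder, after which the cell labels would be mapped back to the raw points for geometric model fitting, as the abstract outlines.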

Description

Technical Field

[0001] The invention relates to the technical field of automatic driving, and in particular to a method and equipment for detecting a driving area.

Background

[0002] In recent years, with the rapid development of the automobile industry, traffic accidents have become a global problem; it is estimated that deaths and injuries in traffic accidents worldwide exceed 500,000 people every year. Unmanned driving technology emerged in response. In order to perceive environmental information in real time and reliably, unmanned vehicles are equipped with various active and passive sensors, including cameras, lidar, millimeter-wave radar and GPS (Global Positioning System). Driving area detection is one of the key parts of this technology.

[0003] Image-based driving area detection technology is mainly based on camera sensors. Through image and video analysis techniques, such as pixel-level semantic segmentation, the pixel points ...

Claims


Application Information

Patent Timeline
No application data
IPC(8): G06K9/32, G06K9/34, G06K9/62
Inventor 陈海波
Owner 深兰人工智能芯片研究院(江苏)有限公司