
Multi-laser radar fusion method based on vehicle-road cooperation

A multi-lidar fusion method based on vehicle-road collaboration, applied in the automotive field. It addresses problems such as the low accuracy and effectiveness of target and obstacle perception, the inability to detect distant or occluded targets, and the inability to respond to emergency scenarios, with the effect of improving perception accuracy and efficiency.

Pending Publication Date: 2021-03-09
ZHEJIANG GEELY HLDG GRP CO LTD +1
Cites: 0 · Cited by: 4

AI Technical Summary

Problems solved by technology

[0004] However, existing single-vehicle multi-lidar fusion schemes, such as lidars installed on the roof and at the front and rear of the vehicle, cannot detect distant or occluded targets, nor can they handle emergency scenarios in which targets appear suddenly. In addition, although the prior art includes solutions that install lidar at the roadside, and the maturing of 5G technology and road infrastructure has accelerated the development of technologies such as V2X vehicle networking and vehicle-road coordination, so that communication and interaction between vehicles, and between vehicles and the road, can be realized over cellular networks or WiFi, the target data detected by the road-end lidar and the vehicle-end lidar still exist independently of each other. The prior art therefore cannot provide accurate and complete target data, and the perception of targets and obstacles suffers from low accuracy and effectiveness.

Method used



Examples


Embodiment Construction

[0065] The following specific embodiments of the present invention, described in conjunction with the accompanying drawings, further illustrate the technical solutions of the present invention; however, the present invention is not limited to these embodiments.

[0066] As shown in Figure 1, the multi-lidar fusion method based on vehicle-road coordination includes the following steps:

[0067] Step 1: Perform point cloud processing on the target information detected by the lidars installed at the roadside and on the vehicle to obtain the point cloud data corresponding to each lidar. The specific point cloud processing steps are: binary packet parsing, in which the point cloud library software converts and parses the raw point cloud data of the target information into point cloud data in PCD format; the point cloud data includes the three-dimensional spatial position of each lidar reflection point.
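
The parsing step above is described only at a high level. As a minimal sketch, assuming the binary packets have already been decoded into an N x 3 array of reflection-point coordinates (the decode_packets helper below is hypothetical, not named in the patent), the conversion to PCD format could look like this in Python:

```python
import numpy as np

def points_to_pcd(points: np.ndarray, path: str) -> None:
    """Write an (N, 3) array of (x, y, z) reflection points to an ASCII PCD file.

    Sketch of the PCD conversion step in [0067]; a production pipeline would
    typically delegate this to the Point Cloud Library (PCL) instead.
    """
    n = points.shape[0]
    header = (
        "# .PCD v0.7 - Point Cloud Data file format\n"
        "VERSION 0.7\n"
        "FIELDS x y z\n"
        "SIZE 4 4 4\n"
        "TYPE F F F\n"
        "COUNT 1 1 1\n"
        f"WIDTH {n}\n"
        "HEIGHT 1\n"
        "VIEWPOINT 0 0 0 1 0 0 0\n"
        f"POINTS {n}\n"
        "DATA ascii\n"
    )
    with open(path, "w") as f:
        f.write(header)
        np.savetxt(f, points, fmt="%.4f")

# Usage (decode_packets is a hypothetical packet decoder, not from the patent):
# xyz = decode_packets(raw_udp_packets)   # -> (N, 3) float array in metres
# points_to_pcd(xyz, "frame_000123.pcd")
```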

[0068] Delete the point cloud data outside the region of interest ...
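
The region-of-interest filter is not detailed further in this excerpt. One hedged illustration, with made-up rectangular bounds in the lidar coordinate frame, is:

```python
import numpy as np

def crop_to_roi(points: np.ndarray,
                x_range=(-50.0, 50.0),
                y_range=(-20.0, 20.0),
                z_range=(-2.0, 4.0)) -> np.ndarray:
    """Keep only points inside an axis-aligned region of interest.

    The patent only states that points outside the area of interest are
    deleted; the rectangular bounds used here are illustrative placeholders.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    mask = ((x >= x_range[0]) & (x <= x_range[1]) &
            (y >= y_range[0]) & (y <= y_range[1]) &
            (z >= z_range[0]) & (z <= z_range[1]))
    return points[mask]
```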



Abstract

The invention provides a multi-laser radar fusion method based on vehicle-road cooperation, belonging to the technical field of automobiles. The invention aims to solve the problem that the prior art cannot provide accurate and complete target data. The multi-laser radar fusion method based on vehicle-road cooperation comprises the following steps: performing point cloud processing on the target information detected by laser radars installed at the road end and the vehicle end to obtain the point cloud data corresponding to each laser radar; performing clustering, tracking and spatial synchronization processing on the point cloud data in sequence to form the target list corresponding to each laser radar; performing time synchronization processing on each target list to obtain the target lists at the same moment; extracting all the point cloud data corresponding to the same target in the target lists at the same moment; and performing clustering and tracking processing on all the extracted point cloud data in sequence, and then fusing them to form a final target list. According to the invention, perception information with higher precision and better stability can be obtained.
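
The abstract does not specify how "the same target" is identified across the road-end and vehicle-end lists before their point clouds are pooled and re-clustered. As a non-authoritative sketch of two supporting steps, spatial synchronization and cross-list target association, the example below aligns road-end target centroids into the vehicle frame using extrinsic calibration and gates them by distance; the calibration values, centroids and the 2 m gate are assumptions, not figures from the patent:

```python
import numpy as np

def transform_to_vehicle_frame(points: np.ndarray,
                               rotation: np.ndarray,
                               translation: np.ndarray) -> np.ndarray:
    """Spatial synchronization: express road-end coordinates in the vehicle frame.

    rotation (3x3) and translation (3,) describe the roadside lidar's pose
    relative to the vehicle lidar and must come from extrinsic calibration.
    """
    return points @ rotation.T + translation

def associate_targets(vehicle_targets: np.ndarray,
                      road_targets: np.ndarray,
                      gate: float = 2.0):
    """Pair targets whose centroids lie within `gate` metres of each other.

    Unmatched road-end targets could then be kept as new objects, e.g. distant
    or occluded targets that the vehicle-end lidars alone could not see.
    """
    pairs = []
    for i, v in enumerate(vehicle_targets):
        d = np.linalg.norm(road_targets - v, axis=1)
        j = int(np.argmin(d))
        if d[j] < gate:
            pairs.append((i, j))
    return pairs

# Tiny usage example with made-up centroids (metres):
vehicle_list = np.array([[10.0, 1.0, 0.0], [25.0, -3.0, 0.0]])   # vehicle frame
road_list_raw = np.array([[40.0, 2.0, 0.5], [95.0, -1.0, 0.5]])  # roadside frame
R, t = np.eye(3), np.array([-30.0, -1.0, -0.5])                  # assumed calibration
road_list = transform_to_vehicle_frame(road_list_raw, R, t)
print(associate_targets(vehicle_list, road_list))                # -> [(0, 0)]
```

Once matched, the point clouds attributed to the same physical target would be concatenated and passed through the clustering and tracking stages again, as the abstract describes, to produce the fused target list.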

Description

Technical Field

[0001] The invention belongs to the technical field of automobiles, and relates to a multi-laser radar fusion method based on vehicle-road coordination.

Background Technique

[0002] In recent years, smart cars (also called driverless or self-driving cars) have become a research hotspot in the field of automotive engineering worldwide and a new driving force for the growth of the automotive industry.

[0003] Perception fusion is a research hotspot in the field of intelligent driving, and a large number of new fusion technologies have been proposed. For example, in functions such as automatic parking assistance and automated valet parking, fusing information from surround-view cameras, ultrasonic radar, millimeter-wave radar and other sensors helps the car complete actions such as obstacle avoidance and finding parking spaces. In side-facing functions such as blind-spot detection, door-opening warning and lane-change warning, through the fusion of various sensors such...

Claims


Application Information

IPC(8): G01S17/87; G01S17/931; G01S7/48; G06K9/62
CPC: G01S17/87; G01S17/931; G01S7/4802; G06F18/23
Inventors: 余舟, 邓堃, 张军, 陈文琳
Owner: ZHEJIANG GEELY HLDG GRP CO LTD