A 3D object perception method in vehicle edge scene

A 3D object and scene perception technology, applied in the field of 3D object perception, which addresses the problems of poor generalization, easy loss of target information, and long processing time in existing methods, and achieves the effects of reduced point cloud processing time and improved generalization.

Active Publication Date: 2022-08-02
GUANGDONG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0007] The purpose of the present invention is to provide a three-dimensional object perception method for vehicle-mounted edge scenes, so as to solve the problems of poor generalization and easy loss of target information in existing traditional point cloud perception methods and to improve real-time detection and tracking accuracy.



Examples


Embodiment Construction

[0043] The invention provides a three-dimensional target perception method in a vehicle-mounted edge scene, which uses point cloud projection and two-dimensional image fusion to realize three-dimensional target perception and tracking on a vehicle-mounted system. With the algorithm optimized for parallel computing, the method first performs filtering and segmentation on the point cloud data, then performs point cloud classification and feature value extraction, then projects the point cloud onto the two-dimensional image and clusters it in combination with the image, and finally matches the information points and associates targets using the data of the preceding and following frames to achieve matching and tracking. The method also solves the problem of performing target recognition by combining the lidar returns with the camera images and deploying the pipeline on a small terminal device. After the...
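The patent text does not provide source code for the projection step described above. The following is a minimal Python sketch of projecting lidar points onto a camera image, assuming a known 4x4 lidar-to-camera extrinsic transform and a 3x3 intrinsic matrix (hypothetical placeholders for the vehicle's calibration), using numpy only.

```python
import numpy as np

def project_lidar_to_image(points_xyz, T_cam_lidar, K, image_shape):
    """Project lidar points (N, 3) into pixel coordinates.

    points_xyz  -- points in the lidar frame, shape (N, 3)
    T_cam_lidar -- 4x4 extrinsic transform, lidar frame -> camera frame
    K           -- 3x3 camera intrinsic matrix
    image_shape -- (height, width) of the camera image
    Returns (uv, kept): pixel coordinates (M, 2) and indices of kept points.
    """
    # Homogeneous coordinates, then move the points into the camera frame.
    pts_h = np.hstack([points_xyz, np.ones((points_xyz.shape[0], 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera.
    in_front = pts_cam[:, 2] > 0.1
    pts_cam = pts_cam[in_front]

    # Pinhole projection: u = fx*X/Z + cx, v = fy*Y/Z + cy.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]

    # Discard projections that fall outside the image bounds.
    h, w = image_shape
    in_img = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    kept = np.flatnonzero(in_front)[in_img]
    return uv[in_img], kept
```

The clustering step that follows projection is likewise not specified in the text; grouping the returned pixel coordinates with an off-the-shelf density-based method such as scikit-learn's DBSCAN would be one illustrative choice, not necessarily the patent's.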



Abstract

The invention discloses a three-dimensional target perception method in a vehicle edge scene, which uses point cloud projection and two-dimensional image fusion to realize three-dimensional target perception and tracking on a vehicle-mounted system. The point cloud data is filtered and segmented, point cloud classification and feature value extraction are then performed, the point cloud is then projected onto the two-dimensional image and clustered in combination with the image, and finally the information points are matched and targets are associated using the related data of the preceding and following frames to achieve matching and tracking. The method also solves the problem of performing target recognition by combining lidar and camera images and deploying it on a small terminal device. When applied to a vehicle-mounted device, the method of the present invention achieves accurate recognition and tracking, with the advantages of relatively high generalization and real-time performance.
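The abstract's matching of targets against preceding and following frames is not spelled out as an algorithm. The sketch below shows one common, hypothetical realization, greedy nearest-neighbour association of cluster centroids between consecutive frames with a distance gate; this is an assumption for illustration rather than the patent's exact rule.

```python
import numpy as np

def associate_frames(prev_centroids, curr_centroids, max_dist=2.0):
    """Greedy nearest-neighbour matching of cluster centroids between
    the previous frame and the current frame.

    prev_centroids -- (P, 3) cluster centroids from the previous frame
    curr_centroids -- (C, 3) cluster centroids from the current frame
    max_dist       -- gating distance in metres (assumed value)
    Returns a list of (prev_index, curr_index) matches.
    """
    if len(prev_centroids) == 0 or len(curr_centroids) == 0:
        return []

    # Pairwise Euclidean distances between all centroid pairs.
    dists = np.linalg.norm(
        prev_centroids[:, None, :] - curr_centroids[None, :, :], axis=-1)

    matches, used_prev, used_curr = [], set(), set()
    # Take the globally closest pair first, then the next closest, and so on.
    for flat in np.argsort(dists, axis=None):
        i, j = np.unravel_index(flat, dists.shape)
        if i in used_prev or j in used_curr or dists[i, j] > max_dist:
            continue
        matches.append((int(i), int(j)))
        used_prev.add(i)
        used_curr.add(j)
    return matches
```

Matched indices can then carry a track identifier forward from frame to frame, which yields the matching-and-tracking effect the abstract describes.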

Description

Technical Field

[0001] The invention relates to the vehicle-mounted field of intelligent recognition and multi-sensor fusion, and in particular to a three-dimensional target perception method in a vehicle-mounted edge scene.

Background Technique

[0002] In recent years, with the continuous growth of car ownership, the carrying capacity of roads in many cities has reached saturation, and traffic safety, travel efficiency, energy conservation and emission reduction have become increasingly prominent problems. Intelligent and networked vehicles are generally regarded as an important way to solve these traffic problems.

[0003] As artificial intelligence and computer vision mature, the demand for vision tasks such as object detection and object tracking has increased dramatically in many practical applications at the perception layer and sensors of the Internet of Vehicles architecture. At the same time, research on target detection tec...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/246; G06T7/292; G06T7/143; G06T7/80; G06T1/20; G06V20/56; G06V10/25; G06V10/762; G06V10/764; G06V10/82; G06V10/62; G06K9/62
CPC: G06T7/248; G06T7/292; G06T7/143; G06T7/80; G06T1/20; G06T2207/10016; G06T2207/10028; G06T2207/20084; G06T2207/30252; G06F18/23; G06F18/24
Inventors: 黄泽茵, 钟卓柔, 余荣, 谭北海, 黄梓欣, 李贺, 全芷莹
Owner GUANGDONG UNIV OF TECH