Target detection and motion state estimation method based on vision and laser radar

A technology relating to laser radar and motion state estimation, applied in the field of smart car environment perception, which addresses the problems that lidar lacks visual recognition capability and that visual object detection suffers from low precision and limited detection range

Active Publication Date: 2020-11-17
CHONGQING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

For example, lidar offers a long detection range and high detection accuracy, but it has no visual recognition capability. A vision sensor compensates for this shortcoming of lidar because it can capture object details such as brightness and texture; however, it is affected by weather conditions, can detect and identify targets only in good weather, and its object detection range is limited and its accuracy is not high.

Embodiment Construction

[0056] The technical solutions in the embodiments of the present invention are described clearly and in detail below with reference to the accompanying drawings. The described embodiments are only some of the embodiments of the invention.

[0057] The technical solution by which the present invention solves the above technical problems is as follows:

[0058] The object of the present invention is to provide a target detection and motion state estimation algorithm based on vision and laser radar for vehicles, pedestrians, and other targets under complex mountainous road conditions, using 8 cameras (2 on the front of the vehicle, 2 on the right side, 2 on the left side, and 2 on the rear) and one lidar. The principle of stereo vision and the fusion technology of laser radar are used to solve the target detection and target motion state analysis of complex ro...
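As an illustration of the vision and lidar fusion step, the following is a minimal sketch of projecting lidar points into one camera's image plane using lidar-to-camera extrinsics of the kind obtained from joint calibration. The intrinsic matrix K, the extrinsics (R, t), the point cloud, and all variable names are placeholders assumed for illustration; the patent excerpt does not give concrete calibration values.

```python
# Minimal sketch: project lidar points into a camera frame using
# lidar-to-camera extrinsics from joint calibration.
# K, R, t and the point cloud are illustrative placeholders, not values
# taken from the patent.
import numpy as np

def project_lidar_to_image(points_lidar, K, R, t):
    """Project Nx3 lidar points (lidar frame) onto the image plane.

    points_lidar: (N, 3) xyz coordinates in the lidar coordinate system
    K:            (3, 3) camera intrinsic matrix
    R, t:         lidar-to-camera rotation (3, 3) and translation (3,)
    Returns pixel coordinates and depths of points in front of the camera.
    """
    pts_cam = points_lidar @ R.T + t      # transform into the camera frame
    in_front = pts_cam[:, 2] > 0.0        # keep points with positive depth
    pts_cam = pts_cam[in_front]
    pix = (K @ pts_cam.T).T               # apply intrinsics
    pix = pix[:, :2] / pix[:, 2:3]        # perspective division
    return pix, pts_cam[:, 2]

# Example with placeholder calibration values
K = np.array([[720.0, 0.0, 640.0],
              [0.0, 720.0, 360.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.1, -0.2, 0.0])
cloud = np.random.uniform(-20, 20, size=(1000, 3))
uv, depth = project_lidar_to_image(cloud, K, R, t)
```

Points projected this way can be associated with image pixels or stereo-derived points, which is one common way to fuse and mutually supplement the two sparse point clouds into a single map.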

Abstract

The invention discloses a target detection and motion state estimation method based on vision and laser radar. The method comprises the following steps: 1, installing the cameras and the lidar at their positions, calibrating each pair of binocular cameras, and performing joint calibration between the left front camera of the vehicle and the lidar; the sparse point clouds generated by the stereo vision system and by the lidar are fused and mutually supplemented to form a global three-dimensional point cloud map; 2, detecting targets in the two-dimensional image and in the three-dimensional point cloud with a deep learning method, and fusing the three-dimensional information of image targets generated by stereo vision with the lidar three-dimensional point cloud targets to obtain comprehensive target three-dimensional feature points; and 3, tracking the targets with Kalman filtering according to the target three-dimensional feature points and analyzing their motion states. The key of the method lies in the fusion of the vision and lidar sensors, which improves the intelligent vehicle's ability to perceive the surrounding environment.
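To make step 3 concrete, below is a minimal sketch of tracking a fused three-dimensional target point with a linear Kalman filter and reading the motion state (position and velocity) from the filtered state. A constant-velocity model, the time step dt, and the noise covariances are assumptions made here for illustration; the abstract does not specify the exact filter design.

```python
# Minimal sketch of step 3: track a fused 3D target point with a linear
# Kalman filter and read the motion state (position + velocity) from the
# filtered state. Constant-velocity model and noise values are assumed.
import numpy as np

class KalmanTracker3D:
    def __init__(self, dt=0.1):
        n = 6                                   # state: [x, y, z, vx, vy, vz]
        self.x = np.zeros(n)
        self.P = np.eye(n) * 10.0
        self.F = np.eye(n)
        self.F[:3, 3:] = np.eye(3) * dt         # constant-velocity transition
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # position is observed
        self.Q = np.eye(n) * 0.01               # process noise (placeholder)
        self.R = np.eye(3) * 0.1                # measurement noise (placeholder)

    def step(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the fused 3D feature point z = [x, y, z]
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3], self.x[3:]           # position and velocity estimates

# Example: feed three consecutive fused target positions
tracker = KalmanTracker3D(dt=0.1)
for z in np.array([[10.0, 2.0, 0.0], [10.5, 2.0, 0.0], [11.0, 2.1, 0.0]]):
    pos, vel = tracker.step(z)
```

The velocity components of the filtered state give the target's estimated motion state between frames; in practice one such filter would be maintained per tracked target.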

Description

technical field

[0001] The invention belongs to the field of intelligent vehicle environment perception, and specifically relates to a method for detecting targets such as vehicles and pedestrians and estimating their motion states based on vision and laser radar under complex road conditions in mountainous areas.

Background technique

[0002] With the rapid development of artificial intelligence, machine vision, and other fields in recent years, smart cars have become an important field of research and development in academia and industry. For a smart car, the ability to perceive and understand the surrounding environment in real time is essential. Only when environment perception is accurate, real-time, and reliable can the vehicle plan a correct driving path and thereby achieve safe automated driving. Therefore, environment perception technology is the basic requirement and prerequisite for the safe driving of smart cars.

[0003] Environmental perception te...

Application Information

IPC(8): G06T7/277; G06T7/292; G06T7/33; G06T7/80; G06T5/00; G06N3/08; G06K9/62; G06K9/32; G01S17/931
CPC: G06T7/277; G01S17/931; G06T7/85; G06T7/337; G06T7/292; G06N3/08; G06T2207/10028; G06T2207/20081; G06V10/25; G06V2201/07; G06F18/23; G06T5/80; G06T5/70
Inventor: 高小倩, 冯明驰, 冯辉宗, 岑明, 王字朋, 卜川夏
Owner CHONGQING UNIV OF POSTS & TELECOMM