
3D target motion analysis method based on visual and radar information fusion

A method for 3D target motion analysis based on the fusion of vision and radar information. It addresses problems in existing approaches such as unreliability, poor performance, and the lack of RGB information, and achieves fast, accurate, and scientific motion analysis and prediction with high reliability, good stability, and excellent performance.

Active Publication Date: 2019-09-17
HUNAN UNIV
Cites: 5 · Cited by: 50

AI Technical Summary

Benefits of technology

This patented technology fuses visible camera images with laser-scanner (lidar) data for accurate detection of target movement. By combining these two types of sensing into one pipeline, the system provides fast, precise results that are highly repeatable across different test scenarios without requiring significant modification over time.

Problems solved by technology

The technical problem addressed by the invention is improving the accuracy and reliability of detecting and tracking 3D obstacles, such as vehicles and pedestrians, from camera images and radar data.

Method used


Image

  • 3D target motion analysis method based on visual and radar information fusion

Examples


Embodiment Construction

[0045] As shown in Figure 1, a schematic flow chart of the method of the present invention, the 3D target motion analysis method based on vision and radar information fusion comprises the following steps:

[0046] S1. Construct an initial target detection model and train it to obtain the target detection model; specifically, the initial target detection model uses the Mask R-CNN deep learning detection algorithm, which is trained on the public BDD100K autonomous driving dataset;

[0047] S2. Acquire camera images in real time;

[0048] S3. Use the target detection model obtained in step S1 to detect the camera image acquired in step S2, obtaining the 2D frame and target mask of the target;
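The 2D frame of step S3 can be derived directly from the instance mask that a detector such as Mask R-CNN returns. A minimal numpy sketch (the helper name `mask_to_box` is illustrative, not from the patent):

```python
import numpy as np

def mask_to_box(mask: np.ndarray):
    """Derive a tight 2D bounding box (x_min, y_min, x_max, y_max)
    from a binary instance mask."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # empty mask: nothing detected
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# Toy mask with a single filled rectangle standing in for a detected target
mask = np.zeros((8, 8), dtype=bool)
mask[2:5, 3:7] = True
print(mask_to_box(mask))  # -> (3, 2, 6, 4)
```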

[0049] S4. Use a multi-target tracking algorithm to track the target and obtain the target id; specifically, the following steps are used to track the target and obtain the tar...
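The patent's tracking specifics are truncated here, so as a stand-in, a minimal greedy IoU tracker illustrates how per-frame 2D frames can be given persistent ids in step S4. `IouTracker` and its overlap threshold are assumptions for illustration, not the patented algorithm:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

class IouTracker:
    """Greedy IoU matcher: each detection inherits the id of the
    best-overlapping track from the previous frame, else gets a fresh id."""
    def __init__(self, iou_thresh=0.3):
        self.iou_thresh = iou_thresh
        self.tracks = {}   # id -> last seen box
        self.next_id = 0

    def update(self, boxes):
        assigned, free = {}, dict(self.tracks)
        for box in boxes:
            best_id, best_iou = None, self.iou_thresh
            for tid, tbox in free.items():
                score = iou(box, tbox)
                if score > best_iou:
                    best_id, best_iou = tid, score
            if best_id is None:
                best_id, self.next_id = self.next_id, self.next_id + 1
            else:
                free.pop(best_id)
            assigned[best_id] = box
        self.tracks = assigned
        return assigned

trk = IouTracker()
print(trk.update([(0, 0, 10, 10)]))   # frame 1: target gets a new id
print(trk.update([(1, 1, 11, 11)]))   # frame 2: overlapping box keeps that id
```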



Abstract

The invention discloses a 3D target motion analysis method based on visual and radar information fusion. The method comprises: constructing and training an initial target detection model to obtain a target detection model; obtaining a camera image in real time; detecting the camera image to obtain a 2D frame and a target mask of the target; tracking the target by using a multi-target tracking algorithm to obtain a target id; acquiring laser radar point cloud data in real time; performing joint calibration on the camera image and the laser radar point cloud data to obtain a coordinate conversion relation; projecting the laser radar point cloud data onto the image to obtain projected point cloud data; filtering the point cloud data to obtain the point cloud data belonging only to the target; performing 3D rectangular frame fitting to obtain the 3D coordinates of the target; and calculating the speed and the speed direction of the target, thereby completing the motion analysis of the 3D target. The method enables fast, accurate, and scientific analysis and prediction of 3D target motion, and offers high reliability, good accuracy, and excellent performance.
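The later stages of the pipeline in the abstract (projecting lidar points into the image via the calibration result, filtering by the target mask, fitting a 3D frame, and computing speed from successive positions) can be sketched in numpy. The intrinsics `K`, extrinsics `R`, `t`, and all data below are toy placeholders, and the axis-aligned box fit stands in for whatever rectangular-frame fitting the patent actually specifies:

```python
import numpy as np

# Hypothetical pinhole intrinsics and lidar-to-camera extrinsics;
# real values come from the joint calibration step.
K = np.array([[100.0, 0.0, 50.0],
              [0.0, 100.0, 50.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)

def project_points(pts_lidar):
    """Transform lidar points (N, 3) into the camera frame and project
    to pixels (N, 2), keeping only points in front of the camera."""
    pts_cam = pts_lidar @ R.T + t
    pts_cam = pts_cam[pts_cam[:, 2] > 0]
    uv_h = pts_cam @ K.T
    return uv_h[:, :2] / uv_h[:, 2:3], pts_cam

def filter_by_mask(uv, pts_cam, mask):
    """Keep only the 3D points whose projection lands on the target's mask."""
    h, w = mask.shape
    u, v = np.round(uv[:, 0]).astype(int), np.round(uv[:, 1]).astype(int)
    inb = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    keep = np.zeros(len(uv), dtype=bool)
    keep[inb] = mask[v[inb], u[inb]]
    return pts_cam[keep]

def fit_aabb(pts):
    """Axis-aligned 3D box fit: (min corner, max corner) of the target points."""
    return pts.min(axis=0), pts.max(axis=0)

def velocity(c_prev, c_curr, dt):
    """Speed and unit direction from the target's position in two frames."""
    d = c_curr - c_prev
    speed = np.linalg.norm(d) / dt
    return speed, (d / np.linalg.norm(d) if speed > 0 else d)

pts = np.array([[0.0, 0.0, 10.0],   # projects to the image centre
                [3.0, 0.0, 10.0],   # projects right of centre, off the mask
                [0.0, 0.0, -5.0]])  # behind the camera: dropped
uv, pts_cam = project_points(pts)
mask = np.zeros((100, 100), dtype=bool)
mask[40:60, 40:60] = True           # target occupies the image centre
on_target = filter_by_mask(uv, pts_cam, mask)
```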



Application Information

Owner HUNAN UNIV