
Unmanned ship perception fusion algorithm based on deep learning

A technology combining fusion algorithms and deep learning, applied in the field of deep-learning-based unmanned ship perception fusion. It addresses problems such as limited or short detection distance and environment-dependent sensor performance, achieving the effects of reduced detection cost and enhanced robustness.

Active Publication Date: 2019-11-05
NAVAL UNIV OF ENG PLA
Cites: 10 · Cited by: 19

AI Technical Summary

Problems solved by technology

X-band radar and millimeter-wave radar are all-weather sensors with large detection ranges and long detection distances, and can provide good early warning and tracking of large, distant objects. The advantage of laser radar (lidar) is its high detection accuracy, but multi-line lidar is expensive, its working performance is unstable, and it is easily affected by factors such as weather and visibility. The biggest advantage of visible-light cameras is that they can identify objects and provide a high-level understanding of the environment, but, like lidar, they are strongly affected by the environment: as visibility decreases, their measurement range, measurement distance, and measurement accuracy all decrease. Infrared detection can exploit temperature differences to detect targets under reduced visibility, but its accuracy is also affected by the environment and its detection distance is relatively short. Ultrasonic detection can cope with all kinds of bad weather; its disadvantage is a very limited detection distance.

Method used




Embodiment Construction

[0051] The technical solutions of the present invention are described clearly and completely below in conjunction with the accompanying drawings.

[0052] Figure 1 shows the data flow chart of the fusion system used by the deep-learning-based unmanned ship perception fusion algorithm of the present invention. The functions of each module are introduced as follows:

[0053] ① Image processing module: acquires the camera data and converts each image to an appropriate size through filtering, resizing, and similar operations for the subsequent modules.
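The patent does not disclose which filtering or resizing operations module ① uses, so the following is only an illustrative sketch: a numpy-only 3x3 mean filter followed by a nearest-neighbour resize of a grayscale frame. The function name `preprocess` and its parameters are assumptions, not taken from the source.

```python
import numpy as np

def preprocess(frame: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Smooth a grayscale camera frame with a 3x3 mean filter, then resize
    it (nearest-neighbour) to the input size expected by the detector."""
    # 3x3 mean filter: sum nine shifted copies of an edge-padded frame.
    padded = np.pad(frame, 1, mode="edge")
    smoothed = np.zeros_like(frame, dtype=np.float64)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            smoothed += padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
    smoothed /= 9.0

    # Nearest-neighbour resize to (out_h, out_w).
    ys = np.arange(out_h) * frame.shape[0] // out_h
    xs = np.arange(out_w) * frame.shape[1] // out_w
    return smoothed[np.ix_(ys, xs)]
```

In practice this stage would typically be done with an image library (e.g. OpenCV), but the numpy version shows the data flow: raw frame in, filtered and resized frame out.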

[0054] ② Deep convolutional network module: takes the output of the image processing module as input and detects targets using a pre-trained model.

[0055] ③ Visual inter-frame association module: mainly uses CAMShift to assist the deep convolutional network in tracking targets, mitigating the tendency of lightweight networks to lose ...
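CAMShift is built on mean shift: the search window is repeatedly moved to the centroid of the probability (back-projection) map beneath it, with CAMShift additionally adapting the window size. The core mean-shift iteration behind module ③ can be sketched as follows; the `mean_shift` name and the `(row, col, height, width)` window convention are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def mean_shift(prob: np.ndarray, window, n_iter: int = 10):
    """Move a search window to the centroid of the probability map under
    it, repeating until convergence. `window` is (row, col, height, width);
    returns the converged window."""
    r, c, h, w = window
    for _ in range(n_iter):
        patch = prob[r:r + h, c:c + w]
        total = patch.sum()
        if total == 0:            # no target mass inside the window
            break
        ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
        cy = (ys * patch).sum() / total   # centroid within the patch
        cx = (xs * patch).sum() / total
        nr = int(round(r + cy - h / 2))   # recenter window on centroid
        nc = int(round(c + cx - w / 2))
        nr = max(0, min(nr, prob.shape[0] - h))   # clamp to image bounds
        nc = max(0, min(nc, prob.shape[1] - w))
        if (nr, nc) == (r, c):    # converged
            break
        r, c = nr, nc
    return (r, c, h, w)
```

A full CAMShift tracker (as in OpenCV's `cv2.CamShift`) adds histogram back-projection and window-size adaptation on top of this loop.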



Abstract

The invention provides an unmanned ship perception fusion algorithm based on deep learning, comprising the steps of: 1, carrying out target detection on single-frame images collected by a camera to obtain machine vision data; 2, filtering and clustering the data input by the laser radar, clustering discrete points into target information to obtain radar data; 3, performing data association on the radar data and the machine vision data: after steps 1 and 2 respectively process the camera and radar data, matching the observation targets of the radar data and the machine vision data, establishing a data association model of the two to obtain an association matrix of the radar and vision measurement targets, and screening out targets with a relatively high association degree through this matrix so as to realize perception of the targets. The method fuses the advantages of the radar sensor and the visual sensor, provides sufficient target information for unmanned ship environment perception, and has a degree of robustness that lets it tolerate certain interference.
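The abstract does not specify the association model. One plausible reading of step 3 is a pairwise-distance affinity matrix with distance gating, followed by greedy selection of the highest-affinity radar-vision pairs. The sketch below implements that reading; the function name, the inverse-distance affinity, and the greedy matching are all assumptions, not the patented model.

```python
import numpy as np

def associate(radar_xy: np.ndarray, vision_xy: np.ndarray, gate: float):
    """Build a radar-vision association matrix from pairwise distances and
    greedily pick the highest-affinity pairs inside the gate.
    radar_xy: (M, 2) radar target positions; vision_xy: (N, 2) vision
    target positions in the same coordinate frame."""
    # Pairwise Euclidean distances, shape (M, N), via broadcasting.
    d = np.linalg.norm(radar_xy[:, None, :] - vision_xy[None, :, :], axis=2)
    # Affinity: inverse distance; pairs beyond the gate get affinity 0.
    affinity = np.where(d < gate, 1.0 / (1.0 + d), 0.0)

    pairs = []
    a = affinity.copy()
    while a.max() > 0:
        i, j = np.unravel_index(np.argmax(a), a.shape)
        pairs.append((int(i), int(j)))
        a[i, :] = 0            # each radar target matches at most once
        a[:, j] = 0            # each vision target matches at most once
    return affinity, pairs
```

A globally optimal assignment (e.g. the Hungarian algorithm, `scipy.optimize.linear_sum_assignment`) could replace the greedy loop; greedy screening is shown only because it matches the abstract's "screening out targets with a relatively high association degree" most directly.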

Description

Technical field

[0001] The invention relates to the field of perception fusion for unmanned boats, and in particular to a deep-learning-based algorithm for the perception fusion of unmanned boats.

Background technique

[0002] An Unmanned Surface Vehicle (USV) is a surface unmanned system with autonomous capabilities that can perform specific tasks. Unmanned boats play an important role in both military and civilian applications. With future conflicts largely taking place in littoral regions of the globe, it may no longer be wise to put soldiers at risk against an enemy with increasingly effective weapon systems. USVs will provide the Navy with additional combat power, especially in situations where loss of life is unacceptable. USVs can be deployed in waters unacceptable for manned ships, including high-risk environments or areas contaminated by nuclear, biological or chemical agents. They are reliable, fast, highly maneuverable, and can perform a variety of missions, incl...

Claims


Application Information

IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06V20/52; G06N3/045; G06F18/251; G06F18/241
Inventors: 尹洋, 桂凡, 王征, 陈帅, 李洪科, 王黎明, 卜乐平, 刘小虎, 王家林
Owner NAVAL UNIV OF ENG PLA