Feature extraction method of underwater target based on convolutional neural network

Convolutional neural network technology applied to underwater target feature extraction, addressing problems such as displacement of the observed target, loss of spatial information, and target deformation.

Active Publication Date: 2021-04-20
HARBIN ENG UNIV

AI Technical Summary

Problems solved by technology

[0007] Feature information in two-dimensional images is often well hidden: changes in the position of the observed target and in the observation angle cause deformation, displacement, or even distortion of the target. In this process spatial information is lost and the spatial features cannot be recovered at the SoftMax layer, which reduces classification accuracy and, as the network is continuously adjusted through feedback, indirectly degrades the quality of the extracted features.
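The loss described here can be illustrated with a small, self-contained PyTorch experiment (illustrative only, not taken from the patent): a convolution responds to a shifted feature map with a correspondingly shifted output, whereas a fully connected layer applied to the flattened map preserves no such spatial relationship.

    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # A toy 8x8 single-channel feature map with a small activation blob.
    x = torch.zeros(1, 1, 8, 8)
    x[0, 0, 1:3, 1:3] = 1.0
    x_shifted = torch.roll(x, shifts=(4, 4), dims=(2, 3))  # same blob, moved

    conv = nn.Conv2d(1, 1, kernel_size=3, padding=1, bias=False)
    fc = nn.Linear(64, 64, bias=False)  # flatten + fully connected stand-in

    with torch.no_grad():
        # Convolution is translation-equivariant: shifting the input shifts the output.
        same = torch.allclose(torch.roll(conv(x), (4, 4), (2, 3)),
                              conv(x_shifted), atol=1e-6)
        print(same)  # True: spatial structure is preserved

        # After flattening, a fully connected layer has no such property.
        z, z_shifted = fc(x.flatten(1)), fc(x_shifted.flatten(1))
        moved = torch.roll(z.view(1, 1, 8, 8), (4, 4), (2, 3)).flatten(1)
        print(torch.allclose(moved, z_shifted))  # False: spatial relationship is lost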




Embodiment Construction

[0090] The following examples describe the present invention in more detail.

[0091] Time-frequency domain conversion is performed on the original noise signal to generate a LoFAR spectrogram that represents the time-frequency information. The specific processing is as follows (a code sketch follows these steps):

[0092] 1. Define S(n) as the sampling sequence of the original radiated noise signal and divide it into 25 consecutive segments of 25 sampling points each. Adjacent segments are allowed to overlap, and the overlap ratio is set to 50%.

[0093] 2. Define M_j(n) as the sample sequence of the j-th signal segment, and normalize and center it. The purpose is to distribute the amplitude of the radiated noise signal evenly over time and to remove the DC component so that the sample mean is zero.

[0094] Normalization processing:

[0095]

[0096] In order to facilitate the calculation of the Fourier transform, the ...
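Taken together, paragraphs [0092]-[0096] describe segmentation, normalization and centering, and a short-time Fourier transform. The following is a minimal numpy sketch of that preprocessing under stated assumptions: the Hann analysis window is assumed, and since the normalization formula of [0095] is not reproduced above, zero-mean centering with peak-amplitude scaling is used as a stand-in.

    import numpy as np

    def lofar_spectrogram(s, n_segments=25, seg_len=25, overlap=0.5):
        """Sketch of the preprocessing in paragraphs [0092]-[0096].

        s: 1-D array, the sampled radiated-noise signal S(n). Splits s into
        n_segments segments of seg_len samples with 50% overlap, centers and
        normalizes each segment (assumed form: zero mean, unit peak), then
        takes the magnitude of a windowed Fourier transform per segment.
        """
        hop = int(seg_len * (1.0 - overlap))       # 50% overlap -> hop of 12
        window = np.hanning(seg_len)               # assumed analysis window
        rows = []
        for j in range(n_segments):
            seg = s[j * hop : j * hop + seg_len].astype(float)
            seg = seg - seg.mean()                 # centering: remove DC offset
            peak = np.max(np.abs(seg))
            if peak > 0:
                seg = seg / peak                   # normalization (assumed form)
            rows.append(np.abs(np.fft.rfft(seg * window)))
        return np.stack(rows)                      # time x frequency LoFAR image

    # Example: a synthetic noise record long enough for 25 half-overlapped segments.
    rng = np.random.default_rng(0)
    signal = rng.standard_normal(400)
    lofar = lofar_spectrogram(signal)
    print(lofar.shape)                             # (25, 13)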



Abstract

The invention provides an underwater target feature extraction method based on a convolutional neural network, comprising the following steps:
1. Divide the sampling sequence of the original radiated noise signal into 25 consecutive segments of 25 sampling points each.
2. Normalize and center the sample sequence of the j-th signal segment.
3. Perform a short-time Fourier transform to obtain the LoFAR image.
4. Assign the vector to the existing three-dimensional tensor.
5. Feed the resulting feature vector into the fully connected layer for classification, compute the error against the label data, and check whether the loss error is below the error threshold; if so, stop training the network, otherwise go to step 6.
6. Use the gradient descent method to adjust the network parameters layer by layer from back to front, then return to step 2.
The recognition rate of the method is compared with that of the traditional convolutional neural network algorithm; a multi-dimensional weighting operation on spatial information is performed at the feature layer, compensating for the loss of spatial information caused by the one-dimensional vectorization of the fully connected layer.
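Steps 5 and 6 amount to a threshold-stopped training loop. Below is a minimal PyTorch sketch; the stand-in network, the error threshold, and the placeholder data are all hypothetical, and the patent's multi-dimensional spatial weighting of the feature layer is not reproduced here.

    import torch
    import torch.nn as nn

    # Hypothetical stand-in for the patent's CNN; the real architecture,
    # including the spatial weighting of the feature layer, is not fully
    # specified in this summary.
    model = nn.Sequential(
        nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
        nn.Flatten(),
        nn.Linear(8 * 25 * 13, 2),    # fully connected classification layer
    )
    criterion = nn.CrossEntropyLoss()  # error between output and label data
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)  # gradient descent

    error_threshold = 1e-3                      # assumed stopping threshold
    lofar_batch = torch.randn(4, 1, 25, 13)     # placeholder LoFAR images
    labels = torch.randint(0, 2, (4,))          # placeholder class labels

    for step in range(10_000):
        logits = model(lofar_batch)
        loss = criterion(logits, labels)        # step 5: compute the loss error
        if loss.item() < error_threshold:       # stop once below the threshold
            break
        optimizer.zero_grad()
        loss.backward()                         # step 6: backpropagate, adjusting
        optimizer.step()                        #         layers from back to front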

Description

Technical field
[0001] The invention relates to an underwater target feature extraction method.
Background technique
[0002] At present, there are two main approaches to underwater target feature extraction: time domain and frequency domain. Time-domain features are extracted from the waveform structure, which is reflected in the shape of the echo: the more distinct the difference between targets, the more distinct the difference in waveform structure. In addition, the receiving angle of the echo and the attitude of the target also strongly affect the time-domain waveform, and these differences likewise conceal the characteristics of the target; that is, the classification features of the target are extracted from the waveform structure. Frequency-domain features refer to the spectrum features obtained after signal processing: the target is identified by means of spectrum estimation, and the target feature pa...
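As an illustration of the frequency-domain approach mentioned above, the sketch below estimates a power spectrum with Welch's method (scipy.signal.welch). This is a generic spectrum-estimation example rather than the patented method; the sampling rate and the synthetic 300 Hz line component are assumptions.

    import numpy as np
    from scipy.signal import welch

    # Illustrative only: Welch's method is one standard spectrum-estimation
    # technique of the kind the background refers to.
    fs = 5000.0                                  # assumed sampling rate, Hz
    t = np.arange(0, 2.0, 1.0 / fs)
    rng = np.random.default_rng(1)
    # Synthetic "radiated noise": broadband noise plus a 300 Hz line component.
    x = rng.standard_normal(t.size) + 0.5 * np.sin(2 * np.pi * 300 * t)

    freqs, psd = welch(x, fs=fs, nperseg=1024)   # power spectral density estimate
    print(freqs[np.argmax(psd)])                 # strongest spectral line, ~300 Hz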


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/084; G06V10/454; G06N3/045; G06F18/217
Inventors: 王红滨, 何鸣, 宋奎勇, 周连科, 王念滨, 郎泽宇, 王瑛琦, 顾正浩, 李浩然, 迟熙翔
Owner: HARBIN ENG UNIV