
Feature fusion method based on a stacked autoencoder

A stacked-autoencoder feature fusion technology, applied in the field of SAE-based feature fusion. It addresses the problems of existing methods, namely insignificant improvement in feature discrimination, complex fusion network structure, and inability to meet real-time requirements, and thereby simplifies the SAE structure, reduces network training and testing time, and improves fusion efficiency.

Active Publication Date: 2017-06-20
NAT UNIV OF DEFENSE TECH


Problems solved by technology

[0004] Existing SAE-based feature fusion algorithms (see Chen Y., Lin Z., Zhao X., et al. Deep Learning-Based Classification of Hyperspectral Data [J]. IEEE Journal of Selected Topics in Applied Earth Observations & Remote Sensing, 2014, 7(6): 2094-2107) select high-dimensional features, have a complex fusion network structure, and require long training time, so they cannot meet real-time requirements. In addition, the redundancy between the selected features is large and their complementarity is small, so feature discrimination is not significantly improved after fusion.

Method used




Embodiment Construction

[0016] The experimental data of the present invention is the MSTAR data set, which contains SAR (synthetic aperture radar) image slices of 10 types of military targets: BMP2, BRDM2, BTR60, BTR70, D7, T62, T72, ZIL131, ZSU234 and 2S1. Figure 1 gives an example slice of each of the 10 target types; all slices are uniformly cropped to 128×128 pixels.

[0017] Figure 2 is the flow chart of the present invention. In conjunction with one experiment of the present invention, the concrete implementation steps are as follows:

[0018] The first step is to extract the TPLBP (Three-Patch Local Binary Patterns) texture feature of the image. The LBP (Local Binary Patterns) operator is first applied to obtain the LBP code value of each pixel of the original image; the TPLBP code value is then obtained by comparing LBP values between image patches, and the histogram vector is obtained by counting the TPLBP code values.
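The LBP stage of this step can be sketched as follows. This is a minimal NumPy illustration; the patch-comparison TPLBP stage is omitted, and all function names and the neighbour ordering are my own simplification, not the patent's implementation:

```python
import numpy as np

def lbp_code(img, r=1):
    """Basic 8-neighbour LBP operator: for each interior pixel, set one bit
    per neighbour that is >= the centre pixel, giving an 8-bit code."""
    h, w = img.shape
    codes = np.zeros((h - 2 * r, w - 2 * r), dtype=np.int32)
    # Offsets of the 8 neighbours, ordered clockwise from the top-left.
    offsets = [(-r, -r), (-r, 0), (-r, r), (0, r),
               (r, r), (r, 0), (r, -r), (0, -r)]
    center = img[r:h - r, r:w - r]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[r + dy:h - r + dy, r + dx:w - r + dx]
        codes += (neighbour >= center).astype(np.int32) << bit
    return codes

def lbp_histogram(codes, bins=256):
    """Histogram of the code values, normalised to sum to 1; this normalised
    histogram is the texture feature vector."""
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / hist.sum()
```

The TPLBP step of the patent would repeat the same compare-and-encode idea on patches of these LBP codes rather than on single pixels.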



Abstract

The invention provides a feature fusion method based on a stacked autoencoder (SAE). The technical scheme comprises the following steps: first, extract the three-patch local binary pattern (TPLBP) texture feature of the image, select and extract several types of baseline features of the image through a feature selection method, and concatenate all obtained features into a series vector; second, standardize and whiten the series vector; then take the whitened result as the input of the SAE and train the SAE with a layer-by-layer greedy training method; finally, fine-tune the trained SAE with a softmax classifier so that the loss function is minimized. The output of the SAE is a fusion feature vector with high discrimination. The selected features have small redundancy and provide more information for feature fusion.
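The middle stages of this pipeline (standardize and whiten the series vector, then greedy layer-by-layer SAE pre-training) can be sketched as below. This is a hedged illustration: the patent does not specify layer sizes, activation functions, or the optimizer, so the sigmoid units, tied weights, and plain gradient descent here are my own assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def whiten(X, eps=1e-5):
    """Standardize each column, then PCA-whiten so feature covariances
    are (approximately) the identity."""
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + eps)
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    return X @ vecs / np.sqrt(vals + eps)

class Autoencoder:
    """One tied-weight sigmoid autoencoder layer trained by batch
    gradient descent on squared reconstruction error."""
    def __init__(self, n_in, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b = np.zeros(n_hidden)   # encoder bias
        self.c = np.zeros(n_in)       # decoder bias
        self.lr = lr

    @staticmethod
    def _sig(z):
        return 1.0 / (1.0 + np.exp(-z))

    def encode(self, X):
        return self._sig(X @ self.W + self.b)

    def train(self, X, epochs=50):
        for _ in range(epochs):
            H = self.encode(X)                    # hidden codes
            R = self._sig(H @ self.W.T + self.c)  # reconstruction
            dR = (R - X) * R * (1 - R)            # output-layer delta
            dH = (dR @ self.W) * H * (1 - H)      # hidden-layer delta
            # W is used by both encoder and decoder (tied weights),
            # so its gradient has two terms.
            self.W -= self.lr * (X.T @ dH + dR.T @ H) / len(X)
            self.b -= self.lr * dH.mean(axis=0)
            self.c -= self.lr * dR.mean(axis=0)
        return self.encode(X)

def greedy_pretrain(X, layer_sizes):
    """Layer-by-layer greedy training: each autoencoder learns to
    reconstruct the previous layer's codes; the stack of trained
    encoders forms the SAE, and the last codes are the fused feature."""
    layers, H = [], X
    for n_hidden in layer_sizes:
        ae = Autoencoder(H.shape[1], n_hidden)
        H = ae.train(H)
        layers.append(ae)
    return layers, H
```

A softmax classifier stacked on the last encoder would then fine-tune all layers jointly to minimize the loss function, which is omitted here for brevity.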

Description

Technical Field

[0001] The invention belongs to the technical field of image fusion and relates to a feature fusion method based on SAE (Stacked Autoencoder) that improves the discrimination and efficiency of fusion features.

Background

[0002] Feature fusion refers to the comprehensive analysis and fusion processing of extracted feature information. In image understanding, feature fusion can not only increase the feature information of the image but also effectively integrate the advantages of the original features, yielding a more comprehensive feature representation of the target. Classical feature fusion algorithms (see Wang Dawei, Chen Dingrong, He Yizheng. A review of multi-feature image fusion technology for target recognition [J]. Avionics Technology, 2011, 42(2): 6-12) directly combine features in a certain way and do not essentially consider the influence of the r...
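The classical serial fusion mentioned above amounts to concatenating the per-image feature vectors end to end. A minimal sketch (the function name is mine, not from the cited review):

```python
import numpy as np

def serial_fusion(features):
    """Classical serial feature fusion: flatten each feature and
    concatenate them into one long vector. Redundancy between the
    features is ignored, which is the limitation the SAE-based
    method targets."""
    return np.concatenate(
        [np.asarray(f, dtype=float).ravel() for f in features]
    )
```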

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62
CPC: G06F18/253
Inventors: 计科峰, 康妙, 冷祥光, 邹焕新, 雷琳, 孙浩, 李智勇, 周石琳
Owner: NAT UNIV OF DEFENSE TECH