
Remote sensing image scene classification method based on multi-scale depth feature fusion and transfer learning

A technology relating to remote sensing images and depth features, applied to neural learning methods, character and pattern recognition, instruments, etc. It addresses the problem that CNN models provide only a single type of classification feature, enhancing the distinguishability of features and reducing computation and storage requirements.

Active Publication Date: 2019-12-10
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

First, to imitate how the human visual system obtains remote sensing image features at different scales, the present invention uses the Gaussian pyramid algorithm to generate multi-scale remote sensing images and removes the last three fully connected layers of VGG16-Net to form a fully convolutional neural network; the multi-scale remote sensing images are used as the input of this fully convolutional network to obtain multi-scale local features. Secondly, to address the problem that a CNN model yields only a single type of classification feature, the present invention crops the dataset images to the fixed 224×224 size required by VGG16-Net and inputs them into the network to obtain global features; a compact bilinear pooling operation then fuses the previously obtained multi-scale depth local features with the global features obtained by the CNN to produce more distinguishing features. Finally, to address the shortage of labeled remote sensing image datasets, the present invention adopts a transfer learning strategy: knowledge is transferred from a labeled large-scale remote sensing image dataset by pre-training VGG16-Net, and the model parameters are transferred to the designed network for fine-tuning, realizing the classification of remote sensing image scenes.
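The fusion step above relies on compact bilinear pooling, which is commonly implemented with the Tensor Sketch algorithm: both feature vectors are projected by Count Sketch and combined by element-wise multiplication in the Fourier domain. The following is a minimal NumPy sketch of that algorithm; the feature dimensions, the output dimension `d`, and the signed square-root normalisation are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def count_sketch(x, h, s, d):
    """Count Sketch: each input dimension is hashed into one of d buckets
    with a random sign, and colliding dimensions are summed."""
    out = np.zeros(d)
    np.add.at(out, h, s * x)
    return out

def compact_bilinear(x, y, d=512, seed=0):
    """Tensor Sketch approximation of the bilinear (outer-product) feature:
    sketch both vectors, multiply in the Fourier domain, transform back."""
    rng = np.random.default_rng(seed)
    hx = rng.integers(0, d, size=x.size)
    sx = rng.choice([-1.0, 1.0], size=x.size)
    hy = rng.integers(0, d, size=y.size)
    sy = rng.choice([-1.0, 1.0], size=y.size)
    fx = np.fft.fft(count_sketch(x, hx, sx, d))
    fy = np.fft.fft(count_sketch(y, hy, sy, d))
    phi = np.fft.ifft(fx * fy).real
    # Signed square root and L2 normalisation, as is common after bilinear pooling.
    phi = np.sign(phi) * np.sqrt(np.abs(phi))
    return phi / (np.linalg.norm(phi) + 1e-12)
```

The fused vector has a fixed dimension `d` regardless of the two input dimensions, which is what keeps the bilinear representation compact.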



Examples


Embodiment Construction

[0019] According to the above description, a specific implementation process follows; however, the protection scope of this patent is not limited to this implementation process.

[0020] Step 1: Acquisition of multi-scale depth local features

[0021] Step 1.1: Generation of multi-scale remote sensing images

[0022] The present invention adopts the Gaussian pyramid algorithm, forming multi-scale remote sensing images through Gaussian kernel convolution and down-sampling, in which each upper-layer image is one quarter the size of the image in the layer below it. The resulting multi-scale images are input into VGG16-Net with its three fully connected layers removed to obtain the local features of the multi-scale images, so that the network can learn features of the same image at different scales, which is conducive to the correct classification of remote sensing image scenes.
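The pyramid construction in this step (Gaussian kernel convolution followed by 2× down-sampling, so each level has one quarter the area of the one below) can be sketched in NumPy as follows; the 5-tap kernel and the number of levels are illustrative assumptions, not values specified by the patent.

```python
import numpy as np

def gaussian_blur(img):
    """Blur a 2-D image with a separable 5-tap Gaussian kernel
    (the classic 1-4-6-4-1 pyramid kernel)."""
    kernel = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    pad = len(kernel) // 2
    padded = np.pad(img, pad, mode="edge")
    # Convolve rows, then columns (edge-padded so the size is preserved).
    rows = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="valid"), 0, rows)

def gaussian_pyramid(img, levels=3):
    """Build a Gaussian pyramid: each level is the previous level blurred
    and down-sampled by 2 in each dimension (one quarter the area)."""
    pyramid = [img]
    for _ in range(levels - 1):
        blurred = gaussian_blur(pyramid[-1])
        pyramid.append(blurred[::2, ::2])
    return pyramid
```

Each level of the pyramid would then be fed to the truncated VGG16-Net to extract local features at that scale.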

[0023] Let the remote sensing image data set be I = {I_1, I_2, …, I_K}, where K is the number...



Abstract

The invention provides a remote sensing image scene classification method based on multi-scale depth feature fusion and transfer learning, in order to solve the classification problem of remote sensing image scenes. The method comprises the following steps: firstly, obtaining multi-scale remote sensing images by using the Gaussian pyramid algorithm, inputting them into a fully convolutional neural network, and extracting multi-scale depth local features; cropping each image to the fixed size required by the CNN and obtaining global features from the fully connected layer of the network; using a compact bilinear pooling operation to encode the multi-scale depth local features and the global features obtained by the CNN, fusing the two depth features to jointly represent a remote sensing image, enhancing the mutual relation between the features and making the obtained features more distinctive; and finally, classifying the remote sensing image scenes by utilizing transfer learning and combining the two kinds of features. VGG16-Net is used as the basic convolutional neural network.

Description

technical field [0001] Aiming at the classification problem of remote sensing image scenes, the present invention proposes a remote sensing image scene classification method based on multi-scale depth feature fusion and transfer learning. The present invention first uses the Gaussian pyramid algorithm to obtain multi-scale remote sensing images, inputs them into a fully convolutional neural network, and extracts multi-scale depth local features; it then crops each image to the fixed size required by the CNN and inputs it into the network to obtain global features from the fully connected layer. A compact bilinear pooling operation encodes the multi-scale depth local features and the global features obtained by the CNN; fusing the two depth features to jointly represent the remote sensing image enhances the relationship between the features and makes the obtained features more distinguishable. Finally, transfer learning technology is used to classify the remote sensing image scene by combining the above two met...
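The transfer learning strategy described here (pre-train on a large labeled dataset, transfer the parameters, then fine-tune only part of the network) can be illustrated with a minimal NumPy sketch. The dictionary-of-arrays representation, the layer names, and the placeholder gradient are all illustrative assumptions; a real implementation would use a deep learning framework.

```python
import numpy as np

def transfer_and_finetune(pretrained, target,
                          frozen_prefixes=("conv1", "conv2"), lr=0.01):
    """Copy pretrained parameters into the target network, then apply one
    mock gradient step that updates only the non-frozen layers."""
    # Knowledge transfer: initialise the target with the pretrained weights.
    for name, weights in pretrained.items():
        if name in target:
            target[name] = weights.copy()
    # Fine-tuning: early (frozen) layers keep the transferred weights,
    # later layers are updated by gradient descent.
    for name in target:
        if not name.startswith(frozen_prefixes):
            grad = np.ones_like(target[name])  # placeholder gradient
            target[name] = target[name] - lr * grad
    return target
```

In the patent's setting, the pretrained weights would come from VGG16-Net trained on a large labeled remote sensing dataset, and fine-tuning would adapt the designed network to the target scene classes.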

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06K9/46, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06V10/464, G06N3/045, G06F18/253, G06F18/214
Inventor: 张菁, 赵晓蕾, 卓力, 田吉淼
Owner: BEIJING UNIV OF TECH