
Interpretable CNN image classification model-based optical remote sensing image classification method

A classification model and optical remote sensing technology, applied in the field of image processing, which can solve the problems of low training efficiency and low image classification accuracy, and achieve the effects of enhanced interpretability, improved classification accuracy, and reduced time consumption

Pending Publication Date: 2020-06-26
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0004] The technical problem to be solved by the present invention is to provide an optical remote sensing image classification method based on an interpretable CNN image classification model, so as to solve the problems of low training efficiency and low image classification accuracy in the prior art.

Method used



Examples


Embodiment Construction

[0028] The invention provides a method for classifying optical remote sensing images based on an interpretable CNN image classification model: an interpretable CNN network is built and training and test samples are selected; the convolutional neural network ResNet model is retrained; an interpretability test is performed; and the trained model is finally evaluated on the test set. The invention can quickly reach the required recognition rate, reduces the time consumed by network training, improves the accuracy of remote sensing image classification, and enhances the interpretability of the neural network model.
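As a rough illustration of this workflow, the sketch below retrains a standard ResNet backbone on selected training samples and evaluates it on a held-out test set. The data loaders, class count, and hyperparameters are assumptions made for illustration, and the patent's specific interpretable modification and interpretability test are not reproduced here.

```python
import torch
from torch import nn, optim
from torchvision import models

# Hypothetical sketch of the described pipeline: build the network, retrain it
# on the selected training samples, then test on the held-out test set.
# `train_loader`, `test_loader`, the 10-class setup and the hyperparameters
# are assumptions, not values taken from the patent.

def build_network(num_classes: int = 10) -> nn.Module:
    net = models.resnet50(weights=None)              # ResNet backbone as a placeholder
    net.fc = nn.Linear(net.fc.in_features, num_classes)
    return net

def retrain(model, train_loader, epochs=10, lr=1e-3, device="cpu"):
    model.to(device).train()
    opt = optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            opt.zero_grad()
            loss_fn(model(images), labels).backward()
            opt.step()
    return model

@torch.no_grad()
def evaluate(model, test_loader, device="cpu"):
    model.to(device).eval()
    correct = total = 0
    for images, labels in test_loader:
        preds = model(images.to(device)).argmax(dim=1).cpu()
        correct += (preds == labels).sum().item()
        total += labels.numel()
    return correct / total                           # classification accuracy on the test set
```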

[0029] Referring to figure 1, the present invention provides an optical remote sensing image classification method based on an interpretable CNN image classification model. When a general convolutional neural network classifies optical remote sensing images, the features obtained by downsampling lose a lot o...
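The spatial detail discarded by downsampling can be seen directly from the feature-map sizes of a standard ResNet. The short check below uses torchvision's stock resnet50 as a stand-in for a "general convolutional neural network" (not the patent's modified network) to show a 224x224 input reduced to a 7x7 feature map.

```python
import torch
from torchvision.models import resnet50

# Stock ResNet-50 as a stand-in for a general CNN: its stride-32 downsampling
# turns a 224x224 patch into a 7x7 feature map, discarding most of the fine
# spatial detail of small ground objects.
backbone = resnet50(weights=None)
features = torch.nn.Sequential(*list(backbone.children())[:-2])  # drop avgpool + fc

x = torch.randn(1, 3, 224, 224)      # one RGB remote sensing patch
with torch.no_grad():
    print(features(x).shape)         # torch.Size([1, 2048, 7, 7])
```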



Abstract

The invention discloses an optical remote sensing image classification method based on an interpretable CNN image classification model. An interpretable CNN ResNet model is built, composed of six groups containing 51 basic convolutional layers in total, three fully connected layers and one Softmax layer. The method comprises the following steps: performing down-sampling with ResNet to obtain a feature map containing context information; then applying an interpretable modification to the ResNet model to obtain a new interpretable CNN network based on ResNet. The ResNet model extracts features through multiple groups of convolution-pooling layers with residual modules and finally feeds the features into the fully connected layers to classify the images. The method can enhance the interpretability of an existing deep learning model and further improve its performance.
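A minimal sketch of the classifier head described above, assuming a torchvision ResNet-50 backbone standing in for the convolutional groups with residual modules: the hidden widths (512, 256), the 10-class output, and the omission of the interpretable modification are all assumptions made for illustration, not details taken from the patent.

```python
import torch
from torch import nn
from torchvision.models import resnet50

# Sketch of the described architecture: residual convolution-pooling groups
# followed by three fully connected layers and a Softmax.  Hidden widths and
# class count are assumptions; the interpretable modification of the backbone
# is not reproduced here.
class RemoteSensingClassifier(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        backbone = resnet50(weights=None)
        self.features = nn.Sequential(*list(backbone.children())[:-1])  # keep conv/pool stages
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(2048, 512), nn.ReLU(inplace=True),
            nn.Linear(512, 256), nn.ReLU(inplace=True),
            nn.Linear(256, num_classes),
            nn.Softmax(dim=1),                       # class probabilities
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = RemoteSensingClassifier()
print(model(torch.randn(2, 3, 224, 224)).shape)      # torch.Size([2, 10])
```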

Description

Technical field

[0001] The invention belongs to the technical field of image processing, and in particular relates to an optical remote sensing image classification method based on an interpretable CNN (Convolutional Neural Networks) image classification model, which can be used for the classification of ground objects such as planes and ships.

Background technique

[0002] Foreign top Internet companies such as Google and Microsoft, and domestic companies such as Baidu, Tencent and Alibaba, have all increased their investment in the artificial intelligence industry, and its practical application has a subtle influence on people's way of life. Within artificial intelligence research, deep learning is a major focus and can be applied in many fields, such as speech signal processing, computer vision and natural language processing. CNNs (Convolutional Neural Networks) can ...

Claims


Application Information

IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/084; G06V20/13; G06N3/047; G06N3/045; G06F18/241; G06F18/2415
Inventor: 庞焱萌, 侯彪, 焦李成, 马文萍, 马晶晶, 杨淑媛
Owner: XIDIAN UNIV