
Remote sensing scene classification method and device, terminal equipment and storage medium

A scene classification and remote sensing technology, applied in the field of remote sensing images, that can solve the problem of shallow convolutional feature loss

Pending Publication Date: 2020-10-30
SOUTH CENTRAL UNIVERSITY FOR NATIONALITIES

AI Technical Summary

Problems solved by technology

[0003] The main purpose of the present invention is to provide a remote sensing scene classification method, device, terminal equipment, and storage medium, aiming to solve the technical problem of how to reduce the loss of shallow convolutional feature information during remote sensing scene classification.



Examples


Example 1

[0098] Refer to Figure 4, which is a schematic flowchart of the third embodiment of the remote sensing scene classification method of the present invention. Based on the first embodiment above, in this embodiment step S50 specifically includes:

[0099] Step S51: Perform feature merging of the target convolutional features and the global features output by the preset convolutional neural network model to obtain target classification features.

[0100] It is easy to understand that the remote sensing image set is input into the preset convolutional neural network model to obtain the output global feature. The global feature is then merged with the target convolutional feature, i.e., the global feature is compensated by the target convolutional feature.
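To make the merge in step S51 concrete, the following is a minimal sketch assuming a PyTorch implementation; the module name, the tensor dimensions, and the concatenate-then-project fusion rule are illustrative assumptions rather than the patented design.

```python
import torch
import torch.nn as nn

class FeatureMerge(nn.Module):
    """Merges the target convolutional feature with the global feature output by
    the backbone, so the global feature is compensated by the aggregated cues
    (assumed fusion rule: channel-wise concatenation followed by a projection)."""
    def __init__(self, target_dim: int, global_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(target_dim + global_dim, out_dim)

    def forward(self, target_feat: torch.Tensor, global_feat: torch.Tensor) -> torch.Tensor:
        # target_feat: (B, target_dim), global_feat: (B, global_dim)
        merged = torch.cat([target_feat, global_feat], dim=1)
        return self.proj(merged)  # target classification feature, shape (B, out_dim)
```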

[0101] Step S52: Obtain the feature vector of the target classification feature, and obtain the number of target categories according to the feature vector.

[0102] In...
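A minimal sketch of the readout in step S52, again assuming PyTorch; the linear classifier, the softmax, and the example dimensions are common-practice assumptions, not necessarily the patent's exact formulation.

```python
import torch
import torch.nn as nn

def classify_scenes(target_classification_feature: torch.Tensor,
                    classifier: nn.Linear) -> torch.Tensor:
    """Maps the target classification feature vector to per-category scores and
    returns the predicted remote sensing scene category for each image."""
    logits = classifier(target_classification_feature)  # (B, num_categories)
    probs = torch.softmax(logits, dim=1)                 # category probabilities
    return probs.argmax(dim=1)                           # predicted category index

# Hypothetical usage: a 512-d classification feature and 30 scene categories
# classifier = nn.Linear(512, 30)
# predictions = classify_scenes(features, classifier)
```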



Abstract

The invention belongs to the technical field of remote sensing images and discloses a remote sensing scene classification method and device, terminal equipment, and a storage medium. The method comprises the steps of obtaining a remote sensing scene image set and inputting it into a preset convolutional neural network model for feature extraction to obtain a top semantic feature set and a shallow appearance feature set; performing feature aggregation on the top semantic feature set through dense connection to obtain a first convolution feature; performing feature aggregation on the shallow appearance feature set to obtain a second convolution feature; performing feature compensation on the first convolution feature and the second convolution feature through a bidirectional gating connection to obtain a target convolution feature; and classifying the remote sensing scene images in the remote sensing scene image set according to the target convolution feature. By utilizing feature aggregation and complementing the shallow appearance features with the top semantic features, the loss of shallow convolution feature information in the classification feature aggregation stage is prevented.
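The key step named in the abstract is the feature compensation between the two aggregated feature sets via a bidirectional gating connection. The sketch below, assuming PyTorch, illustrates one plausible form of such a gate; the class name, dimensions, and the exact gating formula are assumptions, since the text visible here does not give the equations.

```python
import torch
import torch.nn as nn

class BidirectionalGatedFusion(nn.Module):
    """Feature compensation between the aggregated top semantic feature (f_top)
    and the aggregated shallow appearance feature (f_shallow) via two gates."""
    def __init__(self, dim: int):
        super().__init__()
        self.gate_top_to_shallow = nn.Linear(dim, dim)
        self.gate_shallow_to_top = nn.Linear(dim, dim)

    def forward(self, f_top: torch.Tensor, f_shallow: torch.Tensor) -> torch.Tensor:
        # each feature is re-weighted by a sigmoid gate driven by the other,
        # so shallow appearance cues compensate the top semantic cues and vice versa
        top_comp = f_top + torch.sigmoid(self.gate_shallow_to_top(f_shallow)) * f_shallow
        shallow_comp = f_shallow + torch.sigmoid(self.gate_top_to_shallow(f_top)) * f_top
        # concatenate both compensated features as the target convolution feature
        return torch.cat([top_comp, shallow_comp], dim=1)
```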

Description

Technical Field

[0001] The present invention relates to the technical field of remote sensing images, and in particular to a remote sensing scene classification method, device, terminal equipment, and storage medium.

Background Technique

[0002] Since RS (Remote Sensing) scenes contain complex and varied types of land cover, classifying RS scenes is a difficult task. Early approaches focused on designing various hand-engineered features using extensive engineering skill and domain expertise, for example color, texture, shape, spatial or spectral information, or combinations thereof. In RS scene classification it is often the case that different scenes are better distinguished by spectrum, shape, or texture. At present, most works use fine-tuned pre-trained CNNs (Convolutional Neural Networks) and aggregate convolutional features of different layers to classify complex remote sensing scenes. When aggregating the convolutional features of different...
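As context for the background discussion of aggregating convolutional features from different layers, the snippet below shows one common way to pull shallow and top-layer feature maps from a fine-tuned pre-trained CNN; the ResNet-18 backbone and the chosen layers are assumptions, as the text does not name a specific network here.

```python
import torch
import torchvision.models as models
from torchvision.models.feature_extraction import create_feature_extractor

# pre-trained backbone as a stand-in; any fine-tuned CNN could be used
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# layer1 ~ shallow appearance features, layer4 ~ top semantic features
extractor = create_feature_extractor(
    backbone, return_nodes={"layer1": "shallow", "layer4": "top"}
)

images = torch.randn(4, 3, 224, 224)  # a dummy batch of RS scene images
feats = extractor(images)
print(feats["shallow"].shape, feats["top"].shape)  # (4, 64, 56, 56), (4, 512, 7, 7)
```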


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04
CPC: G06V20/13, G06N3/045, G06F18/241, G06F18/214
Inventors: 宋中山, 梁家锐, 郑禄, 帖军, 刘振宇, 汪红, 周珊
Owner SOUTH CENTRAL UNIVERSITY FOR NATIONALITIES