
Land cover classification method based on deep fusion of multi-modal remote sensing data

A technology for land cover classification from multi-modal remote sensing data, applied in the field of remote sensing. It addresses the limited exploration of deep feature fusion in existing methods and offers broad application prospects.

Pending Publication Date: 2021-10-01
上海中科辰新卫星技术有限公司
Cites: 0 | Cited by: 17

AI Technical Summary

Problems solved by technology

However, none of the existing methods has explored deep feature fusion in depth. For example, the densely connected transformer network (DCST) uses the transformer as a feature extractor, attempting to replace the encoder structure of a classic convolutional network; the U-shaped transformer network (TransUNet) treats the transformer as a high-dimensional feature extraction module that further processes the features extracted by a classic encoder; and the pure-attention U-shaped transformer network (Swin-Unet) follows the U-shaped architecture and uses transformers throughout the network for feature extraction and class prediction.
All of these methods regard the transformer as a feature extraction structure and do not apply it to feature fusion.
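The contrast drawn here — transformer as feature extractor versus transformer as fusion mechanism — can be illustrated with a minimal cross-attention sketch. This is not the patent's actual module; all shapes, weights, and the function name `cross_attention_fuse` are hypothetical. The key idea is that queries come from one modality while keys and values come from another, so attention redistributes information across modalities rather than within one:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention_fuse(feat_a, feat_b, w_q, w_k, w_v):
    """Fuse modality B into modality A: queries from A, keys/values from B,
    so each token of A gathers complementary information from B."""
    q = feat_a @ w_q                                # (n_a, d)
    k = feat_b @ w_k                                # (n_b, d)
    v = feat_b @ w_v                                # (n_b, d)
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1]))  # (n_a, n_b) cross-modal weights
    return feat_a + attn @ v                        # residual keeps A's own content

# Toy usage: 64 spatial tokens per modality, 16 channels.
rng = np.random.default_rng(0)
d = 16
tokens_rgb = rng.standard_normal((64, d))           # e.g. RGB features
tokens_dsm = rng.standard_normal((64, d))           # e.g. DSM features
w_q, w_k, w_v = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
fused = cross_attention_fuse(tokens_rgb, tokens_dsm, w_q, w_k, w_v)  # (64, 16)
```

A pure feature-extraction use of the transformer, by contrast, would compute attention with queries, keys, and values all drawn from the same modality's tokens.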

Method used




Detailed Description of the Embodiments

[0052] The present invention will be further described below in combination with specific embodiments. It should be understood that these examples are only used to illustrate the present invention and are not intended to limit the scope of the present invention. In addition, it should be understood that after reading the content taught by the present invention, those skilled in the art may make various changes or modifications to the present invention, and these equivalent forms also fall within the scope defined by the appended claims of the present application.

[0053] A land cover classification method based on deep fusion of multimodal remote sensing data, the steps are as follows:

[0054] (1) Build a model;

[0055] The model is a remote sensing image semantic segmentation network based on multimodal information fusion; its structure is shown in Figure 1. It includes an encoder for extracting ground-object features, a deep feature fusion module, a spatial pyramid module, and an up-sampling decoder.
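As a rough sketch of how these four stages could be chained, the following NumPy stand-ins trace the tensor shapes end to end. Every function here (`encode`, `fuse`, `spatial_pyramid`, `decode`) and every dimension is an illustrative placeholder, not the patent's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

H = W = 64        # input tile size (hypothetical)
C_FEAT = 32       # feature channels per modality (hypothetical)
N_CLASSES = 6     # number of land-cover classes (hypothetical)

def encode(img, out_ch=C_FEAT):
    """Stand-in encoder: 4x spatial downsampling plus a channel projection."""
    h, w, c = img.shape
    pooled = img.reshape(h // 4, 4, w // 4, 4, c).mean(axis=(1, 3))
    proj = rng.standard_normal((c, out_ch)) / np.sqrt(c)
    return pooled @ proj                            # (H/4, W/4, C_FEAT)

def fuse(feats):
    """Stand-in for the deep feature fusion module: concatenate and project."""
    cat = np.concatenate(feats, axis=-1)
    proj = rng.standard_normal((cat.shape[-1], C_FEAT)) / np.sqrt(cat.shape[-1])
    return cat @ proj

def spatial_pyramid(feat):
    """Stand-in spatial pyramid: mix globally pooled context back in."""
    return feat + feat.mean(axis=(0, 1), keepdims=True)

def decode(feat):
    """Stand-in decoder: 4x nearest-neighbour upsampling + per-pixel classifier."""
    up = feat.repeat(4, axis=0).repeat(4, axis=1)
    head = rng.standard_normal((feat.shape[-1], N_CLASSES))
    return up @ head                                # (H, W, N_CLASSES) logits

rgb  = rng.random((H, W, 3))    # optical bands
dsm  = rng.random((H, W, 1))    # elevation
ndvi = rng.random((H, W, 1))    # vegetation index

logits = decode(spatial_pyramid(fuse([encode(rgb), encode(dsm), encode(ndvi)])))
pred = logits.argmax(axis=-1)   # (H, W) per-pixel class map
```

The point of the sketch is the data flow: three per-modality encoders feed one fusion stage, context is added at low resolution, and the decoder restores the input resolution before per-pixel classification.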



Abstract

The invention relates to a land cover classification method based on deep fusion of multi-modal remote sensing data. The method comprises the following steps: (1) constructing a remote sensing image semantic segmentation network based on multi-modal information fusion, wherein the network comprises an encoder for extracting ground-object features, a deep feature fusion module, a spatial pyramid module, and an up-sampling decoder; the deep feature fusion module comprises an ACF3 module and a CACF3 module, which simultaneously fuse the three modal inputs RGB, DSM, and NDVI; the ACF3 module is a self-attention convolution fusion module based on transformer and convolution, and the CACF3 module is a cross-modal convolution fusion module based on transformer and convolution; (2) training the network constructed in step (1); (3) using the network model trained in step (2) to predict the ground-object categories of remote sensing images. Compared with conventional methods, this land cover classification method based on deep fusion of multi-modal remote sensing data significantly improves accuracy on surface classification tasks and has broad application prospects.
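The division of labour between the two fusion modules — self-attention within each modality (ACF3), then cross-modal attention across modalities (CACF3) — can be sketched as follows. This is a NumPy toy with random weights; `acf3_like` and `cacf3_like` are illustrative stand-ins, and the real modules also contain convolution branches that this sketch reduces to a residual connection:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(q_src, kv_src, seed=0):
    """Single-head scaled dot-product attention with toy random weights."""
    rng = np.random.default_rng(seed)
    d = q_src.shape[-1]
    w_q = rng.standard_normal((d, d)) / np.sqrt(d)
    w_k = rng.standard_normal((kv_src.shape[-1], d)) / np.sqrt(d)
    w_v = rng.standard_normal((kv_src.shape[-1], d)) / np.sqrt(d)
    q, k, v = q_src @ w_q, kv_src @ w_k, kv_src @ w_v
    return softmax(q @ k.T / np.sqrt(d)) @ v        # (n_q, d)

def acf3_like(t, seed):
    """ACF3-style step: self-attention within one modality; the residual
    connection stands in for the module's convolution branch."""
    return t + attend(t, t, seed=seed)

def cacf3_like(t_a, t_b, t_c):
    """CACF3-style step: queries from one modality, keys/values from the
    other two modalities concatenated along the token axis."""
    others = np.concatenate([t_b, t_c], axis=0)
    return t_a + attend(t_a, others, seed=7)

# Toy usage: 64 tokens (an 8x8 feature map) per modality, 16 channels.
n, d = 64, 16
rng = np.random.default_rng(1)
rgb, dsm, ndvi = (rng.standard_normal((n, d)) for _ in range(3))
rgb, dsm, ndvi = acf3_like(rgb, 1), acf3_like(dsm, 2), acf3_like(ndvi, 3)
fused = cacf3_like(rgb, dsm, ndvi)                  # (64, 16)
```

The ordering matters: each modality first refines its own representation, and only then does cross-modal attention let RGB tokens draw on DSM and NDVI tokens.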

Description

Technical Field

[0001] The invention belongs to the technical field of remote sensing and relates to a land cover classification method based on deep fusion of multimodal remote sensing data.

Background

[0002] The classification of surface objects (ground objects) is an important basis for remote sensing image analysis. Nowadays, continuous observation of the surface by multiple sensors has produced multi-scale, multi-temporal, multi-azimuth, and multi-level remote sensing images, which provide richer data for the accurate description of ground objects. Since they are essentially observations of the same ground objects, different modalities, despite certain gaps between them, still carry complementary information across multi-source remote sensing images. Therefore, classifying ground objects with multiple remote sensing information sources can achieve higher accuracy than sing...

Claims


Application Information

IPC(8): G06K9/00; G06K9/34; G06K9/62; G06N3/04
CPC: G06N3/045; G06F18/253
Inventor: 曹金文枚金杨庆楠苏含坤
Owner: 上海中科辰新卫星技术有限公司