
Semantic segmentation network based on optical and PolSAR feature fusion

A semantic segmentation and feature fusion technology, applied to biological neural network models, neural learning methods, and character and pattern recognition. It solves the problem that multi-modal remote sensing images cannot be effectively fused and applied, improving segmentation results and the capture and restoration of spatio-temporal information.

Active Publication Date: 2022-04-22
NO 54 INST OF CHINA ELECTRONICS SCI & TECH GRP

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to overcome the problem that current multi-modal remote sensing images cannot be effectively fused and applied. It proposes ASCAFNet, a semantic segmentation network based on the fusion of optical and PolSAR features, which exploits the complementary information in PolSAR and optical images to improve the accuracy and reliability of ground-object segmentation.




Embodiment Construction

[0041] Specific embodiments of the present invention are described below in conjunction with the accompanying drawings, so that those skilled in the art can better understand the present invention. Note that detailed descriptions of known functions and designs are omitted where they would obscure the main content of the present invention.

[0042] The present invention designs a new deep learning model, ASCAFNet (Atrous Spatial Channel Attention Fusion Network), to realize end-to-end optical and PolSAR fusion ground-object segmentation. ASCAFNet consists of three parts: a two-way twin convolutional feature encoder, an attention mechanism module ASCAM (Atrous Spatial Channel Attention Module), and a symmetric skip-connection decoder. We first design the two-way twin convolutional feature encoder and pre-train each encoder branch with ImageNet and a large number of annotated PolSAR and optical images to maximize t...
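As an illustration of the two-branch encoder described above, here is a minimal, hypothetical PyTorch sketch. The stage count, channel widths, and input channel numbers are assumptions for illustration, not values taken from the patent: two encoders with identical topology but separate weights process the optical and PolSAR inputs, and the per-stage features are retained for the decoder's skip connections.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with BatchNorm and ReLU (hypothetical encoder stage)."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class TwinEncoder(nn.Module):
    """Two parallel encoders with identical topology but separate weights:
    one branch for the optical input, one for the PolSAR input."""
    def __init__(self, opt_ch=3, sar_ch=2, widths=(32, 64, 128)):
        super().__init__()
        def make_branch(in_ch):
            stages, prev = [], in_ch
            for w in widths:
                stages.append(conv_block(prev, w))
                prev = w
            return nn.ModuleList(stages)
        self.opt_branch = make_branch(opt_ch)
        self.sar_branch = make_branch(sar_ch)
        self.pool = nn.MaxPool2d(2)

    def forward(self, opt, sar):
        opt_feats, sar_feats = [], []
        for o_stage, s_stage in zip(self.opt_branch, self.sar_branch):
            opt, sar = o_stage(opt), s_stage(sar)
            opt_feats.append(opt)   # kept for the decoder's skip connections
            sar_feats.append(sar)
            opt, sar = self.pool(opt), self.pool(sar)
        return opt_feats, sar_feats

enc = TwinEncoder()
opt = torch.randn(1, 3, 64, 64)   # optical patch (RGB assumed)
sar = torch.randn(1, 2, 64, 64)   # PolSAR channels (two polarizations assumed)
opt_feats, sar_feats = enc(opt, sar)
print([tuple(f.shape) for f in opt_feats])
```

In practice, each branch would be pre-trained separately (e.g. the optical branch on ImageNet, the PolSAR branch on labeled PolSAR data) before joint fine-tuning, matching the pre-training step described above.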



Abstract

The invention belongs to the technical field of intelligent application of remote sensing images. It designs a semantic segmentation network based on optical and PolSAR feature fusion to realize an end-to-end optical and PolSAR fusion ground-object segmentation task. The network is composed of a two-way twin convolutional feature encoder, an attention mechanism module ASCAM, and a symmetric skip-connection decoder. The two-way twin convolutional feature encoder uses ImageNet and a large number of labeled PolSAR and optical images to pre-train each encoder branch. The ASCAM then obtains, through an atrous spatial matrix, the nonlinear influence of the optical and PolSAR channels at each local position on the classification task, and combines this matrix with the output of the two-way twin convolutional feature encoder to realize attention-weighted guidance of the feature fusion process. Finally, the weighted features are fused by convolution, and a symmetric skip-connection decoder connects the fused features with the optical and PolSAR encoding-stage features via skip connections to realize end-to-end ground-object segmentation.
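The attention-weighted fusion step in the abstract can be sketched as follows. This is a hypothetical reconstruction, not the patented ASCAM: parallel atrous (dilated) convolutions stand in for the "atrous spatial matrix", a squeeze-and-excitation-style branch supplies the per-channel weights, and a 1x1 convolution fuses the reweighted concatenation of optical and PolSAR features.

```python
import torch
import torch.nn as nn

class ASCAM(nn.Module):
    """Hypothetical sketch of atrous spatial-channel attention fusion.
    Parallel dilated convolutions build a multi-scale spatial attention map;
    a squeeze-and-excitation-style branch scores each channel. Both maps
    reweight the concatenated optical/PolSAR features before a 1x1 fusion."""
    def __init__(self, opt_ch, sar_ch, out_ch, rates=(1, 2, 4)):
        super().__init__()
        c = opt_ch + sar_ch
        # spatial attention from stacked atrous convolutions (assumed rates)
        self.atrous = nn.ModuleList(
            nn.Conv2d(c, 1, 3, padding=r, dilation=r) for r in rates)
        # channel attention (squeeze-and-excitation style, reduction 4 assumed)
        self.channel = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(c, c // 4, 1), nn.ReLU(inplace=True),
            nn.Conv2d(c // 4, c, 1), nn.Sigmoid())
        self.fuse = nn.Conv2d(c, out_ch, 1)

    def forward(self, opt_feat, sar_feat):
        x = torch.cat([opt_feat, sar_feat], dim=1)
        spatial = torch.sigmoid(sum(a(x) for a in self.atrous))  # B x 1 x H x W
        x = x * spatial * self.channel(x)  # attention-weighted guidance
        return self.fuse(x)               # fusion of weighted features

m = ASCAM(opt_ch=128, sar_ch=128, out_ch=128)
y = m(torch.randn(1, 128, 16, 16), torch.randn(1, 128, 16, 16))
print(y.shape)  # torch.Size([1, 128, 16, 16])
```

The fused output would then be upsampled by the decoder and concatenated with the encoder-stage features of both branches via skip connections, as the abstract describes.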

Description

technical field

[0001] The invention belongs to the technical field of intelligent application of remote sensing images, and more specifically relates to the automatic semantic segmentation of ground-object categories through the fusion of optical and PolSAR remote sensing images.

Background technique

[0002] Land-cover segmentation and classification based on remote sensing imagery is a challenging problem because of the diversity of scales and materials involved. At present, many ground-object segmentation tasks rely on data from a single payload. For example, many studies use visible-light remote sensing images, but limited spectral information and cloud cover degrade the results. Other studies use PolSAR images, but poor image quality and high noise likewise lead to poor classification results. The complexity of many area types such as urban areas necessitates the use of ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06V10/26, G06V10/80, G06V10/82, G06N3/04, G06N3/08, G06K9/62
CPC: G06N3/08, G06N3/045, G06F18/253
Inventors: 楚博策, 裴新宇, 陈金勇, 陈杰, 高峰, 杨威, 王士成
Owner NO 54 INST OF CHINA ELECTRONICS SCI & TECH GRP