
Method and system for generating attention remote sensing image description based on high-low layer feature fusion

A remote sensing image and feature fusion technology, applied in biological neural network models, instruments, electrical digital data processing, etc.; it solves problems such as the inability of prior-art methods to describe input remote sensing images well, and achieves accurate understanding.

Active Publication Date: 2020-10-30
AEROSPACE INFORMATION RES INST CAS

AI Technical Summary

Problems solved by technology

[0010] In order to solve the problem that the input remote sensing image cannot be well described in the prior art, the present invention provides a method for generating an attention remote sensing image description based on high-level and low-level feature fusion, including:



Examples


Embodiment 1

[0054] Embodiment 1: a method for generating an attention remote sensing image description based on high- and low-layer feature fusion, as shown in Figure 1, includes:

[0055] Step 1: Obtain the remote sensing image to be tested;

[0056] Step 2: Obtain a natural language sentence description of the remote sensing image to be tested based on the remote sensing image to be tested and a pre-trained image description model;

[0057] Wherein, the image description model is constructed from an encoder built with a convolutional network, an attention mechanism that fuses high-level and low-level features, and a decoder built with a recurrent network.

[0058] Step 2, obtaining a natural language sentence description of the remote sensing image to be tested based on the remote sensing image to be tested and a pre-trained image description model, is carried out as follows:

[0059] An embodiment of the present invention provides an attention remote sensing image description generation method...
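For orientation, here is a minimal PyTorch-style sketch of the Embodiment 1 pipeline: acquire the image (Step 1), extract low-layer and high-layer features with a convolutional encoder, fuse them with an attention module, and decode a sentence with a recurrent network (Step 2). The class names, the ResNet-18 backbone, the GRU cell, the pooling-based fusion, and the vocabulary size are all illustrative assumptions, not details taken from the patent.

```python
# Hypothetical sketch of the Embodiment 1 pipeline (names are illustrative, not from the patent).
import torch
import torch.nn as nn
import torchvision.models as models


class CNNEncoder(nn.Module):
    """Convolutional encoder exposing a low-layer and a high-layer feature map."""

    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)   # any CNN backbone would do
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1, backbone.relu,
                                  backbone.maxpool, backbone.layer1, backbone.layer2)
        self.top = nn.Sequential(backbone.layer3, backbone.layer4)

    def forward(self, image):
        low = self.stem(image)        # shallow features: local detail
        high = self.top(low)          # deep features: global semantics
        return low, high


class FusionAttention(nn.Module):
    """Attention that weights spatial locations using both feature levels."""

    def __init__(self, low_dim, high_dim, hidden_dim):
        super().__init__()
        self.proj_low = nn.Conv2d(low_dim, hidden_dim, kernel_size=1)
        self.proj_high = nn.Conv2d(high_dim, hidden_dim, kernel_size=1)
        self.score = nn.Conv2d(hidden_dim, 1, kernel_size=1)

    def forward(self, low, high):
        # Bring the low-layer map to the high-layer spatial resolution, then fuse.
        low_ds = nn.functional.adaptive_avg_pool2d(low, high.shape[-2:])
        fused = torch.tanh(self.proj_low(low_ds) + self.proj_high(high))
        alpha = torch.softmax(self.score(fused).flatten(2), dim=-1)   # (B, 1, H*W)
        context = (high.flatten(2) * alpha).sum(-1)                   # (B, C_high)
        return context


class RNNDecoder(nn.Module):
    """Recurrent decoder that emits one word id per step from the attended context."""

    def __init__(self, context_dim, vocab_size, hidden_dim=512):
        super().__init__()
        self.rnn = nn.GRUCell(context_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, context, max_len=20):
        h = torch.zeros(context.size(0), self.rnn.hidden_size)
        words = []
        for _ in range(max_len):
            h = self.rnn(context, h)
            words.append(self.out(h).argmax(-1))
        return torch.stack(words, dim=1)   # (B, max_len) word ids


if __name__ == "__main__":
    image = torch.randn(1, 3, 224, 224)           # Step 1: remote sensing image to be tested
    encoder, attention = CNNEncoder(), FusionAttention(128, 512, 256)
    decoder = RNNDecoder(context_dim=512, vocab_size=1000)
    low, high = encoder(image)                    # Step 2a: low- and high-layer features
    context = attention(low, high)                # Step 2b: high-low layer fusion attention
    sentence_ids = decoder(context)               # Step 2c: natural language sentence (as word ids)
    print(sentence_ids.shape)
```

Greedy word-by-word decoding is used here only to keep the sketch short.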

Embodiment 2

[0101] Based on the same inventive concept, the present invention also provides a system for generating an attention remote sensing image description based on high-level and low-level feature fusion, including:

[0102] A data acquisition module, configured to acquire the remote sensing image to be tested;

[0103] A language generation module, configured to obtain a natural language sentence description of the remote sensing image to be tested based on the remote sensing image to be tested and a pre-trained image description model;

[0104] Wherein, the training of the image description model includes: training the encoder and the decoder based on the remote sensing image and the natural language sentence description information corresponding to the remote sensing image.

[0105] Preferably, the language generation module includes:

[0106] A feature extraction sub-module, which extracts features of the remote sensing image to be tested based on a pre-trained encoder and obtains the global semantic f...
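As a rough illustration of how the Embodiment 2 modules might be wired together, the sketch below defines a data acquisition module and a language generation module that wraps a pre-trained encoder, fusion attention, and decoder (for instance, the classes from the Embodiment 1 sketch above). The class and method names (DataAcquisitionModule, LanguageGenerationModule, acquire, describe) and the preprocessing choices are hypothetical, not taken from the patent.

```python
# Hypothetical module-level wiring of the Embodiment 2 system.
import torch
from torchvision import transforms
from PIL import Image


class DataAcquisitionModule:
    """Acquires the remote sensing image to be tested and converts it to a tensor."""

    def __init__(self, size=224):
        self.to_tensor = transforms.Compose([transforms.Resize((size, size)),
                                             transforms.ToTensor()])

    def acquire(self, path):
        return self.to_tensor(Image.open(path).convert("RGB")).unsqueeze(0)


class LanguageGenerationModule:
    """Runs a pre-trained image description model on the acquired image."""

    def __init__(self, encoder, attention, decoder, vocab):
        self.encoder, self.attention, self.decoder, self.vocab = encoder, attention, decoder, vocab

    @torch.no_grad()
    def describe(self, image_tensor):
        low, high = self.encoder(image_tensor)     # feature extraction sub-module
        context = self.attention(low, high)        # high-low layer feature fusion
        word_ids = self.decoder(context)[0]        # recurrent sentence generation
        return " ".join(self.vocab[int(i)] for i in word_ids)


# Example wiring (using the Embodiment 1 sketch classes and a word list `vocab`):
#   module = LanguageGenerationModule(CNNEncoder(), FusionAttention(128, 512, 256),
#                                     RNNDecoder(512, vocab_size=len(vocab)), vocab)
#   sentence = module.describe(DataAcquisitionModule().acquire("scene.tif"))
```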



Abstract

The invention discloses a method and a system for generating an attention remote sensing image description based on high-low layer feature fusion. The method comprises the following steps: acquiring a to-be-detected remote sensing image; and obtaining a natural language sentence description of the to-be-detected remote sensing image based on the to-be-detected remote sensing image and a pre-trained image description model, wherein the image description model is constructed from an encoder built with a convolutional network, an attention mechanism fusing high- and low-layer features, and a decoder built with a recurrent network. According to the technical scheme provided by the invention, the local detail information of shallow image features and the global semantic information of high-level features are fully utilized, and the traditional attention mechanism is endowed with both global and local awareness, so that the semantic content expressed by the image and its relationship to the natural language sentence are understood more accurately, and a natural language sentence description with accurate content is generated for the remote sensing image.
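One plausible reading of the "global and local awareness" described above is an attention step that scores each local (shallow-layer) feature against both a global semantic vector pooled from high-level features and the decoder's current hidden state. The sketch below is only an interpretation under that assumption; the additive scoring form, layer names, and dimensions are illustrative rather than the patent's actual formulation.

```python
# Illustrative "global + local" dual-awareness attention step (all names are assumptions).
import torch
import torch.nn as nn


class DualAwareAttention(nn.Module):
    def __init__(self, local_dim, global_dim, hidden_dim, attn_dim=256):
        super().__init__()
        self.w_v = nn.Linear(local_dim, attn_dim)    # local detail branch (shallow features)
        self.w_g = nn.Linear(global_dim, attn_dim)   # global semantic branch (deep features)
        self.w_h = nn.Linear(hidden_dim, attn_dim)   # decoder hidden-state branch
        self.score = nn.Linear(attn_dim, 1)

    def forward(self, V, g, h_t):
        # V: (B, N, local_dim) local features; g: (B, global_dim); h_t: (B, hidden_dim)
        e = self.score(torch.tanh(self.w_v(V)
                                  + self.w_g(g).unsqueeze(1)
                                  + self.w_h(h_t).unsqueeze(1)))     # (B, N, 1)
        alpha = torch.softmax(e, dim=1)                              # weights over locations
        return (alpha * V).sum(dim=1)                                # attended local context


if __name__ == "__main__":
    attn = DualAwareAttention(local_dim=128, global_dim=512, hidden_dim=512)
    V, g, h_t = torch.randn(2, 49, 128), torch.randn(2, 512), torch.randn(2, 512)
    print(attn(V, g, h_t).shape)   # torch.Size([2, 128])
```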

Description

Technical field

[0001] The invention relates to the field of image description, in particular to a method and system for generating attention remote sensing image descriptions by fusing high-level and low-level features.

Background technique

[0002] Remote Sensing Description Generation (RSDG) is an important part of the remote sensing image field. The main problem it addresses is understanding the semantic content of remote sensing images and then generating natural language sentence descriptions for them. Therefore, one of the primary problems in remote sensing image description generation is understanding remote sensing image semantics, which helps machines understand the way human vision captures image features; secondly, compared to other problems in the remote sensing image field, such as scene classification (Scene Classification), object detection (Object Detection), and semantic segmentation (Semantic Segmentation) ...


Application Information

IPC(8): G06K9/00; G06K9/62; G06K9/46; G06N3/04; G06F40/216
CPC: G06N3/049; G06F40/216; G06V20/13; G06V10/40; G06N3/045; G06F18/213; G06F18/253; G06F18/214
Inventor 张文凯, 孙显, 许光銮, 张政远, 李轩, 汪勇, 刘文杰
Owner AEROSPACE INFORMATION RES INST CAS