
A fully automatic extraction technology for high-spatial-resolution cultivated land based on deep learning

A deep-learning technology for high-spatial-resolution imagery, applied in the fields of instruments, biological neural network models, and character and pattern recognition. It addresses the high cost of existing methods and achieves cost savings, high efficiency, and fine extraction of cultivated land.

Active Publication Date: 2019-02-12
SUZHOU ZHONGKE IMAGE SKY REMOTE SENSING TECH CO LTD +3

AI Technical Summary

Problems solved by technology

[0005] In order to address the high cost of existing plot-extraction methods and to fully mine the edge information in imagery, the present invention proposes a fully automatic extraction technology for cultivated land plots in high-spatial-resolution remote sensing images based on deep learning. The HED learning model is improved through a large amount of edge-label data and then used to extract edges from the target image; the extracted edges are refined using the convolution result of the Canny edge operator, finally realizing the recognition of cultivated land plots.




Embodiment Construction

[0023] Figure 1 illustrates the main implementation idea of the present invention. The key technical parts include the automatic retrieval of edge-label samples, the training of the deep learning model (HED), and edge post-processing assisted by the boundary extracted with the Canny edge operator; the post-processing also includes accuracy verification of the extracted edges.
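The accuracy verification mentioned above is not detailed in this summary. A common convention for scoring extracted edge maps against reference labels is precision/recall within a small pixel tolerance; the sketch below illustrates that convention in plain NumPy (the metric choice and the tolerance value are assumptions, not taken from the patent):

```python
import numpy as np

def dilate(a, r):
    """Binary dilation by a (2r+1)x(2r+1) square, built from shifts."""
    p = np.pad(a, r, constant_values=False)
    out = np.zeros_like(a, dtype=bool)
    for di in range(-r, r + 1):
        for dj in range(-r, r + 1):
            out |= p[r + di:r + di + a.shape[0], r + dj:r + dj + a.shape[1]]
    return out

def boundary_f1(pred, truth, tol=1):
    """Precision/recall/F1 for binary edge maps; a pixel counts as a hit
    if it lies within `tol` pixels of an edge in the other map."""
    prec = (pred & dilate(truth, tol)).sum() / max(pred.sum(), 1)
    rec = (truth & dilate(pred, tol)).sum() / max(truth.sum(), 1)
    f1 = 2 * prec * rec / max(prec + rec, 1e-9)
    return prec, rec, f1

# toy check: a predicted boundary one pixel off the true boundary
truth = np.zeros((16, 16), dtype=bool); truth[:, 8] = True
pred = np.zeros((16, 16), dtype=bool);  pred[:, 9] = True
prec, rec, f1 = boundary_f1(pred, truth, tol=1)
```

With a one-pixel tolerance the off-by-one prediction scores perfectly, while with `tol=0` it scores zero, which is why edge benchmarks almost always allow a small localization slack.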

[0024] The specific steps are as follows:

[0025] 1) Collect and organize the images of the study area, establish an image database, and use the sample labels prepared in the earlier stage to establish a cultivated-land edge sample database;

[0026] 2) Use the Canny edge operator to extract the boundary of the remote sensing image of the study area;

[0027] 3) Using the selected edge-label sample data and the corresponding image data as constraints, improve the HED model, including its number of network layers and pooling size;

[0028] 4) Use the improved HED model to extract the boundary ...
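Steps 2)-4) amount to fusing a learned edge-probability map with a classical operator's output. The patent gives no code, so the following is a minimal NumPy sketch of such a fusion on a synthetic two-plot scene: a simple gradient detector stands in for the Canny operator, and a placeholder array stands in for the HED output (all thresholds are hypothetical):

```python
import numpy as np

def gradient_edges(img, thresh=0.25):
    """Gradient-magnitude edge map -- a crude stand-in for the Canny
    operator named in the patent (the threshold is hypothetical)."""
    gx = np.zeros_like(img); gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]   # central differences
    gy[1:-1, :] = img[2:, :] - img[:-2, :]
    mag = np.hypot(gx, gy)
    return mag > thresh * mag.max()

def refine(hed_prob, operator_edges, prob_thresh=0.5):
    """Keep only pixels where the learned model and the operator agree."""
    return (hed_prob > prob_thresh) & operator_edges

img = np.zeros((16, 16)); img[:, 8:] = 1.0       # two plots, boundary at col 8
operator_edges = gradient_edges(img)
hed_prob = np.where(operator_edges, 0.9, 0.1)    # placeholder for HED output
edges = refine(hed_prob, operator_edges)
```

The intersection keeps the operator's sharp localization while the learned probability suppresses spurious responses; a real pipeline would of course replace `hed_prob` with the trained HED network's side-output fusion.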



Abstract

The invention discloses a fully automatic extraction technology for high-spatial-resolution cultivated land based on deep learning. The technology is guided by edges extracted with the traditional Canny edge operator and grounded in deep learning theory: it trains the HED deep learning model on a large amount of edge-sample label data, improving the model's number of network layers, pooling size, and other parameters. The new network model is then used to extract cultivated land from imagery, and finally the boundaries extracted by the Canny edge operator are refined and pruned to obtain the skeleton edges of the cultivated land. Compared with traditional manual plot extraction, the technology can effectively improve the efficiency of plot extraction and ensure consistent edge precision.
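The abstract's final refinement step, reducing thick extracted boundaries to skeleton edges, is not specified in detail. One generic possibility is a morphological skeleton (Lantuejoul's erosion/opening formula), sketched here in NumPy on a toy three-pixel-wide boundary; this is an illustrative stand-in, not the patent's actual refinement procedure:

```python
import numpy as np

def erode(a):
    """3x3 binary erosion (pixels beyond the border count as background)."""
    p = np.pad(a, 1, constant_values=False)
    out = np.ones_like(a, dtype=bool)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out &= p[1 + di:1 + di + a.shape[0], 1 + dj:1 + dj + a.shape[1]]
    return out

def dilate(a):
    """3x3 binary dilation, the dual of the erosion above."""
    p = np.pad(a, 1, constant_values=False)
    out = np.zeros_like(a, dtype=bool)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out |= p[1 + di:1 + di + a.shape[0], 1 + dj:1 + dj + a.shape[1]]
    return out

def skeleton(a):
    """Lantuejoul morphological skeleton: union over n of
    erode^n(a) minus its opening."""
    skel = np.zeros_like(a, dtype=bool)
    e = a.copy()
    while e.any():
        opened = dilate(erode(e))
        skel |= e & ~opened
        e = erode(e)
    return skel

band = np.zeros((16, 16), dtype=bool)
band[:, 7:10] = True          # a 3-pixel-thick extracted boundary
skel = skeleton(band)         # collapses to a 1-pixel-wide centre line
```

The thick band collapses to its one-pixel centre column, which is the kind of thin, unified plot edge the abstract describes.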

Description

Technical field

[0001] The present invention proposes a fully automatic technology for extracting farmland plots from high-spatial-resolution remote sensing images. It is mainly applied to the extraction of cultivated land plots from such images and can improve efficiency over manual plot extraction.

Background technique

[0002] One of the important applications of remote sensing technology is the extraction of thematic information. An important link in thematic information extraction is the segmentation of remote sensing images, and image segmentation in turn involves the extraction of object edge information. Therefore, quickly and accurately extracting object edge information is a key step in remote sensing information processing.

[0003] Traditional remote sensing image segmentation with good performance mainly adopts a bottom-up aggregation method. This strategy focuses on the extraction and use of features ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/46; G06N3/04
CPC: G06V20/13; G06V10/44; G06N3/045
Inventor 夏列钢胡晓东周楠张明杰骆剑承郜丽静陈金律刘浩姚飞
Owner SUZHOU ZHONGKE IMAGE SKY REMOTE SENSING TECH CO LTD