
Urban remote sensing image vegetation coverage identification method based on deep learning

A technology relating to vegetation coverage and remote sensing images, applied in the field of remote sensing recognition. It addresses problems such as the limitations of existing recognition methods and the complexity of urban building facilities and components, and achieves the effects of simplifying processing content, improving processing efficiency and accuracy, and improving recognition accuracy.

Pending Publication Date: 2022-05-24
SHANDONG INSPUR SCI RES INST CO LTD

AI Technical Summary

Problems solved by technology

[0004] Urban vegetation greening is an important part of the urban ecological management system, and applying remote sensing technology to the information extraction and data analysis of vegetation coverage has become an important applied research direction. Due to the complexity of urban building facilities and components, traditional identification methods are limited; on the other hand, using remote sensing technology alone to identify vegetation coverage also has certain limitations.


Examples


Embodiment 1

[0053] In step d), the urban vegetation coverage recognition model CPC-Det is composed of the remote sensing image vegetation coverage recognition model RS-PDet, the social media resource vegetation coverage recognition model Social-PDet, the map resource vegetation coverage recognition model Map-PDet, the video resource vegetation coverage recognition model Video-PDet, the environmental monitoring vegetation coverage recognition model Iot-Det, and the fusion recognition model PFusion.
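The paragraph above only names the sub-models and the fusion model PFusion; it does not specify how their outputs are combined. The following Python sketch is a hypothetical illustration in which each sub-model reports a coverage estimate with a confidence, and a confidence-weighted average stands in for the fusion step; the field names, weights, and averaging rule are assumptions, not the patent's method (which trains the fusion model).

```python
# Hypothetical sketch of the CPC-Det composition described in [0053].
# The confidence-weighted average below is an illustrative assumption.
from dataclasses import dataclass
from typing import Dict

@dataclass
class SubModelOutput:
    coverage: float    # estimated vegetation coverage ratio for a city block, 0..1
    confidence: float  # self-reported confidence of the sub-model, 0..1

def pfusion(outputs: Dict[str, SubModelOutput]) -> float:
    """Fuse per-source coverage estimates into one CPC-Det prediction."""
    total_w = sum(o.confidence for o in outputs.values())
    if total_w == 0:
        return 0.0
    return sum(o.coverage * o.confidence for o in outputs.values()) / total_w

# Example: combine the five named sub-models for one urban block (toy values).
estimates = {
    "RS-PDet":     SubModelOutput(coverage=0.42, confidence=0.9),
    "Social-PDet": SubModelOutput(coverage=0.38, confidence=0.5),
    "Map-PDet":    SubModelOutput(coverage=0.45, confidence=0.7),
    "Video-PDet":  SubModelOutput(coverage=0.40, confidence=0.6),
    "Iot-Det":     SubModelOutput(coverage=0.44, confidence=0.4),
}
print(f"CPC-Det fused coverage: {pfusion(estimates):.3f}")
```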

Embodiment 2

[0055] The establishment of the remote sensing image vegetation coverage recognition model RS-PDet includes the following steps:

[0056] 1.1) Collect urban remote sensing image data and mark vegetation coverage areas;

[0057] 1.2) Select the Faster R-CNN model, the SSD model, or the YOLO model as the general overhead-view vegetation recognition model; the selected model incorporates the CNN vegetation image recognition module RS-CNN;

[0058] 1.3) According to the input image size requirements of the CNN vegetation image recognition module RS-CNN, cut the remote sensing image data using road information and river information as markers;

[0059] 1.4) Train the RS-CNN model of the CNN vegetation image recognition module;

[0060] 1.5) Based on the vegetation index NDVI, extract vegetation by fusing spectral and texture features, and obtain the remote sensing image analysis module RS-ANL to analyze and...
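Step 1.5) relies on the vegetation index NDVI computed from spectral bands. The sketch below is a minimal illustration of that spectral part only, assuming a cut remote sensing tile provides red and near-infrared bands as NumPy arrays; the texture-feature fusion and the RS-ANL module are not reproduced, and the 0.3 threshold is an illustrative choice, not a value from the patent.

```python
# Minimal NDVI sketch for the spectral step in 1.5), under assumed band inputs.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    nir = nir.astype(np.float32)
    red = red.astype(np.float32)
    return (nir - red) / np.maximum(nir + red, 1e-6)

def vegetation_mask(nir: np.ndarray, red: np.ndarray, threshold: float = 0.3) -> np.ndarray:
    """Binary vegetation mask; the threshold is an illustrative assumption."""
    return ndvi(nir, red) > threshold

# Example with synthetic band data for a 256x256 tile.
rng = np.random.default_rng(0)
nir_band = rng.integers(0, 255, size=(256, 256))
red_band = rng.integers(0, 255, size=(256, 256))
mask = vegetation_mask(nir_band, red_band)
print(f"Estimated vegetation coverage of tile: {mask.mean():.2%}")
```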

Embodiment 3

[0064] The establishment of the social media resource vegetation coverage recognition model Social-PDet includes the following steps:

[0065] 2.1) Collect social media data, and label semantic text and images related to vegetation coverage;

[0066] 2.2) Use the BERT natural language processing model to set vegetation coverage semantic keywords, and train on the collected social media data to obtain a vegetation coverage text semantic recognition model;

[0067] 2.3) Use the SSD image object detection model, trained with gradient descent on social media images, to obtain a vegetation coverage image detection model;

[0068] 2.4) Integrate the text and image of the same social media record and, based on the vegetation coverage text semantic recognition model and the vegetation coverage image detection model, train with gradient descent to obtain the social media record vegetation coverage recognition model (see the fusion sketch after this list);

[0069] 2.5) Input several p...
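Steps 2.2) through 2.4) combine a text semantic score and an image detection score for each social media record and train the combination with gradient descent. The sketch below, referenced from step 2.4), is a hypothetical stand-in: the BERT and SSD models are replaced by pre-computed per-record scores, and a small logistic fusion layer is trained with gradient descent. The layer shape, learning rate, and toy data are assumptions for illustration only.

```python
# Hypothetical sketch of the record-level fusion in step 2.4).
import torch
import torch.nn as nn

class RecordFusion(nn.Module):
    def __init__(self):
        super().__init__()
        # Learnable weights over [text_score, image_score] -> vegetation probability.
        self.linear = nn.Linear(2, 1)

    def forward(self, scores: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.linear(scores)).squeeze(-1)

# Toy training data: (text_score, image_score) pairs with a vegetation label.
scores = torch.tensor([[0.9, 0.8], [0.2, 0.1], [0.7, 0.3], [0.1, 0.6]])
labels = torch.tensor([1.0, 0.0, 1.0, 0.0])

model = RecordFusion()
optimizer = torch.optim.SGD(model.parameters(), lr=0.5)
loss_fn = nn.BCELoss()

for _ in range(200):  # plain gradient descent, as the embodiment describes
    optimizer.zero_grad()
    loss = loss_fn(model(scores), labels)
    loss.backward()
    optimizer.step()

print("Fused vegetation probabilities:", model(scores).detach().numpy())
```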


Abstract

An urban remote sensing image vegetation coverage identification method based on deep learning fully considers the spectral characteristics of remote sensing images and the characteristics of urban vegetation coverage, and makes the greatest possible use of internet online resources by fusing multiple technologies such as big data, remote sensing technology, and deep learning, realizing urban vegetation coverage identification based on remote sensing images. The urban vegetation coverage recognition model is designed by combining internet resources such as social media, aerial video, map data, public monitoring video, and data collected by environment monitoring devices, achieving accurate recognition of urban vegetation coverage. Internet resources are used reasonably, and data contributed collectively through social media are comprehensively analyzed. The fusion of multiple data sources improves recognition accuracy and provides strong support for urban vegetation surveys and for more reasonable and effective ecological planning and construction. In addition, additional information from online maps, such as roads, place data, and points of interest, is utilized, which simplifies the processing content of urban vegetation coverage recognition.

Description

Technical field

[0001] The invention relates to the technical field of remote sensing recognition, and in particular to a deep learning-based urban remote sensing image vegetation coverage identification method.

Background technique

[0002] With the rapid development of deep learning technology, supported by massive data and efficient computing power in the internet and cloud computing era, deep learning technology represented by convolutional neural networks (CNN) can be trained to build large-scale neural networks with a structure similar to that of the human brain. It has made breakthroughs in fields such as computer vision, speech recognition, and natural language understanding, and is bringing disruptive changes to society as a whole.

[0003] In recent years, remote sensing technology has been widely used. Multispectral images and panchromatic images obtained by satellites can be used to form remote sensing images with higher spatial resolution and spectral resolution throug...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06V10/26, G06V10/774, G06K9/62, G06N3/04, G06V20/10
CPC: G06N3/045, G06F18/214, Y02A30/60
Inventor: 孙善宝
Owner: SHANDONG INSPUR SCI RES INST CO LTD