Multi-source data synergetic refined forest vegetation type information remote sensing updating method
An information-processing and multi-source data technology, applied in the field of radio wave measurement systems and instruments. It addresses problems that are difficult to overcome with a single data source, such as coarse land cover and vegetation type classification, through the collaborative application of multi-source data and knowledge.
Embodiment Construction
[0015] Taking Sichuan Province as an example, the specific implementation of the invention is described as follows:
[0016] 1. Data preparation: collect later-stage land use vector data and extract the forest classes (mainly forest land, sparse forest land, and shrub land); for each season, select two MODIS scenes with little cloud cover; collect early-stage multi-level forest vegetation type vector data, such as that extracted from the 1:1,000,000 vegetation map of China. Based on the phenological characteristics of the main ground objects in Sichuan Province, the MODIS scenes imaged on January 25 and February 26 (winter), March 13 and April 6 (spring), and June 17 and August 28 (summer) were selected.
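The seasonal scene selection above can be sketched as a simple filter over a scene catalog: group candidates by season and keep the least-cloudy scenes. The record structure, field names, and cloud percentages below are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of per-season MODIS scene selection (hypothetical catalog).
from collections import defaultdict

def select_scenes(scenes, per_season=2):
    """Group candidate scenes by season and keep the least-cloudy ones."""
    by_season = defaultdict(list)
    for scene in scenes:
        by_season[scene["season"]].append(scene)
    selection = {}
    for season, candidates in by_season.items():
        # Sort by cloud percentage so the clearest scenes come first.
        candidates.sort(key=lambda s: s["cloud_pct"])
        selection[season] = candidates[:per_season]
    return selection

# Illustrative catalog; cloud percentages are invented for the example.
catalog = [
    {"date": "01-25", "season": "winter", "cloud_pct": 3},
    {"date": "02-26", "season": "winter", "cloud_pct": 5},
    {"date": "02-10", "season": "winter", "cloud_pct": 40},
    {"date": "03-13", "season": "spring", "cloud_pct": 8},
    {"date": "04-06", "season": "spring", "cloud_pct": 6},
    {"date": "06-17", "season": "summer", "cloud_pct": 7},
    {"date": "08-28", "season": "summer", "cloud_pct": 4},
]
picked = select_scenes(catalog)
```

In practice such metadata would come from a MODIS product catalog (e.g. per-granule cloud-cover attributes), but the selection logic is the same: two low-cloud scenes per season.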
[0017] 2. Geometric registration: taking the later-stage land use vector data as the reference, use a polynomial geometric correction model to perform geometric registration of the selected multi-temporal MODIS images.
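Polynomial geometric correction typically fits a low-order 2-D polynomial from ground control points (GCPs) by least squares and then warps the image through it. A minimal sketch of the fitting step, assuming a second-order polynomial and synthetic GCPs (the patent does not specify the polynomial order):

```python
# Second-order polynomial geometric correction fitted to ground control points.
import numpy as np

def poly2_design(pts):
    """Design matrix of second-order terms: 1, x, y, xy, x^2, y^2."""
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_poly2(src, dst):
    """Least-squares fit of a mapping from source (x, y) to reference (x', y')."""
    coeffs, *_ = np.linalg.lstsq(poly2_design(src), dst, rcond=None)
    return coeffs  # shape (6, 2): one coefficient column per output coordinate

def apply_poly2(coeffs, pts):
    """Map points through the fitted polynomial."""
    return poly2_design(pts) @ coeffs

# Synthetic GCPs: reference coordinates follow a known affine distortion,
# which a second-order polynomial can reproduce exactly.
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, size=(20, 2))
dst = src @ np.array([[1.02, 0.01], [-0.01, 0.98]]) + np.array([5.0, -3.0])

coeffs = fit_poly2(src, dst)
max_residual = np.abs(apply_poly2(coeffs, src) - dst).max()
```

A second-order model needs at least 6 well-distributed GCPs; using more and solving by least squares, as here, averages out individual GCP errors.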