
Foundation cloud picture classification method based on heterogeneous feature fusion network

A ground-based cloud image classification technology based on heterogeneous feature fusion, applied in the field of pattern recognition. It addresses the problems of time-consuming and unreliable manual methods and the rising cost of human-eye observation, and achieves strong robustness, improved generalization ability, and good recognition results.

Active Publication Date: 2020-11-20
HOHAI UNIV
Cites: 2 · Cited by: 7

AI Technical Summary

Problems solved by technology

Traditional cloud classification methods rely on expert experience; they are unreliable, time-consuming, and depend to some extent on the operator's experience, so the classification results usually carry some uncertainty and bias. In addition, human-eye observation has become increasingly costly.



Examples


Embodiment Construction

[0040] As shown in Figure 1, the ground-based cloud image classification method of the present invention is divided into two parts: a manual (handcrafted) feature extraction process and a deep semantic feature extraction process. The manual feature extraction process first applies different preprocessing methods to the images of the different data sets, then uses a spatial pyramid strategy to divide each image into local regions of different sizes, and then extracts SIFT features, local binary pattern features, structural texture features based on the gray-level co-occurrence matrix, and color features based on the difference between the red and blue channels. The feature vectors are then transformed into Fisher vectors to solve the problem of inconsistent feature lengths within a single image, and finally the feature vectors of all local regions of each image are concatenated to obtain the manual feature vector of that image, and the Fisher vector is put into ...
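To make the handcrafted branch concrete, the following is a minimal Python sketch of the pipeline described above (spatial pyramid regions, SIFT/LBP/GLCM/red-blue features, Fisher-vector encoding), assuming OpenCV, scikit-image and scikit-learn are available. The pyramid levels, descriptor parameters and the GMM used for Fisher-vector encoding are illustrative placeholders, not the exact parameters specified in the patent; the GMM is assumed to be fitted beforehand on region descriptors with covariance_type="diag".

```python
import numpy as np
import cv2
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops
from sklearn.mixture import GaussianMixture  # fitted offline with covariance_type="diag"

def spatial_pyramid_regions(img, levels=(1, 2, 4)):
    """Split an image into local regions at several spatial-pyramid levels."""
    h, w = img.shape[:2]
    regions = []
    for n in levels:
        hs, ws = h // n, w // n
        for i in range(n):
            for j in range(n):
                regions.append(img[i * hs:(i + 1) * hs, j * ws:(j + 1) * ws])
    return regions

def region_descriptors(region):
    """Stack SIFT, LBP, GLCM texture and red-blue colour cues for one region."""
    gray = cv2.cvtColor(region, cv2.COLOR_BGR2GRAY)

    # SIFT local descriptors (128-D each); may be empty for flat sky regions.
    _, sift_desc = cv2.SIFT_create().detectAndCompute(gray, None)
    sift_desc = sift_desc if sift_desc is not None else np.zeros((1, 128))

    # Uniform LBP histogram as a texture descriptor.
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)

    # Gray-level co-occurrence matrix statistics.
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    glcm_feat = np.array([graycoprops(glcm, p)[0, 0]
                          for p in ("contrast", "homogeneity", "energy", "correlation")])

    # Colour cue: normalised red-blue channel difference (cloud vs. clear sky).
    b, _, r = cv2.split(region.astype(np.float32))
    rb = (r - b) / (r + b + 1e-6)
    color_feat = np.array([rb.mean(), rb.std()])

    # Tile the region-level cues onto every SIFT row so all rows share one length.
    global_feat = np.concatenate([lbp_hist, glcm_feat, color_feat])
    return np.hstack([sift_desc, np.tile(global_feat, (sift_desc.shape[0], 1))])

def fisher_vector(descriptors, gmm):
    """Simplified Fisher vector: gradients w.r.t. GMM means and variances."""
    q = gmm.predict_proba(descriptors)                    # (N, K) posteriors
    d = descriptors[:, None, :] - gmm.means_[None, :, :]  # (N, K, D)
    sig = np.sqrt(gmm.covariances_)                       # diagonal std devs, shape (K, D)
    grad_mu = (q[:, :, None] * d / sig).sum(0)
    grad_sig = (q[:, :, None] * (d ** 2 / sig ** 2 - 1)).sum(0)
    fv = np.hstack([grad_mu.ravel(), grad_sig.ravel()]) / len(descriptors)
    fv = np.sign(fv) * np.sqrt(np.abs(fv))                # power normalisation
    return fv / (np.linalg.norm(fv) + 1e-12)              # L2 normalisation

def handcrafted_vector(img, gmm):
    """Concatenate the Fisher vectors of all pyramid regions of one image."""
    return np.concatenate([fisher_vector(region_descriptors(r), gmm)
                           for r in spatial_pyramid_regions(img)])
```

Encoding each region separately and concatenating the per-region Fisher vectors preserves coarse spatial layout while giving every image a fixed-length handcrafted feature, which is what allows the later fully connected layers to consume it.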


Abstract

The invention discloses a ground-based cloud image (foundation cloud picture) classification method based on a heterogeneous feature fusion network. The method comprises the following steps: (1) preprocessing a plurality of noisy ground-based cloud images; (2) extracting features from each processed image, combining a manual feature extraction method with Fisher vector coding to obtain a feature vector for each image; (3) inputting the output of step (2) into a four-layer fully connected network, whose output is recorded as fc; and (4) after augmenting the training set, training a convolutional neural network model, fusing fc with the deep semantic feature fg obtained from the last pooling layer, and obtaining the classification probability of each class through a fully connected layer. The method can remarkably improve the generalization ability of the ground-based cloud image classification and recognition task, the model is highly robust, visual information is combined from multiple angles, the cloud shape can be accurately located even when noise is artificially added, and good recognition results are obtained.
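The fusion in steps (3) and (4) can be illustrated with a short PyTorch sketch. The CNN backbone, layer widths, handcrafted feature dimension and number of cloud classes below are assumptions chosen for illustration, not the architecture fixed by the patent; only the overall two-branch structure (four fully connected layers producing fc, deep feature fg after the last pooling layer, concatenation, final fully connected classifier) follows the abstract.

```python
import torch
import torch.nn as nn

class HeterogeneousFusionNet(nn.Module):
    """Two-branch fusion sketch: a 4-layer MLP over the handcrafted Fisher-vector
    features (output fc) fused with deep semantic features fg taken after the CNN's
    last pooling layer, followed by a final fully connected classifier."""

    def __init__(self, handcrafted_dim, num_classes, hidden=(1024, 512, 256, 128)):
        super().__init__()
        # Four fully connected layers producing the handcrafted embedding fc.
        layers, in_dim = [], handcrafted_dim
        for h in hidden:
            layers += [nn.Linear(in_dim, h), nn.ReLU(inplace=True)]
            in_dim = h
        self.fc_branch = nn.Sequential(*layers)

        # Placeholder CNN backbone; the patent abstract does not fix one here.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),          # "last pooling layer" -> fg (B, 64, 1, 1)
        )

        # Final fully connected layer over the fused [fc, fg] vector.
        self.classifier = nn.Linear(hidden[-1] + 64, num_classes)

    def forward(self, image, handcrafted):
        fc = self.fc_branch(handcrafted)       # handcrafted embedding
        fg = self.cnn(image).flatten(1)        # deep semantic feature
        fused = torch.cat([fc, fg], dim=1)     # heterogeneous feature fusion
        return self.classifier(fused)          # per-class logits

# Usage: softmax over the logits gives the classification probability of each class.
model = HeterogeneousFusionNet(handcrafted_dim=2048, num_classes=7)  # dims are hypothetical
probs = torch.softmax(model(torch.randn(2, 3, 224, 224), torch.randn(2, 2048)), dim=1)
```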

Description

Technical Field

[0001] The invention relates to the technical field of pattern recognition, and in particular to a ground-based cloud image classification method based on a heterogeneous feature fusion network.

Background

[0002] In the field of meteorological research, comprehensive and consistent observation of clouds is very important. In meteorological forecasting, macroscopic parameters such as cloud amount and cloud type play a crucial role. Classification methods for ground-based cloud images have been studied extensively in recent decades. Traditional cloud classification methods rely on expert experience; they are unreliable, time-consuming, and depend to some extent on the operator's experience, so the classification results usually carry some uncertainty and bias. In addition, human-eye observation has become increasingly costly. Therefore, an automatic and accurate cloud classification method is urgently needed.

[0003] In recent years, many i...

Claims


Application Information

IPC(8): G06K 9/62; G06N 3/04; G06N 3/08
CPC: G06N 3/08; G06N 3/047; G06N 3/045; G06F 18/241; G06F 18/2415; G06F 18/253
Inventor: 王敏, 付昱承, 储荣, 朱首贤
Owner: HOHAI UNIV