
A convolutional echo state network time sequence classification method based on a multi-head self-attention mechanism

A technology concerning echo state networks and classification methods, applied in the fields of reservoir computing and neural network research, which addresses problems such as degraded model performance and the inability to achieve satisfactory performance, and achieves the effect of a reservoir that requires no training.

Status: Inactive | Publication Date: 2019-06-21
SOUTH CHINA UNIV OF TECH
Cites: 0 | Cited by: 16

AI Technical Summary

Problems solved by technology

Although these works have made some progress, the echo state features generated by a traditional echo state network capture only short-term information, and the resulting loss of feature information over the global space degrades the performance of such models on time series classification to a certain extent.
Therefore, classification methods based on traditional echo state networks cannot achieve satisfactory performance on some complex time series modeling tasks.

Method used



Examples

Embodiment

[0052] As shown in Figure 1, this embodiment discloses a multivariate time series classification method based on a convolutional echo state network with a multi-head self-attention mechanism. The method introduces a multi-head self-attention mapping mechanism into the traditional echo state network: the input time series is projected into several high-dimensional spaces, the resulting high-dimensional features are integrated by encoding their global spatio-temporal relationships, and complex temporal features are thereby captured; finally, a shallow convolutional neural network performs high-precision classification. The convolutional echo state network model based on the multi-head self-attention mechanism is a new type of reservoir computing model for time series classification. The model is established, as shown in Figure 2, through the following steps:

[0053] S1. Network initialization: determine the number of reservoirs and initialize the internal para...
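The initialization named in step S1 can be illustrated with a short numpy sketch of standard echo state network setup: random input weights and a sparse random recurrent matrix rescaled to a chosen spectral radius. The hyperparameter names and values (reservoir_size, spectral_radius, sparsity) are illustrative assumptions, not values fixed by the patent.

    import numpy as np

    def init_reservoir(n_inputs, reservoir_size=100, spectral_radius=0.9,
                       sparsity=0.1, seed=0):
        """Create one fixed, untrained reservoir: random input weights plus a
        sparse random recurrent matrix rescaled to the chosen spectral radius."""
        rng = np.random.default_rng(seed)
        W_in = rng.uniform(-1.0, 1.0, size=(reservoir_size, n_inputs))
        W = rng.uniform(-1.0, 1.0, size=(reservoir_size, reservoir_size))
        W *= rng.random(W.shape) < sparsity                          # sparse connectivity
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # echo state property
        return W_in, W

    # Example: several independent reservoirs, one per high-dimensional projection of the input series
    reservoirs = [init_reservoir(n_inputs=3, seed=s) for s in range(4)]

Because the reservoir weights stay fixed after this step, only the downstream readout is ever fitted, which is where the training efficiency of echo state networks comes from.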



Abstract

The invention discloses a convolutional echo state network time series classification method based on a multi-head self-attention mechanism. The method is built from two components, an encoder and a decoder. The encoder consists of echo state networks equipped with a multi-head self-attention mechanism: the input time series is first mapped into high-dimensional space by several echo state networks to generate the original echo state features, and the global spatio-temporal information of these echo state features is then re-encoded by a self-attention mechanism, so that the re-integrated high-dimensional feature representation has greater discriminative capability. Finally, a shallow convolutional neural network serves as the decoder to achieve high-precision classification. The classification model established by this method inherits the high training efficiency of the echo state network, re-encodes the spatio-temporal feature information by introducing a self-attention mechanism without additional parameters, achieves high-precision time series classification, and is a simple and efficient model.
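The encoder-decoder pipeline described in the abstract can be sketched in a few lines of numpy. This is an illustrative interpretation rather than the patented implementation: the multi-head attention is taken to be parameter-free (queries, keys and values are slices of the echo state features themselves, in line with "without additional parameters"), and the shallow convolutional decoder is reduced to a single untrained 1-D convolution with global average pooling and a softmax readout; all layer sizes are placeholders.

    import numpy as np

    def multi_head_self_attention(H, num_heads=4):
        """Re-encode echo state features H (T x D) with parameter-free scaled
        dot-product self-attention, one head per slice of the feature dimension."""
        T, D = H.shape
        d = D // num_heads
        heads = []
        for h in range(num_heads):
            X = H[:, h * d:(h + 1) * d]                  # (T, d) slice for this head
            scores = X @ X.T / np.sqrt(d)                # pairwise time-step similarity
            scores -= scores.max(axis=1, keepdims=True)  # numerical stability
            A = np.exp(scores)
            A /= A.sum(axis=1, keepdims=True)            # row-wise softmax
            heads.append(A @ X)                          # globally re-encoded slice
        return np.concatenate(heads, axis=1)             # (T, D)

    def shallow_cnn_decode(Z, n_classes=5, n_filters=16, kernel=3, seed=0):
        """Toy shallow convolutional decoder: 1-D convolution + ReLU + global
        average pooling + linear softmax, with random (untrained) weights."""
        rng = np.random.default_rng(seed)
        T, D = Z.shape
        K = 0.1 * rng.standard_normal((n_filters, kernel, D))
        conv = np.array([[sum(Z[t + j] @ K[f, j] for j in range(kernel))
                          for f in range(n_filters)]
                         for t in range(T - kernel + 1)])
        feat = np.maximum(conv, 0.0).mean(axis=0)        # global average pooling
        logits = feat @ (0.1 * rng.standard_normal((n_filters, n_classes)))
        p = np.exp(logits - logits.max())
        return p / p.sum()                               # class probabilities

    # Example: 100 time steps of 64-dimensional echo state features
    H = np.random.randn(100, 64)
    print(shallow_cnn_decode(multi_head_self_attention(H)))

In practice the decoder's filters and readout would be trained on labelled series; the sketch only shows the data flow: echo states, then global spatio-temporal re-encoding, then shallow convolutional classification.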

Description

Technical field
[0001] The invention relates to the technical fields of reservoir computing and neural network research, and in particular to a convolutional echo state network time series classification method based on a multi-head self-attention mechanism.
Background technique
[0002] As one type of recurrent neural network, the echo state network plays a very important role in the field of time series prediction. The general architecture of a traditional echo state network consists of three components: an input layer, a hidden layer (the reservoir) and an output layer. Thanks to the memory capability of the high-dimensional reservoir, an echo state network can usually achieve good performance on simple time series forecasting with nothing more than a ridge-regression readout.
[0003] As the application of echo state networks to time series prediction has continued to expand, some researchers have tried to extend them to time series classification tasks. It is worth noting that ...
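As a concrete illustration of the traditional echo state network described in paragraph [0002], the sketch below (a simplified, assumption-laden example, not the invention itself) drives a fixed random reservoir with an input series and fits the output layer in closed form with ridge regression; W_in and W could come from an initializer like the one sketched under step S1 above.

    import numpy as np

    def run_reservoir(U, W_in, W):
        """Drive a fixed reservoir with the input series U (T x n_inputs) and
        return the echo states (T x reservoir_size) from the tanh update."""
        x = np.zeros(W.shape[0])
        states = []
        for u in U:
            x = np.tanh(W_in @ u + W @ x)   # classic (non-leaky) state update
            states.append(x)
        return np.array(states)

    def train_readout(X, Y, ridge=1e-6):
        """Closed-form ridge regression for the output layer:
        solves (X^T X + ridge * I) W_out = X^T Y."""
        return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

    # Example: one-step-ahead prediction on a toy sine series
    rng = np.random.default_rng(1)
    W_in = rng.uniform(-1, 1, (100, 1))
    W = rng.uniform(-1, 1, (100, 100))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # spectral radius 0.9
    u = np.sin(np.linspace(0, 20, 500))[:, None]
    X = run_reservoir(u[:-1], W_in, W)
    W_out = train_readout(X, u[1:])
    print(np.mean((X @ W_out - u[1:]) ** 2))             # training error

Only W_out is learned here; the reservoir itself stays random and fixed, which is the training-efficiency property the invention preserves.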

Claims


Application Information

IPC(8): G06K9/62, G06N3/04, G06N3/08
Inventor: 马千里, 李森, 黄德森
Owner: SOUTH CHINA UNIV OF TECH