
Motor imagery classification method based on convolutional neural network

A motor imagery classification method based on a convolutional neural network, applied in the field of information technology. It addresses the problem that existing methods cannot fully extract the features of EEG signals, which affects the classification results, and achieves improved recognition accuracy and good portability.

Active Publication Date: 2020-02-07
XIDIAN UNIV
Cites: 6 · Cited by: 31

AI Technical Summary

Problems solved by technology

Because existing neural network methods use a single temporal convolution kernel for time-domain convolution, they cannot fully extract the features of the EEG signal, which degrades the final classification result.
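The parallel multi-scale idea that the abstract attributes to this method can be sketched in plain NumPy: convolve the same EEG channel with temporal kernels of several different lengths and stack the resulting feature maps. The kernel sizes and moving-average weights below are illustrative assumptions, not the patent's trained parameters:

```python
import numpy as np

def multiscale_temporal_conv(signal, kernel_sizes=(5, 15, 25)):
    """Convolve one EEG channel with temporal kernels of different
    lengths and stack the resulting feature maps. The kernels here are
    simple moving averages for illustration; in the actual network the
    kernel weights would be learned during training."""
    features = []
    for k in kernel_sizes:
        kernel = np.ones(k) / k                       # placeholder weights
        features.append(np.convolve(signal, kernel, mode="same"))
    return np.stack(features)                         # (n_scales, n_samples)

# toy 1-second signal sampled at 250 Hz (a 10 Hz oscillation)
sig = np.sin(2 * np.pi * 10 * np.arange(250) / 250.0)
feats = multiscale_temporal_conv(sig)
print(feats.shape)  # (3, 250)
```

Short kernels respond to fast transients while long kernels capture slower rhythms, which is why a single fixed kernel length can miss part of the signal's structure.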




Embodiment Construction

[0037] Embodiments of the present invention are described in detail below in conjunction with the accompanying drawings:

[0038] Referring to figure 1, this embodiment is divided into two parts: the first part generates the final convolutional neural network, and the second part uses that network for online experiments. The specific implementation steps are as follows:
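The two-part flow can be sketched as a minimal skeleton. All function bodies here are stand-ins so the sketch runs end to end; the patent's actual preprocessing, network architecture, and training procedure are not reproduced:

```python
import random

def preprocess(trial):
    # Stand-in for the patent's preprocessing (e.g. filtering, normalisation):
    # here we simply remove the mean of the segment.
    mean = sum(trial) / len(trial)
    return [x - mean for x in trial]

def train_network(train_trials, val_trials):
    # Part 1: train, validate, test, and fine-tune the CNN.
    # A trivial stub classifier stands in for the trained network.
    return lambda trial: int(sum(trial) >= 0)

def online_classify(model, live_trial):
    # Part 2: classify one EEG segment acquired in real time.
    return model(preprocess(live_trial))

random.seed(0)
trials = [[random.gauss(0, 1) for _ in range(250)] for _ in range(20)]
model = train_network(trials[:15], trials[15:])
prediction = online_classify(model, trials[0])
assert prediction in (0, 1)
```

The key design point in this structure is that the expensive work (training and fine-tuning) happens offline in part 1, so the online loop in part 2 only runs a forward pass per incoming segment.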

[0039] 1. Generate the final convolutional neural network

[0040] Step 1: collect motor imagery EEG data.

[0041] Referring to figure 2, this step is implemented as follows:

[0042] (1a) Experimental paradigm:

[0043] The subjects conduct the experiment according to the motor imagery experimental paradigm. Each trial consists of four states in chronological order: a preparation state, a motor imagery state, an intermittent state, and a waiting state, where:

[0044] In the preparation state, a crosshair first appears on the screen to...
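The four-state trial structure above can be sketched as a simple timeline. The durations below are purely illustrative assumptions, since the actual timings are truncated in this excerpt:

```python
# Hypothetical trial timeline for the four-state paradigm; the
# durations are illustrative assumptions, not the patent's values.
TRIAL_STATES = [
    ("preparation", 2.0),    # crosshair shown on screen
    ("motor_imagery", 4.0),  # subject imagines the cued movement
    ("intermittent", 2.0),   # short rest between imagery and waiting
    ("waiting", 1.0),        # wait for the next trial to begin
]

def state_at(t):
    """Return the paradigm state active at time t (seconds) in a trial."""
    elapsed = 0.0
    for name, duration in TRIAL_STATES:
        elapsed += duration
        if t < elapsed:
            return name
    return "waiting"

print(state_at(3.0))  # motor_imagery
```

Only the samples recorded during the motor imagery state would be labelled and used for training; the other states segment and pace the recording session.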



Abstract

The invention discloses a convolutional neural network method based on parallel multi-scale temporal convolution kernels, and mainly solves the prior-art problems of low detection accuracy and the difficulty of effectively detecting a user's imagined movement. According to the implementation scheme, the method comprises the following steps: collecting motor imagery electroencephalogram (EEG) data, preprocessing it, and building a data set from the preprocessed EEG data; constructing a convolutional neural network, training it with the training and validation sets, testing it with the test set, and fine-tuning the tested network with the subject's own EEG data to obtain a final convolutional neural network suited to that subject for online experiments; and acquiring the subject's online motor imagery EEG signal in real time and feeding it to the final convolutional neural network to obtain a real-time classification result. The method can effectively detect the user's imagined movement, improves the classification accuracy of motor imagery EEG signals, and can be used in medical services as an auxiliary tool in the rehabilitation of stroke patients.

Description

technical field

[0001] The invention belongs to the field of information technology, in particular to a classification method for electroencephalogram (EEG) signals, which can be used in medical services.

Background technique

[0002] EEG signals are generated by the bioelectrical activity of neurons in the brain and belong to spontaneous potential activity. EEG signals are usually divided into different rhythms according to frequency band: δ, θ, α, β, and γ. EEG signals of different rhythms reflect different physiological and psychological states of the human body, among which:

[0003] The δ rhythm lies mainly in the 1-4 Hz band and chiefly reflects deep sleep or particular brain diseases;

[0004] The θ rhythm lies mainly in the 4-8 Hz band and chiefly reflects early sleep, meditation, drowsiness, or depression;

[0005] The α rhythm lies mainly in the 8-12 Hz band. In additio...
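The band definitions above suggest a simple spectral check: estimate per-band power with an FFT and see which rhythm dominates a signal. The sketch below is an illustration only; the β range (12-30 Hz) and the toy signal are assumptions, and the patent itself does not prescribe this computation:

```python
import numpy as np

# EEG rhythm bands (Hz) from the description above; the β range is a
# commonly used convention, assumed here since the text is truncated.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}

def band_power(signal, fs, band):
    """Average spectral power of `signal` inside `band` via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    lo, hi = band
    mask = (freqs >= lo) & (freqs < hi)
    return power[mask].mean()

fs = 250
t = np.arange(fs * 2) / fs           # 2 s of data at 250 Hz
sig = np.sin(2 * np.pi * 10 * t)     # pure 10 Hz oscillation (α band)
powers = {name: band_power(sig, fs, rng) for name, rng in BANDS.items()}
print(max(powers, key=powers.get))  # alpha
```

A pure 10 Hz oscillation lands squarely in the α band, so its band power dominates the other rhythms, mirroring how the rhythm bands partition the EEG spectrum.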

Claims


Application Information

Patent Type & Authority Applications(China)
IPC (IPC8): G06K9/00; G06K9/62
CPC: G06F2218/02; G06F2218/08; G06F2218/12; G06F18/241
Inventor: 李甫, 吴昊, 晁伟兵, 石光明, 付博勋, 牛毅, 冀有硕, 董明皓, 王晓甜
Owner XIDIAN UNIV