
Character input method based on combination of brain signals and voice

A character input technology combining brain signals and voice, applied in the field of human-computer interaction, which addresses the problems of reduced operation accuracy under fatigue, the long time needed to input a single Chinese character, and accuracy that cannot be guaranteed.

Active Publication Date: 2021-04-23
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0003] The invention with application number 201110269640.9, "A Design Scheme of a Virtual Keyboard for Inputting Chinese Characters Using Brain Waves", and the invention with application number 201710582561.0, "Efficient Brain-Controlled Chinese Input Method Based on Motor Visual Evoked Potentials", both rely solely on brain-computer interface technology to realize character input. For disabled patients, character input implemented purely with brain-computer interface technology has the following disadvantages: (1) Chinese input requires multi-step spelling with pinyin or strokes; the steps are cumbersome, inputting a single Chinese character takes a long time, and timeliness cannot be guaranteed.
(2) Long-term operation makes the user fatigued, and in the fatigued state the accuracy of the user's operation drops accordingly, so input becomes unstable and accuracy cannot be guaranteed.
Character input that relies purely on brain-computer interface technology is thus limited in both accuracy and timeliness, while pure speech recognition technology, although highly efficient for input, also has accuracy problems.


Examples


Embodiment 1

[0026] Referring to Figures 1 to 6, the present invention provides a technical solution: a character input method based on the combination of brain signals and voice. The system is composed of four parts: an evoked stimulus module, a signal acquisition module, a signal analysis module and a control module. The specific steps of the method are as follows:
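
To make the four-part structure easier to follow, here is a minimal sketch of how the modules might be represented in code. Only the module names and the figure of 44 P300 buttons flashing over n rounds come from the patent text; the class layout, field names and default values are illustrative assumptions.

```python
# Illustrative sketch of the four modules named in the patent; the class
# structure, field names and default values are assumptions, not the patent's.
from dataclasses import dataclass, field
from typing import List


@dataclass
class EvokedStimulusModule:
    """Presents the flashing P300 keyboard to the user."""
    n_buttons: int = 44          # 44 P300 buttons, per the patent text
    n_rounds: int = 5            # the patent calls this "n"; 5 is an assumed value


@dataclass
class SignalAcquisitionModule:
    """Records EEG from the electrode cap through the amplifier."""
    sampling_rate_hz: int = 250  # assumed acquisition parameter
    channels: List[str] = field(default_factory=lambda: ["Fz", "Cz", "Pz", "Oz"])


@dataclass
class SignalAnalysisModule:
    """Detects the P300 target and decodes the voice input."""
    classifier_name: str = "LDA"  # assumed; the patent does not name a classifier here


@dataclass
class ControlModule:
    """Turns the analysis result into an on-screen character action."""
    output_device: str = "screen"  # assumed
```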

[0027] (1) System initialization: the user puts on the electrode cap and applies conductive paste; the electrode cap is connected to the amplifier and the amplifier is connected to the computer; the EEG acquisition software is started and its parameters are set; the microphone is connected to the computer;
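
The patent only says that the EEG acquisition software is started and "the parameters" are set. As a rough illustration of what that step might involve, the configuration below lists typical P300 recording and speech-capture parameters; none of the concrete values are taken from the patent.

```python
# Assumed, typical parameters for the initialization step; the patent does not
# specify concrete values, so everything here is an illustrative guess.
eeg_config = {
    "sampling_rate_hz": 250,                            # assumed amplifier sampling rate
    "channels": ["Fz", "Cz", "Pz", "P3", "P4", "Oz"],   # assumed electrode subset
    "bandpass_hz": (0.1, 30.0),                         # assumed band-pass filter for P300 work
    "epoch_window_s": (0.0, 0.8),                       # assumed analysis window after each flash
}

voice_config = {
    "microphone_sample_rate_hz": 16000,                 # assumed, common for speech recognition
    "language": "zh-CN",                                # Chinese input, per the patent's background
}
```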

[0028] (2) Brain signal input: start the stimulation paradigm interface of the evoked stimulation module and begin collecting training data; during the spelling of each character, the P300 buttons flash for n rounds, and in each round each of the 44 P300 buttons flashes once in a random orde...
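
The flashing scheme is concrete enough to sketch directly: while one character is being spelled, every one of the 44 P300 buttons flashes exactly once per round, in a fresh random order, for n rounds. The value of n and the flash timing are not fixed by the excerpt, so they are left as parameters in this sketch.

```python
import random

N_BUTTONS = 44   # number of P300 buttons, per the patent text


def flash_schedule(n_rounds: int, n_buttons: int = N_BUTTONS) -> list:
    """Return the flash order used while spelling one character.

    Each inner list is one round: a random permutation of all button indices,
    so every button flashes exactly once per round, as the text describes.
    """
    schedule = []
    for _ in range(n_rounds):
        order = list(range(n_buttons))
        random.shuffle(order)    # fresh random order for this round
        schedule.append(order)
    return schedule


# Example: with n = 5 rounds (an assumed value), inspect the first three flashes of round 1.
print(flash_schedule(n_rounds=5)[0][:3])
```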


Abstract

The invention discloses a character input method based on the combination of brain signals and voice. The specific steps of the method are as follows. System initialization: the user wears an electrode cap and applies conductive paste, the electrode cap is connected to an amplifier, the amplifier is connected to a computer, the electroencephalogram acquisition software is started and its parameters are set, and the microphone is connected to the computer. Brain signal input: the stimulation paradigm interface of the evoked stimulation module is started and training data are collected; during the spelling of each character, the P300 buttons flash for n rounds, and in each round the 44 P300 buttons each flash once in random order. The method overcomes the defects of existing character input methods in the brain-computer field: by adopting an input scheme that uses both brain signals and voice signals and combining brain-computer interface technology with speech recognition technology, it greatly improves the flexibility of character input and, at the same time, greatly improves character input efficiency.
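
The abstract states that brain-computer interface technology and speech recognition are combined, but the excerpt does not spell out the fusion rule. The sketch below shows one common way to score P300 responses (averaging the epochs recorded over the n rounds and picking the button with the strongest response) next to a placeholder voice-recognition call; the scoring rule, the `recognize_speech` callable and the fallback policy are all assumptions for illustration, not the patent's own algorithm.

```python
import numpy as np


def pick_p300_target(epochs: np.ndarray) -> int:
    """Pick the most likely target button from the recorded P300 responses.

    `epochs` is assumed to have shape (n_rounds, n_buttons, n_samples): one EEG
    segment per button flash per round. Averaging over rounds and scoring each
    button by its mean post-stimulus amplitude is a deliberately simple stand-in
    for whatever classifier the signal analysis module actually uses.
    """
    averaged = epochs.mean(axis=0)      # shape (n_buttons, n_samples)
    scores = averaged.mean(axis=1)      # crude per-button score
    return int(np.argmax(scores))


def input_character(epochs: np.ndarray, recognize_speech) -> str:
    """Hypothetical fusion of the two channels for one character.

    `recognize_speech` stands in for a speech-recognition call returning a
    candidate character (or an empty string). Assumed policy: prefer the spoken
    candidate and fall back to the P300-selected button.
    """
    selected_button = pick_p300_target(epochs)
    spoken = recognize_speech()
    return spoken if spoken else f"<button {selected_button}>"
```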

Description

Technical field
[0001] The invention belongs to the technical field of human-computer interaction, and in particular relates to a character input method based on the combination of brain signals and speech.
Background technique
[0002] At present, society pays more and more attention to disabled people, and how to improve their quality of life has become a hot topic. In order to help disabled patients interact better with the outside world, the existing character input methods were investigated and analyzed as follows:
[0003] The invention with application number 201110269640.9, "A Design Scheme of a Virtual Keyboard for Inputting Chinese Characters Using Brain Waves", and the invention with application number 201710582561.0, "Efficient Brain-Controlled Chinese Input Method Based on Motor Visual Evoked Potentials", both rely solely on brain-computer interface technology to realize character input. For disabled patients, the character input m...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G06K9/00; G10L15/22; G10L15/26
CPC: G06F3/015; G10L15/22; G10L2015/223; G06F2218/08; G06F2218/12; Y02D10/00
Inventors: 李远清, 高天毅, 瞿军
Owner: SOUTH CHINA UNIV OF TECH