
Prosthetic hand control method and device based on facial expression-driven EEG signals

A facial-expression-based control technology applied in the field of intelligent robots. It addresses the limited functionality and low control accuracy of existing prosthetic hands; its effects include adding a thumb side-swing degree of freedom, improving flexibility of use, and ensuring high control precision.

Active Publication Date: 2017-04-26
XI AN JIAOTONG UNIV
Cites: 7 · Cited by: 0

AI Technical Summary

Problems solved by technology

The earliest prosthetic hands in China were mainly decorative and offered little practical functionality, so they could not help disabled users complete basic daily activities.
Decorative prosthetic hands were gradually replaced by mechanical prosthetic hands with simple functions. These hands operate through pre-programmed routines: their functions are limited, they cannot act according to the user's intentions, and they provide no tactile feedback.
With advances in biotechnology, computing, and microelectronics, researchers in China and abroad have recently begun using bioelectrical signals as the control source for prosthetic hands. Most of these prosthetic hands use myoelectric (EMG) signals as the control signal source, but muscle fatigue during prolonged use degrades control accuracy, and such hands are unsuitable for patients who lack muscle control, such as those with amyotrophic lateral sclerosis.
[0003] The brain-controlled prosthetic hand avoids the low control accuracy caused by muscle fatigue in traditional myoelectric prosthetic hands by establishing a direct "external channel" between the brain and the prosthetic hand, using the EEG signal as the control source. However, EEG signals driven by facial expressions have not yet been studied in depth as a control signal source for prosthetic hands.


Examples


Embodiment Construction

[0026] Referring to figure 1 and image 3(b), the prosthetic hand control device of the present invention includes an EEG signal acquisition module 310 placed on the subject's head, preferably a portable 16-channel wireless EEG acquisition device, which collects the EEG signals at electrode positions FC5, FC6, F7, and F8 over the prefrontal cortex under the international standard 10/20 system. The EEG signal acquisition module amplifies and filters the collected EEG signals and sends them to the portable signal processing module 330 through the Bluetooth transmission module 320.
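As an illustration of this acquisition step, the following is a minimal sketch of selecting the four named electrodes and band-pass filtering them before Bluetooth transmission. The sampling rate, filter band, and function names are assumptions for illustration; the patent text only states that the signals are amplified and filtered.

```python
# Minimal sketch of the per-channel pre-processing described in [0026].
# Sampling rate and filter band are assumptions, not values from the patent.
import numpy as np
from scipy.signal import butter, filtfilt

FS_HZ = 256                              # assumed sampling rate of the 16-channel headset
CHANNELS = ["FC5", "FC6", "F7", "F8"]    # 10/20 electrode sites named in the text


def bandpass(eeg, low_hz=1.0, high_hz=45.0, fs=FS_HZ, order=4):
    """Zero-phase band-pass filter applied along the sample axis (assumed band)."""
    b, a = butter(order, [low_hz / (fs / 2), high_hz / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)


def preprocess(raw, channel_index):
    """Select the four frontal channels, then filter them.

    raw: array of shape (n_channels, n_samples) from the acquisition module.
    channel_index: dict mapping channel name -> row index in `raw`.
    """
    rows = [channel_index[ch] for ch in CHANNELS]
    selected = np.asarray(raw)[rows, :]
    return bandpass(selected)
```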

[0027] When the device starts to work, the EEG signal acquisition module collects the EEG signals generated by the subject's facial expressions, amplifies and filters them, and then transmits them to the signal processing module via Bluetooth. The signal processing module performs feature extraction and pattern recognition on the EEG signals, and the recognitio...
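Since the concrete features and classifier are not given in this excerpt, the sketch below is only one common way such a feature-extraction and pattern-recognition step could be realized: band-power features per channel fed into a linear discriminant classifier. The frequency bands, window length, and classifier choice are assumptions, not the patent's disclosed method.

```python
# Hedged sketch of a feature-extraction + pattern-recognition step.
# Band-power features and an LDA classifier are illustrative choices only.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS_HZ = 256
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands


def band_power_features(epoch, fs=FS_HZ):
    """epoch: (n_channels, n_samples) pre-processed EEG window -> 1-D feature vector."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=-1))   # mean power per channel in this band
    return np.concatenate(feats)


# Training on labelled expression epochs (labels 0-3, one per facial expression):
#   X = np.stack([band_power_features(e) for e in training_epochs])
#   clf = LinearDiscriminantAnalysis().fit(X, labels)
# Online recognition of a new window:
#   predicted_class = clf.predict(band_power_features(new_epoch).reshape(1, -1))
```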



Abstract

The invention discloses a prosthetic hand control method and apparatus based on facial-expression-driven EEG signals. The method comprises the following steps: the subject makes four simple facial expressions corresponding to four basic actions of the prosthetic hand; an EEG acquisition module extracts the EEG signals and transmits them to a portable signal processing module through a Bluetooth transmission module; the signal processing module performs feature extraction and pattern recognition on the received EEG signals and transmits the recognition result to a control drive module through a wireless communication module; and the control drive module in the prosthetic arm cylinder drives a motor to complete the target action of the prosthetic hand.
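The abstract's one-to-one mapping from four recognised expressions to four basic hand actions can be sketched as a simple dispatch table. The action names and the send_command interface below are illustrative assumptions: only the thumb side-swing degree of freedom is mentioned elsewhere on this page, and the abstract does not name the four expressions or actions.

```python
# Minimal sketch of the control flow in the abstract: four recognised
# expression classes map one-to-one onto four basic hand actions and are
# forwarded to the control drive module. Names and transport are assumed.
from enum import Enum


class HandAction(Enum):
    OPEN = 0
    CLOSE = 1
    PINCH = 2
    THUMB_SWING = 3   # thumb side-swing degree of freedom mentioned in the summary


# One recognised class index (0-3) per facial expression -> one target action.
EXPRESSION_TO_ACTION = {
    0: HandAction.OPEN,
    1: HandAction.CLOSE,
    2: HandAction.PINCH,
    3: HandAction.THUMB_SWING,
}


def dispatch(recognised_class: int, send_command) -> None:
    """Forward the pattern-recognition result to the control drive module.

    send_command: callable that transmits an action code over the wireless
    link to the motor driver (assumed interface).
    """
    action = EXPRESSION_TO_ACTION[recognised_class]
    send_command(action.value)
```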

Description

Technical field

[0001] The invention relates to the field of intelligent robots, and in particular to a method and a device for controlling an intelligent prosthetic hand with electroencephalogram (EEG) signals.

Background technique

[0002] Due to traffic accidents, industrial injuries, diseases, and other causes, the number of people with upper-limb disabilities is gradually increasing. According to the 2006 survey of disabled persons in China, the total number of disabled people in China is 82.96 million, of which 24.12 million are physically disabled, accounting for 29.07% of the total. The earliest prosthetic hands in China were mainly decorative and offered little practical functionality, so they could not help the disabled complete basic daily activities. Decorative prosthetic hands were gradually replaced by mechanical prosthetic hands with simple functions. These prosthetic hands work through pre-edi...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): A61F2/72
Inventors: 张小栋 (Zhang Xiaodong), 李睿 (Li Rui), 刘畅 (Liu Chang), 陈江城 (Chen Jiangcheng), 田艳举 (Tian Yanju), 张黎明 (Zhang Liming), 陆竹风 (Lu Zhufeng)
Owner: XI AN JIAOTONG UNIV