Institute of Computing Technology, Chinese Academy of Sciences, Institutional Repository (IR)
Parallel Spatial-Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI
Liu, Xiuling1,2; Shen, Yonglong1,2; Liu, Jing2,3,4; Yang, Jianli1,2; Xiong, Peng1,2; Lin, Feng5
2020-12-11
Journal | FRONTIERS IN NEUROSCIENCE
Volume | 14
Pages | 12
Abstract | Motor imagery (MI) electroencephalography (EEG) classification is an important part of the brain-computer interface (BCI), allowing people with mobility problems to communicate with the outside world via assistive devices. However, EEG decoding is a challenging task because of its complexity, dynamic nature, and low signal-to-noise ratio. Designing an end-to-end framework that fully extracts the high-level features of EEG signals remains a challenge. In this study, we present a parallel spatial-temporal self-attention-based convolutional neural network for four-class MI EEG signal classification. This study is the first to define a new spatial-temporal representation of raw EEG signals that uses the self-attention mechanism to extract distinguishable spatial-temporal features. Specifically, we use the spatial self-attention module to capture the spatial dependencies between the channels of MI EEG signals. This module updates each channel by aggregating features over all channels with a weighted summation, thus improving the classification accuracy and eliminating the artifacts caused by manual channel selection. Furthermore, the temporal self-attention module encodes the global temporal information into features for each sampling time step, so that the high-level temporal features of the MI EEG signals can be extracted in the time domain. Quantitative analysis shows that our method outperforms state-of-the-art methods for intra-subject and inter-subject classification, demonstrating its robustness and effectiveness. In terms of qualitative analysis, we perform a visual inspection of the new spatial-temporal representation estimated from the learned architecture. Finally, the proposed method is employed to realize control of drones based on EEG signals, verifying its feasibility in real-time applications.
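The spatial self-attention module described in the abstract updates each EEG channel by a weighted summation over all channels. A minimal illustrative sketch of that channel-wise attention idea follows; it is not the authors' implementation, and the function name, channel count, and feature size are assumptions chosen for the example.

```python
import numpy as np

def channel_self_attention(X):
    """Illustrative channel-wise self-attention.

    X: array of shape (channels, features), one feature vector per EEG channel.
    Returns an array of the same shape, where each channel is replaced by a
    weighted sum over all channels (softmax-normalized similarity weights).
    """
    # Pairwise channel similarities, scaled as in standard dot-product attention.
    scores = X @ X.T / np.sqrt(X.shape[1])
    # Softmax over the channel axis (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    # Each output channel aggregates features over all channels.
    return weights @ X

rng = np.random.default_rng(0)
eeg = rng.standard_normal((22, 64))  # e.g. 22 channels, 64 features per channel
out = channel_self_attention(eeg)
print(out.shape)  # (22, 64)
```

The same pattern applied along the sampling-time axis instead of the channel axis corresponds to the temporal self-attention module the abstract describes.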
Keywords | motor imagery EEG; BCI; spatial-temporal self-attention; deep learning
DOI | 10.3389/fnins.2020.587520
Indexed in | SCI
Language | English
Funding | National Natural Science Foundation of China [61802109]; National Natural Science Foundation of China [61673158]; National Natural Science Foundation of China [61703133]; Natural Science Foundation of Hebei Province [F2020205006]; Natural Science Foundation of Hebei Province [F2018201070]; Top Youth Talents of Science and Technology Research Project in Hebei Province [BJ2020059]; Youth Talent Support Program of Hebei Province [BJ2019044]; Science Foundation of Hebei Normal University [L2018K02]
WOS Research Area | Neurosciences & Neurology
WOS Category | Neurosciences
WOS Accession Number | WOS:000601597400001
Publisher | FRONTIERS MEDIA SA
Document Type | Journal article
Identifier | http://119.78.100.204/handle/2XEOYT63/16572
Collection | Journal Papers of the Institute of Computing Technology, CAS (English)
Corresponding Author | Liu, Jing
Affiliations | 1. Hebei Univ, Coll Elect Informat Engn, Baoding, Peoples R China; 2. Hebei Univ, Key Lab Digital Med Engn Hebei Prov, Baoding, Peoples R China; 3. Hebei Normal Univ, Coll Comp & Cyber Secur, Shijiazhuang, Hebei, Peoples R China; 4. Chinese Acad Sci, Inst Comp Technol, Beijing Key Lab Mobile Comp & Pervas Device, Beijing, Peoples R China; 5. Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore, Singapore
Recommended Citation (GB/T 7714) | Liu, Xiuling, Shen, Yonglong, Liu, Jing, et al. Parallel Spatial-Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI[J]. FRONTIERS IN NEUROSCIENCE, 2020, 14: 12.
APA | Liu, Xiuling, Shen, Yonglong, Liu, Jing, Yang, Jianli, Xiong, Peng, & Lin, Feng. (2020). Parallel Spatial-Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI. FRONTIERS IN NEUROSCIENCE, 14, 12.
MLA | Liu, Xiuling, et al. "Parallel Spatial-Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI". FRONTIERS IN NEUROSCIENCE 14 (2020): 12.
Files in This Item | No files associated with this item.