EEGDnet: Fusing non-local and local self-similarity for EEG signal denoising with transformer
Pu, Xiaorong1; Yi, Peng2; Chen, Kecheng4; Ma, Zhaoqi2; Zhao, Di3; Ren, Yazhou1
2022-12-01
Journal: COMPUTERS IN BIOLOGY AND MEDICINE
ISSN: 0010-4825
Volume: 151, Pages: 8
Abstract: Electroencephalography (EEG) has proven to be a useful approach for producing a brain-computer interface (BCI). However, the one-dimensional (1-D) EEG signal is easily disturbed by certain artifacts (a.k.a. noise) owing to its high temporal resolution, so it is crucial to remove the noise from the received EEG signal. Recently, deep learning-based EEG signal denoising approaches have achieved impressive performance compared with traditional ones. It is well known that the self-similarity characteristics (both non-local and local) of data (e.g., natural images and time-domain signals) are widely leveraged for denoising. However, existing deep learning-based EEG signal denoising methods ignore either the non-local self-similarity (e.g., 1-D convolutional neural networks) or the local one (e.g., fully connected networks and recurrent neural networks). To address this issue, we propose a novel 1-D EEG signal denoising network with a 2-D transformer, namely EEGDnet. Specifically, we comprehensively take into account the non-local and local self-similarity of the EEG signal through the transformer module. By fusing non-local self-similarity in self-attention blocks and local self-similarity in feed-forward blocks, the negative impact caused by noise and outliers can be reduced significantly. Extensive experiments show that, compared with other state-of-the-art models, EEGDnet achieves much better performance in terms of both quantitative and qualitative metrics. Specifically, EEGDnet achieves 18% and 11% improvements in correlation coefficients when removing ocular artifacts and muscle artifacts, respectively.
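The abstract describes the architecture only at a high level. The sketch below illustrates the general idea of a 2-D transformer denoiser for 1-D EEG segments: the signal is reshaped into a matrix of patches so that self-attention can exploit non-local self-similarity across patches while the feed-forward blocks act locally. This is a minimal, assumption-laden illustration, not the authors' released EEGDnet code; the class name, segment/patch lengths, and all hyperparameters below are hypothetical.

```python
# Minimal sketch (assumed, not the authors' implementation): reshape a 1-D EEG
# segment into a 2-D patch matrix, run transformer encoder layers over the patches
# (self-attention ~ non-local self-similarity, feed-forward ~ local self-similarity),
# then project back to a denoised 1-D signal.
import torch
import torch.nn as nn

class TransformerDenoiser1D(nn.Module):
    def __init__(self, segment_len=512, patch_len=64, d_model=128, n_heads=4, n_layers=4):
        super().__init__()
        assert segment_len % patch_len == 0
        self.patch_len = patch_len
        self.n_patches = segment_len // patch_len
        self.embed = nn.Linear(patch_len, d_model)            # per-patch embedding
        self.pos = nn.Parameter(torch.zeros(1, self.n_patches, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.proj = nn.Linear(d_model, patch_len)             # back to sample space

    def forward(self, x):                                     # x: (batch, segment_len)
        b = x.shape[0]
        patches = x.reshape(b, self.n_patches, self.patch_len)  # 1-D -> 2-D patch matrix
        tokens = self.embed(patches) + self.pos
        tokens = self.encoder(tokens)                         # attention mixes all patches
        return self.proj(tokens).reshape(b, -1)               # flatten back to 1-D

# Usage sketch: denoise a batch of noisy EEG segments
# model = TransformerDenoiser1D()
# denoised = model(torch.randn(8, 512))
```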
Keywords: Electroencephalography; Artifact removal; Transformer
DOI: 10.1016/j.compbiomed.2022.106248
Indexed In: SCI
Language: English
Funding: Open Foundation of Nuclear Medicine Laboratory of Mianyang Central Hospital [2021HYX017]; Sichuan Science and Technology Program [2021YFS0172]; Sichuan Science and Technology Program [2022YFS0047]; Sichuan Science and Technology Program [2022YFS0055]; Clinical Research Incubation Project, West China Hospital, Sichuan University [2021HXFH004]; Medico-Engineering Cooperation Funds from University of Electronic Science and Technology of China [ZYGX2021YGLH022]
WOS Research Areas: Life Sciences & Biomedicine - Other Topics; Computer Science; Engineering; Mathematical & Computational Biology
WOS Categories: Biology; Computer Science, Interdisciplinary Applications; Engineering, Biomedical; Mathematical & Computational Biology
WOS Record No.: WOS:000900186300008
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
Citation Statistics
Times Cited: 14 [WOS]
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/20131
Collection: Journal Papers, Institute of Computing Technology, Chinese Academy of Sciences
Corresponding Authors: Chen, Kecheng; Ren, Yazhou
Author Affiliations:
1.Univ Elect Sci & Technol China UESTC, Sch Comp Sci & Engn, Chengdu 611731, Peoples R China
2.Univ Elect Sci & Technol China UESTC, Sch Informat & Commun Engn, Chengdu 611731, Peoples R China
3.Chinese Acad Sci, Inst Comp Technol, Beijing 100080, Peoples R China
4.City Univ Hong Kong, Dept Elect Engn, Hong Kong 999077, Peoples R China
Recommended Citation
GB/T 7714
Pu, Xiaorong, Yi, Peng, Chen, Kecheng, et al. EEGDnet: Fusing non-local and local self-similarity for EEG signal denoising with transformer[J]. COMPUTERS IN BIOLOGY AND MEDICINE, 2022, 151: 8.
APA: Pu, Xiaorong, Yi, Peng, Chen, Kecheng, Ma, Zhaoqi, Zhao, Di, & Ren, Yazhou. (2022). EEGDnet: Fusing non-local and local self-similarity for EEG signal denoising with transformer. COMPUTERS IN BIOLOGY AND MEDICINE, 151, 8.
MLA: Pu, Xiaorong, et al. "EEGDnet: Fusing non-local and local self-similarity for EEG signal denoising with transformer". COMPUTERS IN BIOLOGY AND MEDICINE 151 (2022): 8.
Files in This Item:
No related files.