Institute of Computing Technology, Chinese Academy of Sciences IR
Exploring Winograd Convolution for Cost-Effective Neural Network Fault Tolerance
Xue, Xinghua1,2; Liu, Cheng1,2; Liu, Bo3; Huang, Haitong1,2; Wang, Ying1,2; Luo, Tao4; Zhang, Lei1,2; Li, Huawei1,2; Li, Xiaowei1,2
2023-11-01
Journal | IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS |
ISSN | 1063-8210 |
Volume | 31 |
Issue | 11 |
Pages | 1763-1773 |
Abstract | Winograd convolution is widely used to improve convolution performance and computational efficiency because it reduces the number of multiplication operations, but the reliability issues it introduces are usually overlooked. In this work, we observe that Winograd convolution (WG-Conv) has great potential for improving neural network (NN) fault tolerance. Based on this observation, we evaluate WG-Conv fault tolerance comprehensively, for the first time, at different granularities ranging from models and layers to operation types. We then explore the use of the inherent fault tolerance of WG-Conv for cost-effective NN protection against soft errors. Specifically, we investigate how WG-Conv can be effectively incorporated with classical fault-tolerant design approaches, including triple modular redundancy (TMR), fault-aware retraining, and constrained activation functions. According to our experiments, WG-Conv can reduce the fault-tolerant design overhead by 55.77% on average without any accuracy loss compared to standard convolution (ST-Conv), and can further reduce the computing overhead by 17.24% when the inherent fault tolerance of WG-Conv is taken into account. When applied to fault-tolerant NNs enhanced with fault-aware retraining and constrained activation functions, the resulting models generally show significant accuracy improvements in the presence of various faults. |
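To make the multiplication-saving idea behind the abstract concrete, the following is a minimal, hypothetical Python sketch (not taken from the paper) of the 1-D Winograd F(2,3) algorithm: it produces two outputs of a 3-tap convolution with 4 multiplications instead of the 6 used by standard convolution. The 2-D WG-Conv used in NN inference applies the same transform to tiles of the input feature map; the paper's fault-tolerance analysis and its combination with TMR, fault-aware retraining, and constrained activation functions are not reproduced here.

    # Hypothetical illustration only: 1-D Winograd F(2,3) minimal filtering.
    # It computes 2 outputs of a 3-tap convolution with 4 multiplications,
    # versus 6 multiplications for the standard (direct) form.
    def winograd_f23(d, g):
        """d: 4 input samples, g: 3 filter taps -> 2 outputs."""
        m1 = (d[0] - d[2]) * g[0]
        m2 = (d[1] + d[2]) * (g[0] + g[1] + g[2]) / 2
        m3 = (d[2] - d[1]) * (g[0] - g[1] + g[2]) / 2
        m4 = (d[1] - d[3]) * g[2]
        return [m1 + m2 + m3, m2 - m3 - m4]

    def direct_conv(d, g):
        """Reference standard convolution (ST-Conv) with 6 multiplications."""
        return [d[0] * g[0] + d[1] * g[1] + d[2] * g[2],
                d[1] * g[0] + d[2] * g[1] + d[3] * g[2]]

    # Both forms produce identical results on the same inputs.
    assert winograd_f23([1, 2, 3, 4], [0.5, -1, 2]) == direct_conv([1, 2, 3, 4], [0.5, -1, 2])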
Keywords | Fault tolerant systems; Fault tolerance; Artificial neural networks; Convolution; Reliability; Computational modeling; Neurons; Fault-tolerance; soft errors; vulnerability analysis; Winograd convolution (WG-Conv) |
DOI | 10.1109/TVLSI.2023.3306894 |
Indexed By | SCI |
Language | English |
Funding Project | National Natural Science Foundation of China [62174162]; Space Trusted Computing and Electronic Information Technology Laboratory of Beijing Institute of Control Engineering (BICE) [OBCandETL-2022-07] |
WOS Research Area | Computer Science; Engineering |
WOS Subject | Computer Science, Hardware & Architecture; Engineering, Electrical & Electronic |
WOS ID | WOS:001179765700002 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
Document Type | Journal article |
Identifier | http://119.78.100.204/handle/2XEOYT63/38821 |
Collection | Institute of Computing Technology, CAS: Journal Papers (English) |
Corresponding Author | Liu, Cheng |
Affiliation | 1. Chinese Acad Sci, Inst Comp Technol, State Key Lab Processors, Beijing 100190, Peoples R China; 2. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100190, Peoples R China; 3. Beijing Inst Control Engn, Beijing 100190, Peoples R China; 4. ASTAR, Inst High Performance Comp, Singapore 138632, Singapore |
Recommended Citation (GB/T 7714) | Xue, Xinghua, Liu, Cheng, Liu, Bo, et al. Exploring Winograd Convolution for Cost-Effective Neural Network Fault Tolerance[J]. IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 2023, 31(11): 1763-1773. |
APA | Xue, Xinghua., Liu, Cheng., Liu, Bo., Huang, Haitong., Wang, Ying., ... & Li, Xiaowei. (2023). Exploring Winograd Convolution for Cost-Effective Neural Network Fault Tolerance. IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 31(11), 1763-1773. |
MLA | Xue, Xinghua, et al. "Exploring Winograd Convolution for Cost-Effective Neural Network Fault Tolerance". IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS 31.11 (2023): 1763-1773. |
Files in This Item | There are no files associated with this item. |