Institute of Computing Technology, Chinese Academy of Sciences - Institutional Repository
STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator
Song, Lili1,2; Wang, Ying1,2; Han, Yinhe1,2; Li, Huawei1,2; Cheng, Yuanqing1,2; Li, Xiaowei1,2
2017-04-01
Journal | IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS |
ISSN | 1063-8210 |
Volume | 25 |
Issue | 4 |
Pages | 1285-1296 |
Abstract | Multilevel spin torque transfer RAM (STT-RAM) is a suitable storage device for energy-efficient neural network accelerators (NNAs), which rely on large-capacity on-chip memory to support brain-inspired large-scale learning models, from conventional artificial neural networks to today's popular deep convolutional neural networks. In this paper, we investigate the application of multilevel STT-RAM to general-purpose NNAs. First, the error-resilience feature of neural networks is leveraged to tolerate the read/write reliability issues of multilevel-cell STT-RAM through approximate computing. The read/write failures induced at the expense of higher storage density can be effectively masked by a wide spectrum of NN applications with intrinsic forgiveness. Second, we present a precision-tunable STT-RAM buffer for the popular general-purpose NNA. The targeted STT-RAM memory design is able to switch among multiple working modes and adapt to the varying quality constraints of approximate applications. Lastly, the reconfigurable STT-RAM buffer not only enables precision scaling in the NNA but also adapts to the demands of different learning models with distinct working-set sizes. In particular, we demonstrate the concept of capacity/precision-tunable STT-RAM memory on the emerging reconfigurable deep NNA and elaborate on the data mapping and storage-mode switching policy in STT-RAM memory that achieves the best energy efficiency of approximate computing. (An illustrative sketch of the mode-switching idea appears after this record.) |
Keywords | Approximate computing; machine learning; neural network; spin torque transfer RAM (STT-RAM) |
DOI | 10.1109/TVLSI.2016.2644279 |
Indexed By | SCI |
Language | English |
Funding Project | National Natural Science Foundation of China[61504153] ; National Natural Science Foundation of China[61522406] ; National Natural Science Foundation of China[61532017] ; National Natural Science Foundation of China[61521092] |
WOS Research Area | Computer Science ; Engineering |
WOS Subject | Computer Science, Hardware & Architecture ; Engineering, Electrical & Electronic |
WOS ID | WOS:000398858800009 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
Document Type | Journal article |
Identifier | http://119.78.100.204/handle/2XEOYT63/7281 |
Collection | Institute of Computing Technology, Chinese Academy of Sciences: Journal Papers (English) |
Corresponding Author | Wang, Ying |
Affiliation | 1. Chinese Acad Sci, Inst Comp Technol, State Key Lab Comp Architecture, Beijing 100190, Peoples R China; 2. Univ Chinese Acad Sci, Beijing 100190, Peoples R China |
Recommended Citation (GB/T 7714) | Song, Lili, Wang, Ying, Han, Yinhe, et al. STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator[J]. IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 2017, 25(4): 1285-1296. |
APA | Song, Lili, Wang, Ying, Han, Yinhe, Li, Huawei, Cheng, Yuanqing, & Li, Xiaowei. (2017). STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator. IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 25(4), 1285-1296. |
MLA | Song, Lili, et al. "STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator". IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS 25.4 (2017): 1285-1296. |
Files in This Item | No files are associated with this item. |
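The abstract above describes a buffer that trades reliability for storage density and switches storage modes to match each model's working-set size and quality budget. The minimal Python sketch below illustrates that general idea only; the mode names, capacities, bit-error rates, and the choose_mode policy are hypothetical placeholders, not values or algorithms taken from the paper.

```python
import random

# Illustrative model only: mode names, capacities, and error rates are
# hypothetical placeholders, not figures from the paper.
MODES = {
    "precise": {"capacity_kb": 128, "bit_error_rate": 0.0},   # SLC-like, reliable mode
    "dense":   {"capacity_kb": 256, "bit_error_rate": 1e-4},  # MLC-like, approximate mode
}

class PrecisionTunableBuffer:
    """Toy model of a buffer that trades storage density for reliability."""

    def __init__(self, mode="precise"):
        self.set_mode(mode)

    def set_mode(self, mode):
        # Switching modes changes both the usable capacity and the
        # probability that a stored bit is read back flipped.
        self.mode = mode
        self.capacity_kb = MODES[mode]["capacity_kb"]
        self.ber = MODES[mode]["bit_error_rate"]

    def write_word(self, word, width=16):
        # In the dense (multilevel) mode each bit may flip with probability
        # `ber`; error-resilient NN workloads are assumed to absorb this noise.
        noisy = 0
        for i in range(width):
            bit = (word >> i) & 1
            if random.random() < self.ber:
                bit ^= 1
            noisy |= bit << i
        return noisy

def choose_mode(working_set_kb, quality_loss_budget):
    """Hypothetical switching policy: use the dense mode only when the
    working set does not fit the precise mode and some quality loss
    is acceptable."""
    if working_set_kb > MODES["precise"]["capacity_kb"] and quality_loss_budget > 0:
        return "dense"
    return "precise"

buf = PrecisionTunableBuffer(choose_mode(working_set_kb=200, quality_loss_budget=0.01))
print(buf.mode, hex(buf.write_word(0x3A7C)))
```

Running the snippet selects the denser mode because the assumed 200 kB working set does not fit the precise mode and a small quality loss is allowed; a model that fits, or a zero quality budget, would keep the buffer in the reliable mode.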