STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator
Song, Lili1,2; Wang, Ying1,2; Han, Yinhe1,2; Li, Huawei1,2; Cheng, Yuanqing1,2; Li, Xiaowei1,2
2017-04-01
Journal: IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS
ISSN: 1063-8210
Volume: 25, Issue: 4, Pages: 1285-1296
Abstract: Multilevel spin torque transfer RAM (STT-RAM) is a suitable storage device for energy-efficient neural network accelerators (NNAs), which rely on large-capacity on-chip memory to support brain-inspired large-scale learning models, from conventional artificial neural networks to today's popular deep convolutional neural networks. In this paper, we investigate the application of multilevel STT-RAM to general-purpose NNAs. First, the error-resilience of neural networks is leveraged to tolerate the read/write reliability issues of multilevel-cell STT-RAM through approximate computing: the read/write failures induced in exchange for higher storage density can be effectively masked by a wide spectrum of NN applications with intrinsic forgiveness. Second, we present a precision-tunable STT-RAM buffer for the popular general-purpose NNA. The proposed STT-RAM memory design can transform between multiple working modes and adapt to the varying quality constraints of approximate applications. Finally, the reconfigurable STT-RAM buffer not only enables precision scaling in the NNA but also adapts to the demands of different learning models with distinct working-set sizes. In particular, we demonstrate the concept of capacity/precision-tunable STT-RAM memory on an emerging reconfigurable deep NNA and elaborate the data mapping and storage-mode switching policy that achieves the best energy efficiency of approximate computing.
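The abstract's core idea, switching an MLC STT-RAM buffer between a dense but error-prone multilevel mode and a reliable single-level mode according to an application's quality constraint and working-set size, can be illustrated with a small sketch. This is not the paper's implementation; all names, error rates, and the decision rule below are hypothetical, chosen only to show the shape of such a mode-switching policy.

```python
# Illustrative sketch (hypothetical, not the paper's design): pick an
# MLC-vs-SLC storage mode for an NN accelerator buffer based on the
# application's tolerable bit-error rate and its working-set size.
from dataclasses import dataclass


@dataclass
class StorageMode:
    name: str
    bits_per_cell: int      # MLC stores 2 bits/cell, SLC stores 1
    bit_error_rate: float   # MLC reads/writes fail more often


SLC = StorageMode("SLC", bits_per_cell=1, bit_error_rate=1e-9)
MLC = StorageMode("MLC", bits_per_cell=2, bit_error_rate=1e-4)


def choose_mode(working_set_bits: int, capacity_cells: int,
                max_tolerable_ber: float) -> StorageMode:
    """Prefer MLC (double capacity) when the NN application's intrinsic
    error tolerance permits it, or when SLC capacity cannot hold the
    working set; otherwise fall back to the reliable SLC mode."""
    if MLC.bit_error_rate <= max_tolerable_ber:
        return MLC  # error-resilient model: take the density
    if working_set_bits > capacity_cells * SLC.bits_per_cell:
        return MLC  # working set does not fit in SLC mode
    return SLC


# Example: an error-tolerant CNN layer accepting a BER up to 1e-3
# gets the dense MLC mode.
print(choose_mode(working_set_bits=1_000_000, capacity_cells=800_000,
                  max_tolerable_ber=1e-3).name)
```

A real policy would also weigh the energy cost of mode transitions and the data remapping they require, which the paper addresses with its storage-mode switching policy.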
Keywords: approximate computing; machine learning; neural network; spin torque transfer RAM (STT-RAM)
DOI: 10.1109/TVLSI.2016.2644279
Indexed by: SCI
Language: English
Funding: National Natural Science Foundation of China [61504153]; National Natural Science Foundation of China [61522406]; National Natural Science Foundation of China [61532017]; National Natural Science Foundation of China [61521092]
WoS Research Areas: Computer Science; Engineering
WoS Categories: Computer Science, Hardware & Architecture; Engineering, Electrical & Electronic
WoS Accession Number: WOS:000398858800009
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Citation Count: 10 [WoS]
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/7281
Collection: Journal Papers of the Institute of Computing Technology, Chinese Academy of Sciences (English)
Corresponding Author: Wang, Ying
Affiliations: 1. Chinese Acad Sci, Inst Comp Technol, State Key Lab Comp Architecture, Beijing 100190, Peoples R China
2. Univ Chinese Acad Sci, Beijing 100190, Peoples R China
Recommended Citation:
GB/T 7714: Song, Lili, Wang, Ying, Han, Yinhe, et al. STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator[J]. IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 2017, 25(4): 1285-1296.
APA: Song, Lili, Wang, Ying, Han, Yinhe, Li, Huawei, Cheng, Yuanqing, & Li, Xiaowei. (2017). STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator. IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 25(4), 1285-1296.
MLA: Song, Lili, et al. "STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator". IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS 25.4 (2017): 1285-1296.
Files in This Item: No files associated with this item.