Balanced Quantization: An Effective and Efficient Approach to Quantized Neural Networks
Zhou, Shu-Chang1,2,3; Wang, Yu-Zhi3,4; Wen, He3,5; He, Qin-Yao3,5; Zou, Yu-Heng5,6
Publication Date: 2017-07-01
Journal: JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY
ISSN: 1000-9000
Volume: 32, Issue: 4, Pages: 667-682
Abstract: Quantized neural networks (QNNs), which use low-bitwidth numbers to represent parameters and perform computations, have been proposed to reduce computation complexity, storage size, and memory usage. In QNNs, parameters and activations are uniformly quantized so that multiplications and additions can be accelerated by bitwise operations. However, the distributions of parameters in neural networks are often imbalanced, so a uniform quantization determined from extremal values may underutilize the available bitwidth. In this paper, we propose a novel quantization method that ensures a balanced distribution of quantized values. Our method first recursively partitions the parameters by percentiles into balanced bins and then applies uniform quantization. We also introduce computationally cheaper approximations of percentiles to reduce the overhead of the partitioning step. Overall, our method improves the prediction accuracy of QNNs without introducing extra computation during inference, has negligible impact on training speed, and is applicable to both convolutional neural networks and recurrent neural networks. Experiments on standard datasets, including ImageNet and Penn Treebank, confirm the effectiveness of our method. On ImageNet, the top-5 error rate of our 4-bit quantized GoogLeNet model is 12.7%, which surpasses the state of the art for QNNs.
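As an illustration of the balanced quantization idea described in the abstract, the minimal NumPy sketch below quantizes an array by computing percentile cut points that place an equal share of values in each bin and then mapping every bin to an evenly spaced level in [0, 1]. This is a simplified reading of the abstract, not the algorithm published in the paper: the function name balanced_quantize, the non-recursive percentile computation, and the [0, 1] output range are assumptions made here for illustration.

    import numpy as np

    def balanced_quantize(x, bitwidth=2):
        # Hypothetical sketch (not the paper's code): percentile-based
        # "balanced" quantization of array x into 2^bitwidth levels.
        levels = 2 ** bitwidth
        flat = x.ravel()
        # Cut points that put roughly the same number of values in every bin;
        # the paper describes a recursive percentile-based partition, while
        # here the cut points are computed directly for brevity.
        cuts = np.percentile(flat, np.linspace(0, 100, levels + 1)[1:-1])
        # Bin index of each value, from 0 to levels - 1.
        bins = np.digitize(flat, cuts)
        # Evenly spaced quantized levels, e.g. {0, 1/3, 2/3, 1} for 2 bits.
        q = np.linspace(0.0, 1.0, levels)
        return q[bins].reshape(x.shape)

    # Usage: weights drawn from an imbalanced (heavy-tailed) distribution are
    # spread roughly evenly over all four 2-bit levels after quantization.
    w = np.random.laplace(size=(64, 64))
    print(np.unique(balanced_quantize(w, bitwidth=2), return_counts=True))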
Keywords: quantized neural network; percentile; histogram equalization; uniform quantization
DOI: 10.1007/s11390-017-1750-y
Indexed By: SCI
Language: English
WOS Research Area: Computer Science
WOS Categories: Computer Science, Hardware & Architecture; Computer Science, Software Engineering
WOS Accession Number: WOS:000405580700002
Publisher: SCIENCE PRESS
Citation Statistics: Times Cited (WOS): 60
Document Type: Journal Article
Identifier: http://119.78.100.204/handle/2XEOYT63/7021
Collection: Journal Papers of the Institute of Computing Technology, Chinese Academy of Sciences (English)
Corresponding Author: Zhou, Shu-Chang
作者单位1.Univ Chinese Acad Sci, Beijing 100049, Peoples R China
2.Chinese Acad Sci, Inst Comp Technol, State Key Lab Comp Architecture, Beijing 100190, Peoples R China
3.Megvii Inc, Beijing 100190, Peoples R China
4.Tsinghua Univ, Dept Elect Engn, Beijing 100084, Peoples R China
5.Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
6.Peking Univ, Sch Elect Engn & Comp Sci, Beijing 100871, Peoples R China
Recommended Citation:
GB/T 7714: Zhou, Shu-Chang, Wang, Yu-Zhi, Wen, He, et al. Balanced Quantization: An Effective and Efficient Approach to Quantized Neural Networks[J]. JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2017, 32(4): 667-682.
APA: Zhou, Shu-Chang, Wang, Yu-Zhi, Wen, He, He, Qin-Yao, & Zou, Yu-Heng. (2017). Balanced Quantization: An Effective and Efficient Approach to Quantized Neural Networks. JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 32(4), 667-682.
MLA: Zhou, Shu-Chang, et al. "Balanced Quantization: An Effective and Efficient Approach to Quantized Neural Networks". JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY 32.4 (2017): 667-682.
Files in This Item:
No files are associated with this item.