Neural Network Pruning by Recurrent Weights for Finance Market
Pei, Songwen1,2; Wu, Yusheng3; Guo, Jin4; Qiu, Meikang5
2022-08-01
Journal: ACM TRANSACTIONS ON INTERNET TECHNOLOGY
ISSN: 1533-5399
Volume: 22, Issue: 3, Pages: 23
Abstract: Convolutional Neural Networks (CNNs) and deep learning technology are applied in the current financial market to rapidly promote the development of the finance market and the Internet economy. The continuous development of neural networks with more hidden layers improves performance but increases computational complexity. Generally, channel pruning methods are useful for compacting neural networks. However, typical channel pruning methods may remove layers by mistake because of a manually set, static pruning ratio, which can destroy the overall structure of the network. It is difficult to improve the compression ratio of neural networks by pruning channels alone while maintaining a good network structure. Therefore, we propose Neural Network Pruning by Recurrent Weights (NPRW), which repeatedly evaluates the significance of weights and adaptively adjusts them to compress neural networks within an acceptable loss of accuracy. Recurrent weights with low sensitivity are forced to zero according to the magnitude of the weights, so the pruned network uses only a few significant weights. We then add regularization to the scaling factors of the network, under which recurrent weights with high sensitivity are dynamically updated while weights with low sensitivity remain at zero. In this way, the significance of channels can be quantitatively evaluated by the recurrent weights. The method has been verified with the typical neural networks LeNet, VGGNet, and ResNet on multiple benchmark datasets involving stock index futures, digit recognition, and image classification. The pruned LeNet-5 achieves a 58.9% reduction in parameters with a 0.29% loss of total accuracy on Shanghai and Shenzhen 300 stock index futures. On CIFAR-10, the pruned VGG-19 reduces FLOPs by more than 50% with a decrease in accuracy of less than 0.5%. In addition, the pruned ResNet-164 tested on SVHN reduces FLOPs by more than 58% with a relative improvement in accuracy of 0.11%.
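The two steps named in the abstract, magnitude-based zeroing of low-sensitivity weights and a sparsity-inducing penalty on channel scaling factors, can be illustrated with a minimal PyTorch-style sketch. The function names, the prune_ratio parameter, and the use of BatchNorm gamma as the per-channel scaling factor are illustrative assumptions, not the authors' implementation.

# Minimal sketch, assuming magnitude-based masking of Conv2d weights
# and an L1 penalty on BatchNorm scaling factors as channel scores.
import torch
import torch.nn as nn

def prune_by_magnitude(model: nn.Module, prune_ratio: float) -> dict:
    """Zero the smallest-magnitude weights in each Conv2d layer and
    return boolean masks so pruned weights can be kept at zero later."""
    masks = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d):
            w = module.weight.data
            k = int(prune_ratio * w.numel())
            if k == 0:
                continue
            # k-th smallest absolute value serves as the pruning threshold
            threshold = w.abs().flatten().kthvalue(k).values
            mask = (w.abs() > threshold).float()
            module.weight.data.mul_(mask)  # low-sensitivity weights -> 0
            masks[name] = mask
    return masks

def l1_penalty_on_scaling_factors(model: nn.Module, lam: float = 1e-4) -> torch.Tensor:
    """L1 regularization on BatchNorm scaling factors (gamma), used here
    as a proxy for per-channel significance."""
    penalty = torch.zeros(())
    for module in model.modules():
        if isinstance(module, nn.BatchNorm2d):
            penalty = penalty + module.weight.abs().sum()
    return lam * penalty

In such a setup, the penalty would be added to the task loss during fine-tuning, and the stored masks would be re-applied after each optimizer step so that pruned weights stay at zero while significant weights continue to be updated.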
Keywords: Channel pruning; neural networks; recurrent weights; finance market
DOI: 10.1145/3433547
Indexed By: SCI
Language: English
Funding: National Natural Science Foundation of China [61975124]; Shanghai Natural Science Foundation [20ZR1438500]; Open Project Program of Shanghai Key Laboratory of Data Science [2020090600003]; State Key Lab of Computer Architecture, ICT, CAS [CARCHA202111]
WOS Research Area: Computer Science
WOS Categories: Computer Science, Information Systems; Computer Science, Software Engineering
WOS Accession Number: WOS:000844323400003
Publisher: ASSOC COMPUTING MACHINERY
Citations (WOS): 3
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/19439
Collection: Journal Papers (English), Institute of Computing Technology, Chinese Academy of Sciences
Corresponding Author: Pei, Songwen
Affiliations:
1. Univ Shanghai Sci & Technol, Shanghai 200093, Peoples R China
2. Chinese Acad Sci, ICT, Beijing 100190, Peoples R China
3. Univ Shanghai Sci & Technol, Dept Comp Sci & Engn, Shanghai 200093, Peoples R China
4. Peking Univ, Sch Econ, Beijing 100871, Peoples R China
5. Texas A&M Univ, Dept Comp Sci, Commerce, TX 75428 USA
Recommended Citation:
GB/T 7714: Pei, Songwen, Wu, Yusheng, Guo, Jin, et al. Neural Network Pruning by Recurrent Weights for Finance Market[J]. ACM TRANSACTIONS ON INTERNET TECHNOLOGY, 2022, 22(3): 23.
APA: Pei, Songwen, Wu, Yusheng, Guo, Jin, & Qiu, Meikang. (2022). Neural Network Pruning by Recurrent Weights for Finance Market. ACM TRANSACTIONS ON INTERNET TECHNOLOGY, 22(3), 23.
MLA: Pei, Songwen, et al. "Neural Network Pruning by Recurrent Weights for Finance Market". ACM TRANSACTIONS ON INTERNET TECHNOLOGY 22.3 (2022): 23.
Files in This Item: No files associated with this item.