Institute of Computing Technology, Chinese Academy of Sciences IR
Neural Network Pruning by Recurrent Weights for Finance Market
Pei, Songwen1,2; Wu, Yusheng3; Guo, Jin4; Qiu, Meikang5
2022-08-01
Journal | ACM TRANSACTIONS ON INTERNET TECHNOLOGY
ISSN | 1533-5399
Volume | 22
Issue | 3
Pages | 23
Abstract | Convolutional Neural Networks (CNNs) and deep learning technology are applied in the current financial market to rapidly promote the development of the finance market and the Internet economy. The continuous development of neural networks with more hidden layers improves performance but increases computational complexity. Generally, channel pruning methods are useful for compacting neural networks. However, typical channel pruning methods may remove layers by mistake because of manually set, static pruning ratios, which can destroy the overall structure of the network. It is difficult to improve the compression ratio of neural networks by pruning channels alone while maintaining a good network structure. Therefore, we propose a novel method, neural Networks Pruning by Recurrent Weights (NPRW), which repeatedly evaluates the significance of weights and adaptively adjusts them to compress neural networks within an acceptable loss of accuracy. Recurrent weights with low sensitivity are compulsorily set to zero by evaluating the magnitude of the weights, so the pruned network uses only a few significant weights. We then add regularization to the scaling factors of the network, so that recurrent weights with high sensitivity can be dynamically updated while weights with low sensitivity remain at zero. In this way, the significance of channels can be quantitatively evaluated by recurrent weights. The method has been verified with typical neural networks (LeNet, VGGNet, and ResNet) on multiple benchmark datasets covering stock index futures, digit recognition, and image classification. The pruned LeNet-5 achieves a 58.9% reduction in parameters with a 0.29% loss of total accuracy on the Shanghai and Shenzhen 300 stock index futures dataset. On CIFAR-10, the pruned VGG-19 reduces FLOPs by more than 50% while the decrease in accuracy is less than 0.5%. In addition, the pruned ResNet-164 tested on SVHN reduces FLOPs by more than 58% with a relative accuracy improvement of 0.11%.
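The abstract describes magnitude-based pruning of recurrent weights combined with regularization on scaling factors. The paper's code is not part of this record; the snippet below is only a minimal, illustrative PyTorch sketch of that general idea (an L1 penalty on BatchNorm scaling factors plus repeated zeroing of low-magnitude channels, repeated between fine-tuning steps). All names (`SmallConvNet`, `l1_on_scaling_factors`, `zero_low_sensitivity_channels`) and hyperparameters are hypothetical and not taken from the paper.

```python
# Illustrative sketch, NOT the authors' implementation: iterative magnitude-based
# channel pruning with L1 regularization on BatchNorm scaling factors.
import torch
import torch.nn as nn

class SmallConvNet(nn.Module):
    """Toy CNN standing in for the LeNet/VGG-style networks used in the paper."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

def l1_on_scaling_factors(model, strength=1e-4):
    """Sparsity-inducing penalty on the BatchNorm scale (gamma) parameters."""
    penalty = 0.0
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            penalty = penalty + m.weight.abs().sum()
    return strength * penalty

@torch.no_grad()
def zero_low_sensitivity_channels(model, prune_ratio=0.5):
    """One pruning pass: force the smallest-magnitude scaling factors to zero.

    Repeating this pass between fine-tuning epochs lets the surviving
    (high-sensitivity) channels keep updating while pruned ones are pushed
    back to zero, loosely mirroring the recurrent re-evaluation described
    in the abstract.
    """
    gammas = torch.cat([m.weight.abs().flatten()
                        for m in model.modules() if isinstance(m, nn.BatchNorm2d)])
    threshold = torch.quantile(gammas, prune_ratio)
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            mask = (m.weight.abs() > threshold).float()
            m.weight.mul_(mask)
            m.bias.mul_(mask)

# Minimal usage example on random data.
model = SmallConvNet()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))

loss = criterion(model(x), y) + l1_on_scaling_factors(model)
optimizer.zero_grad()
loss.backward()
optimizer.step()
zero_low_sensitivity_channels(model, prune_ratio=0.5)  # repeat each epoch
```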
Keywords | Channel pruning; neural networks; recurrent weights; finance market
DOI | 10.1145/3433547
Indexed By | SCI
Language | English
Funding Project(s) | National Natural Science Foundation of China[61975124] ; Shanghai Natural Science Foundation[20ZR1438500] ; Open Project Program of Shanghai Key Laboratory of Data Science[2020090600003] ; State Key Lab of Computer Architecture, ICT, CAS[CARCHA202111]
WOS Research Area | Computer Science
WOS Subject | Computer Science, Information Systems ; Computer Science, Software Engineering
WOS Accession Number | WOS:000844323400003
Publisher | ASSOC COMPUTING MACHINERY
Document Type | Journal article
Identifier | http://119.78.100.204/handle/2XEOYT63/19439
Collection | Institute of Computing Technology, CAS, Journal Articles (English)
Corresponding Author | Pei, Songwen
Affiliation | 1. Univ Shanghai Sci & Technol, Shanghai 200093, Peoples R China
2. Chinese Acad Sci, ICT, Beijing 100190, Peoples R China
3. Univ Shanghai Sci & Technol, Dept Comp Sci & Engn, Shanghai 200093, Peoples R China
4. Peking Univ, Sch Econ, Beijing 100871, Peoples R China
5. Texas A&M Univ, Dept Comp Sci, Commerce, TX 75428 USA
Recommended Citation (GB/T 7714) | Pei, Songwen, Wu, Yusheng, Guo, Jin, et al. Neural Network Pruning by Recurrent Weights for Finance Market[J]. ACM TRANSACTIONS ON INTERNET TECHNOLOGY, 2022, 22(3): 23.
APA | Pei, Songwen, Wu, Yusheng, Guo, Jin, & Qiu, Meikang. (2022). Neural Network Pruning by Recurrent Weights for Finance Market. ACM TRANSACTIONS ON INTERNET TECHNOLOGY, 22(3), 23.
MLA | Pei, Songwen, et al. "Neural Network Pruning by Recurrent Weights for Finance Market". ACM TRANSACTIONS ON INTERNET TECHNOLOGY 22.3 (2022): 23.
Files in This Item | There are no files associated with this item.