FPAR: filter pruning via attention and rank enhancement for deep convolutional neural networks acceleration
Chen, Yanming1; Wu, Gang1; Shuai, Mingrui1; Lou, Shubin1; Zhang, Yiwen1; An, Zhulin2
2024-01-29
Journal: INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS
ISSN: 1868-8071
Pages: 13
Abstract: Pruning deep neural networks is crucial for enabling their deployment on resource-constrained edge devices, where the vast number of parameters and the computational requirements pose significant challenges. However, many existing pruning methods consider only the importance of a single filter to the network and neglect the correlation between filters. To address this problem, we propose a novel filter pruning method, called Filter Pruning via Attention and Rank Enhancement for Deep Convolutional Neural Networks Acceleration (FPAR), based on the attention mechanism and the rank of feature maps. The inspiration for it comes from a discovery: for a network with attention modules, the mean of the channel-wise weights of the attention module is almost constant regardless of the batch of input images. Thus, we can use a few batches of input data to obtain this indicator to guide pruning. Extensive experiments show that our method outperforms state-of-the-art methods at comparable accuracy. For example, with VGG-16, our method removes 62.8% of floating-point operations (FLOPs) while even increasing accuracy by 0.24% on CIFAR-10. With ResNet-110, FPAR reduces FLOPs by 61.7% and removes 62.7% of the parameters, with a slight top-1 accuracy improvement of 0.05% on CIFAR-10.
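The abstract above describes the core signal FPAR uses to score filters: the mean channel-wise weight of an attention module, which stays nearly constant across input batches, combined with the rank of the corresponding feature maps. Below is a minimal PyTorch sketch of that idea, not the authors' implementation; the SEAttention module, the fpar_importance helper, and the simple product used to combine the two signals are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of an FPAR-style importance signal:
# average the channel-wise attention weights of an SE-style module over a few
# input batches, combine with the average rank of each channel's feature map,
# and keep the highest-scoring filters. Names and the combination rule are
# assumptions for illustration only.
import torch
import torch.nn as nn


class SEAttention(nn.Module):
    """Squeeze-and-excitation style channel attention."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor):
        n, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(n, c))        # (N, C) channel weights
        return x * w.view(n, c, 1, 1), w


@torch.no_grad()
def fpar_importance(conv: nn.Conv2d, attn: SEAttention, batches) -> torch.Tensor:
    """Per-filter importance = mean attention weight * mean feature-map rank,
    estimated from only a few input batches (hypothetical combination rule)."""
    attn_sum, rank_sum, n_batches = 0.0, 0.0, 0
    for x in batches:
        fmap = conv(x)                              # (N, C_out, H, W)
        _, w = attn(fmap)                           # (N, C_out) attention weights
        ranks = torch.linalg.matrix_rank(fmap)      # (N, C_out) rank per channel
        attn_sum = attn_sum + w.mean(dim=0)
        rank_sum = rank_sum + ranks.float().mean(dim=0)
        n_batches += 1
    return (attn_sum / n_batches) * (rank_sum / n_batches)


if __name__ == "__main__":
    conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
    attn = SEAttention(64)
    batches = [torch.randn(8, 3, 32, 32) for _ in range(3)]  # "a few batches"
    score = fpar_importance(conv, attn, batches)
    n_keep = int(64 * 0.5)                          # e.g. prune half the filters
    keep_idx = torch.topk(score, n_keep).indices    # filters to keep
    print("keep", keep_idx.tolist())
```

In practice the attention statistics would be collected from modules already embedded in the trained network, and the lowest-scoring filters of each layer would be removed before fine-tuning.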
Keywords: Neural network; Model compression; Filter pruning; Attention; Rank enhancement; CNNs
DOI: 10.1007/s13042-023-02076-1
Indexed by: SCI
Language: English
Funding: National Science Foundation of China (NSFC) [62262067]; Key Natural Science Foundation of Education Department of Anhui [KJ2021A0046]
WOS Research Area: Computer Science
WOS Category: Computer Science, Artificial Intelligence
WOS Accession Number: WOS:001150668700003
Publisher: SPRINGER HEIDELBERG
Citation Statistics
Times Cited (WOS): 2
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/38376
Collection: Journal Papers of the Institute of Computing Technology, Chinese Academy of Sciences (English)
Corresponding Author: An, Zhulin
Affiliations: 1. Anhui Univ, Sch Compute Sci & Technol, Hefei 230000, Anhui, Peoples R China
2. Chinese Acad Sci, Inst Comp Technol, Beijing 100000, Peoples R China
Recommended Citation:
GB/T 7714
Chen, Yanming, Wu, Gang, Shuai, Mingrui, et al. FPAR: filter pruning via attention and rank enhancement for deep convolutional neural networks acceleration[J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024: 13.
APA: Chen, Yanming, Wu, Gang, Shuai, Mingrui, Lou, Shubin, Zhang, Yiwen, & An, Zhulin. (2024). FPAR: filter pruning via attention and rank enhancement for deep convolutional neural networks acceleration. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 13.
MLA: Chen, Yanming, et al. "FPAR: filter pruning via attention and rank enhancement for deep convolutional neural networks acceleration". INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS (2024): 13.
Files in This Item:
No files associated with this item.