Institute of Computing Technology, Chinese Academy of Sciences IR
Fast and accurate variable batch size convolution neural network training on large scale distributed systems
Hu, Zhongzhe (1,2); Xiao, Junmin (1); Sun, Ninghui (1); Tan, Guangming (1)
2022-06-06
Journal | CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE
ISSN | 1532-0626
Pages | 26
Abstract | Large-scale distributed convolutional neural network (CNN) training poses two performance challenges: model performance and system performance. A large batch size usually causes a loss in model test accuracy, which counteracts the benefits of parallel SGD, and existing solutions require massive hand-tuning of hyperparameters. To overcome this difficulty, we analyze the training process and find that earlier training stages are more sensitive to batch size. Accordingly, we assert that different stages should use different batch sizes, and we propose a variable batch size strategy. To maintain high test accuracy at larger batch sizes, we design an auto-tuning engine that automatically tunes the parameters of the proposed variable batch size strategy. Furthermore, we develop a dataflow implementation approach to achieve high-throughput CNN training on a supercomputer system. Our approach achieves high generalization performance on state-of-the-art CNN networks. For ShuffleNet, ResNet-50, and ResNet-101 training on the ImageNet-1K dataset, we scale the batch size to 120 K without accuracy loss and to 128 K with only a slight loss. The dataflow implementation approach achieves 93.5% scaling efficiency on 1024 GPUs compared with the state-of-the-art. (A minimal sketch of the stage-wise batch-size idea appears after this record.)
Keywords | deep learning; distributed computing; ImageNet-1K; large-batch training; synchronous SGD
DOI | 10.1002/cpe.7119
Indexed By | SCI
Language | English
WOS Research Area | Computer Science
WOS Categories | Computer Science, Software Engineering; Computer Science, Theory & Methods
WOS Accession Number | WOS:000806476800001
Publisher | WILEY
Citation Statistics |
Document Type | Journal article
Identifier | http://119.78.100.204/handle/2XEOYT63/19601
Collection | Institute of Computing Technology, CAS: Journal Articles (English)
Corresponding Author | Hu, Zhongzhe
Affiliation | 1. Chinese Acad Sci, Inst Comp Technol, 6 Kexueyuan South Rd, Beijing, Peoples R China; 2. Univ Chinese Acad Sci, Beijing, Peoples R China
Recommended Citation (GB/T 7714) | Hu, Zhongzhe, Xiao, Junmin, Sun, Ninghui, et al. Fast and accurate variable batch size convolution neural network training on large scale distributed systems[J]. CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2022: 26.
APA | Hu, Zhongzhe, Xiao, Junmin, Sun, Ninghui, & Tan, Guangming. (2022). Fast and accurate variable batch size convolution neural network training on large scale distributed systems. CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 26.
MLA | Hu, Zhongzhe, et al. "Fast and accurate variable batch size convolution neural network training on large scale distributed systems". CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE (2022): 26.
Files in This Item | No files are associated with this item.
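The abstract's core recipe, keeping the batch small during the early, batch-size-sensitive epochs and enlarging it in later stages while scaling the learning rate, can be illustrated with a minimal sketch. The stage boundaries, batch sizes, linear learning-rate scaling rule, and function names below are assumptions for illustration only; they are not the paper's auto-tuning engine or dataflow implementation.

```python
# Illustrative sketch of a stage-wise (variable) batch-size schedule for
# synchronous SGD. All stage boundaries and sizes are hypothetical
# placeholders, not the auto-tuned values from the paper.

def batch_size_for_epoch(epoch,
                         stages=((0, 1024), (30, 8192), (60, 65536), (80, 131072))):
    """Return the global batch size for a given epoch.

    `stages` is a sequence of (start_epoch, batch_size) pairs: early epochs,
    which are more sensitive to batch size, use small batches; later epochs
    use progressively larger ones.
    """
    size = stages[0][1]
    for start, bs in stages:
        if epoch >= start:
            size = bs
    return size


def lr_for_batch_size(batch_size, base_lr=0.1, base_batch=256):
    """Common linear learning-rate scaling rule: lr grows with the batch size."""
    return base_lr * batch_size / base_batch


if __name__ == "__main__":
    # Print the schedule at a few representative epochs.
    for epoch in (0, 29, 30, 59, 60, 79, 80, 89):
        bs = batch_size_for_epoch(epoch)
        print(f"epoch {epoch:2d}: batch size {bs:>6d}, lr {lr_for_batch_size(bs):.3f}")
```

In a real distributed run, each change of the global batch size would also require rebuilding the per-worker data loaders and re-synchronizing optimizer state across workers; that machinery is omitted here.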