Institute of Computing Technology, Chinese Academy of Sciences Institutional Repository (IR)
Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms
Cheng, Daning1; Li, Shigang2; Zhang, Hanping3; Xia, Fen3; Zhang, Yunquan4
2021-07-01
Published In | IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS |
ISSN | 1045-9219 |
Volume | 32 |
Issue | 7 |
Pages | 1702-1712 |
Abstract | As the training dataset size and the model size of machine learning increase rapidly, more computing resources are consumed to speed up the training process. However, the scalability and performance reproducibility of parallel machine learning training, which mainly uses stochastic optimization algorithms, are limited. In this paper, we demonstrate that the sample difference in the dataset plays a prominent role in the scalability of parallel machine learning algorithms. We propose to use the statistical properties of the dataset to measure sample differences. These properties include the variance of sample features, sample sparsity, sample diversity, and similarity in sampling sequences. We choose four types of parallel training algorithms as our research objects: (1) the asynchronous parallel SGD algorithm (Hogwild! algorithm), (2) the parallel model average SGD algorithm (minibatch SGD algorithm), (3) the decentralized optimization algorithm, and (4) the dual coordinate optimization (DADM algorithm). Our results show that the statistical properties of training datasets determine the scalability upper bound of these parallel training algorithms. |
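To make the four dataset statistics named in the abstract concrete, below is a minimal NumPy sketch of how such quantities could be estimated on a sample matrix. The function names and formulas (mean per-feature variance, fraction of zero entries, mean pairwise cosine distance, and average cosine similarity between the means of independently drawn minibatches) are illustrative assumptions, not the authors' definitions from the paper.

```python
# Illustrative sketch only; NOT the paper's definitions of these measures.
import numpy as np

def feature_variance(X):
    # Mean of the per-feature variances across all samples.
    return float(np.var(X, axis=0).mean())

def sample_sparsity(X):
    # Fraction of zero entries in the sample matrix.
    return float(np.mean(X == 0))

def sample_diversity(X):
    # Mean pairwise cosine distance (1 - cosine similarity) between samples.
    unit = X / np.clip(np.linalg.norm(X, axis=1, keepdims=True), 1e-12, None)
    sim = unit @ unit.T
    off_diag = sim[~np.eye(len(X), dtype=bool)]
    return float(1.0 - off_diag.mean())

def sampling_sequence_similarity(X, rng, batch=32, rounds=200):
    # Average cosine similarity between the means of two independently
    # drawn minibatches: a rough proxy for similarity in sampling sequences.
    sims = []
    for _ in range(rounds):
        a = X[rng.integers(0, len(X), size=batch)].mean(axis=0)
        b = X[rng.integers(0, len(X), size=batch)].mean(axis=0)
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        if denom > 0:
            sims.append(float(a @ b) / denom)
    return float(np.mean(sims))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 50))
    X[rng.random(X.shape) < 0.7] = 0.0  # toy data: roughly 70% zeros
    print("feature variance:        ", feature_variance(X))
    print("sample sparsity:         ", sample_sparsity(X))
    print("sample diversity:        ", sample_diversity(X))
    print("sampling-seq. similarity:", sampling_sequence_similarity(X, rng))
```

The intuition, per the abstract, is that datasets scoring low on such difference measures (e.g., highly similar minibatch means) leave less room for parallel workers to contribute independent progress, capping the scalability of algorithms like Hogwild! or minibatch SGD.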
Keywords | Training; Scalability; Machine learning; Machine learning algorithms; Stochastic processes; Task analysis; Upper bound; Parallel training algorithms; training dataset; scalability; stochastic optimization methods |
DOI | 10.1109/TPDS.2020.3048836 |
Indexed By | SCI |
Language | English |
Funding Project | National Natural Science Foundation of China[61972376] ; National Natural Science Foundation of China[61502450] ; National Natural Science Foundation of China[61432018] ; National Natural Science Foundation of China[61521092] ; National Key Research and Development Program of China[2016YFB0200800] ; National Key Research and Development Program of China[2016YFB0200803] ; National Key Research and Development Program of China[2017YFB0202302] ; National Key Research and Development Program of China[2017YFB0202105] ; State Key Laboratory of Computer Architecture Foundation[CARCH3504] ; Natural Science Foundation of Beijing[L182053] |
WOS Research Area | Computer Science ; Engineering |
WOS Subject | Computer Science, Theory & Methods ; Engineering, Electrical & Electronic |
WOS ID | WOS:000621405200017 |
Publisher | IEEE COMPUTER SOC |
Document Type | Journal article |
Identifier | http://119.78.100.204/handle/2XEOYT63/16911 |
Collection | Journal Papers in English, Institute of Computing Technology, Chinese Academy of Sciences |
Corresponding Author | Zhang, Yunquan |
Affiliation | 1. Chinese Acad Sci, Inst Comp Technol, SKL, Beijing, Peoples R China; 2. Swiss Fed Inst Technol, Dept Comp Sci, Zh, Switzerland; 3. Beijing Wisdom Uranium Technol Co Ltd, Algorithm Dept, Beijing, Peoples R China; 4. Chinese Acad Sci, Inst Comp Technol, SKL Comp Architecture, Beijing, Peoples R China |
Recommended Citation (GB/T 7714) | Cheng, Daning, Li, Shigang, Zhang, Hanping, et al. Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms[J]. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2021, 32(7): 1702-1712. |
APA | Cheng, Daning, Li, Shigang, Zhang, Hanping, Xia, Fen, & Zhang, Yunquan. (2021). Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 32(7), 1702-1712. |
MLA | Cheng, Daning, et al. "Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms". IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS 32.7 (2021): 1702-1712. |
Files in This Item | There are no files associated with this item. |
Unless otherwise stated, all content in this repository is protected by copyright, with all rights reserved.