Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms
Cheng, Daning1; Li, Shigang2; Zhang, Hanping3; Xia, Fen3; Zhang, Yunquan4
2021-07-01
Journal: IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS
ISSN: 1045-9219
Volume: 32, Issue: 7, Pages: 1702-1712
Abstract: As the training dataset size and the model size of machine learning increase rapidly, more computing resources are consumed to speed up the training process. However, the scalability and performance reproducibility of parallel machine learning training, which mainly uses stochastic optimization algorithms, are limited. In this paper, we demonstrate that the sample difference in the dataset plays a prominent role in the scalability of parallel machine learning algorithms. We propose to use statistical properties of the dataset to measure sample differences. These properties include the variance of sample features, sample sparsity, sample diversity, and similarity in sampling sequences. We choose four types of parallel training algorithms as our research objects: (1) the asynchronous parallel SGD algorithm (Hogwild! algorithm), (2) the parallel model average SGD algorithm (minibatch SGD algorithm), (3) the decentralization optimization algorithm, and (4) the dual coordinate optimization algorithm (DADM). Our results show that the statistical properties of training datasets determine the scalability upper bound of these parallel training algorithms.
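The first algorithm family in the abstract, asynchronous parallel SGD (Hogwild!), lets workers update a shared parameter vector without locks. A minimal, illustrative sketch of that idea is below; it is not the paper's implementation, and all names (`worker`, the learning rate `lr`, the synthetic regression data) are assumptions for illustration. Python threads stand in for parallel workers on a shared NumPy weight vector.

```python
# Illustrative Hogwild!-style asynchronous SGD on a toy least-squares problem:
# several threads sample training examples independently and apply gradient
# updates to the shared weight vector w without any synchronization.
import threading

import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features = 200, 5
X = rng.normal(size=(n_samples, n_features))
true_w = rng.normal(size=n_features)
y = X @ true_w  # noise-free linear targets

w = np.zeros(n_features)  # shared parameters, updated lock-free


def worker(steps, lr=0.01):
    local_rng = np.random.default_rng()
    for _ in range(steps):
        i = local_rng.integers(n_samples)
        grad = (X[i] @ w - y[i]) * X[i]  # squared-loss gradient for one sample
        w[:] -= lr * grad  # in-place, unsynchronized update (the Hogwild! idea)


threads = [threading.Thread(target=worker, args=(2000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(float(np.linalg.norm(w - true_w)))  # small residual error after training
```

Races between readers and writers can lose individual updates, but, as the Hogwild! line of work argues, the iterates still converge when updates are sparse or well-conditioned; the paper's thesis is that such dataset properties bound how far this parallelism scales.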
Keywords: Training; Scalability; Machine learning; Machine learning algorithms; Stochastic processes; Task analysis; Upper bound; Parallel training algorithms; training dataset; scalability; stochastic optimization methods
DOI: 10.1109/TPDS.2020.3048836
Indexed by: SCI
Language: English
Funding: National Natural Science Foundation of China [61972376]; National Natural Science Foundation of China [61502450]; National Natural Science Foundation of China [61432018]; National Natural Science Foundation of China [61521092]; National Key Research and Development Program of China [2016YFB0200800]; National Key Research and Development Program of China [2016YFB0200803]; National Key Research and Development Program of China [2017YFB0202302]; National Key Research and Development Program of China [2017YFB0202105]; State Key Laboratory of Computer Architecture Foundation [CARCH3504]; Natural Science Foundation of Beijing [L182053]
WOS Research Area: Computer Science; Engineering
WOS Category: Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS ID: WOS:000621405200017
Publisher: IEEE COMPUTER SOC
Citation Statistics: Times cited (WOS): 12
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/16911
Collection: Journal Papers (English), Institute of Computing Technology, Chinese Academy of Sciences
Corresponding Author: Zhang, Yunquan
Author Affiliations:
1. Chinese Acad Sci, Inst Comp Technol, SKL, Beijing, Peoples R China
2. Swiss Fed Inst Technol, Dept Comp Sci, Zh, Switzerland
3. Beijing Wisdom Uranium Technol Co Ltd, Algorithm Dept, Beijing, Peoples R China
4. Chinese Acad Sci, Inst Comp Technol, SKL Comp Architecture, Beijing, Peoples R China
Recommended Citation:
GB/T 7714: Cheng, Daning, Li, Shigang, Zhang, Hanping, et al. Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms[J]. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2021, 32(7): 1702-1712.
APA: Cheng, Daning, Li, Shigang, Zhang, Hanping, Xia, Fen, & Zhang, Yunquan. (2021). Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 32(7), 1702-1712.
MLA: Cheng, Daning, et al. "Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms". IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS 32.7 (2021): 1702-1712.
Files in This Item: No files associated with this item.