WP-SGD: Weighted parallel SGD for distributed unbalanced-workload training system
Cheng Daning1,2; Li Shigang1,3; Zhang Yunquan1
2020-11-01
Journal: JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING
ISSN: 0743-7315
Volume: 145, Pages: 202-216
Abstract: Stochastic gradient descent (SGD) is a popular stochastic optimization method in machine learning. Traditional parallel SGD algorithms, e.g., SimuParallel SGD (Zinkevich, 2010), often require all nodes to have the same performance or to consume equal quantities of data. However, these requirements are difficult to satisfy when the parallel SGD algorithms run in a heterogeneous computing environment; low-performance nodes exert a negative influence on the final result. In this paper, we propose an algorithm called weighted parallel SGD (WP-SGD). WP-SGD combines weighted model parameters from different nodes in the system to produce the final output. WP-SGD exploits the reduction in standard deviation to compensate for the loss caused by inconsistent node performance in the cluster, which means that WP-SGD does not require all nodes to consume equal quantities of data. We also propose methods for running two other parallel SGD algorithms in combination with WP-SGD in a heterogeneous environment. The experimental results show that WP-SGD significantly outperforms traditional parallel SGD algorithms on distributed training systems with an unbalanced workload. (C) 2020 Elsevier Inc. All rights reserved.
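The core idea in the abstract — each node trains a local model on its own (possibly unequal) share of the data, and the final model is a weighted combination of the node models — can be illustrated with a minimal sketch. Note this is only an illustration of the weighted-averaging scheme: the weighting rule below (weight proportional to data consumed) is an assumption for demonstration, not the paper's actual weight derivation, and `local_sgd` is a hypothetical helper, not code from the paper.

```python
import numpy as np

def local_sgd(X, y, lr=0.01, epochs=5, seed=0):
    """Plain SGD for linear least squares on one node's data shard."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            grad = (X[i] @ w - y[i]) * X[i]  # per-sample squared-loss gradient
            w -= lr * grad
    return w

rng = np.random.default_rng(42)
true_w = np.array([2.0, -1.0])

# Unbalanced workload: each node consumes a different quantity of data.
sizes = [1000, 400, 100]
models, weights = [], []
for k, n in enumerate(sizes):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.01 * rng.normal(size=n)
    models.append(local_sgd(X, y, seed=k))
    weights.append(n)  # illustrative choice: weight by samples consumed

# Weighted combination of the node models (the WP-SGD idea).
weights = np.array(weights, dtype=float) / sum(weights)
w_final = sum(wk * mk for wk, mk in zip(weights, models))
print(w_final)
```

Under this setup the combined model recovers the true parameters closely even though the nodes processed very different amounts of data, whereas a plain unweighted average would give the under-trained small-shard model the same influence as the large-shard ones.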
Keywords: SGD; Unbalanced workload; SimuParallel SGD; Distributed system
DOI: 10.1016/j.jpdc.2020.06.011
Indexed in: SCI
Language: English
Funding: National Key R&D Program of China [2016YFB0200803]; National Key R&D Program of China [2017YFB0202302]; National Key R&D Program of China [2017YFB0202001]; National Key R&D Program of China [2017YFB0202502]; National Key R&D Program of China [2017YFB0202105]; National Key R&D Program of China [2018YFB0704002]; National Key R&D Program of China [2018YFC0809306]; Strategic Priority Research Program of Chinese Academy of Sciences [XDC01000000]; National Natural Science Foundation of China [61972376]; National Natural Science Foundation of China [61502450]; National Natural Science Foundation of China [61432018]; National Natural Science Foundation of China [61521092]; Science Foundation of Beijing [L182053]; SKL of Computer Architecture Foundation [CARCH3504]
WOS Research Area: Computer Science
WOS Category: Computer Science, Theory & Methods
WOS Accession Number: WOS:000568803300015
Publisher: ACADEMIC PRESS INC ELSEVIER SCIENCE
Citations: 3 (WOS)
Document type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/15557
Collection: Journal Papers of the Institute of Computing Technology, Chinese Academy of Sciences (English)
Corresponding author: Li Shigang
Affiliations:
1. Chinese Acad Sci, Inst Comp Technol, SKL Comp Architecture, Beijing, Peoples R China
2.Univ Chinese Acad Sci, Beijing, Peoples R China
3.Swiss Fed Inst Technol, Dept Comp Sci, Zurich, Switzerland
Recommended citation formats:
GB/T 7714
Cheng Daning, Li Shigang, Zhang Yunquan. WP-SGD: Weighted parallel SGD for distributed unbalanced-workload training system[J]. JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 2020, 145: 202-216.
APA: Cheng Daning, Li Shigang, & Zhang Yunquan. (2020). WP-SGD: Weighted parallel SGD for distributed unbalanced-workload training system. JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 145, 202-216.
MLA: Cheng Daning, et al. "WP-SGD: Weighted parallel SGD for distributed unbalanced-workload training system". JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING 145 (2020): 202-216.