FedICT: Federated Multi-Task Distillation for Multi-Access Edge Computing
Wu, Zhiyuan [1,2]; Sun, Sheng [1]; Wang, Yuwei [1]; Liu, Min [1,3]; Pan, Quyang [1]; Jiang, Xuefeng [1,2]; Gao, Bo [4]
2024-06-01
Journal: IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS
ISSN: 1045-9219
Volume: 35, Issue: 6, Pages: 952-966
Abstract: The growing interest in intelligent services and privacy protection for mobile devices has given rise to the widespread application of federated learning in Multi-access Edge Computing (MEC). Diverse user behaviors call for personalized services with heterogeneous Machine Learning (ML) models on different devices. Federated Multi-task Learning (FMTL) has been proposed to train related but personalized ML models for different devices, but previous works suffer from excessive communication overhead during training and neglect the model heterogeneity among devices in MEC. Introducing knowledge distillation into FMTL can simultaneously enable efficient communication and model heterogeneity among clients, yet existing methods rely on a public dataset, which is impractical in reality. To tackle this dilemma, Federated MultI-task Distillation for Multi-access Edge CompuTing (FedICT) is proposed. FedICT keeps local and global knowledge apart during the bi-directional distillation processes between clients and the server, aiming to enable multi-task clients while alleviating the client drift that arises from divergent optimization directions of client-side local models. Specifically, FedICT consists of Federated Prior Knowledge Distillation (FPKD) and Local Knowledge Adjustment (LKA). FPKD reinforces the clients' fitting of local data by introducing prior knowledge of the local data distributions, while LKA corrects the distillation loss of the server, making the transferred local knowledge better match the generalized representation. Extensive experiments on three datasets demonstrate that FedICT significantly outperforms all compared benchmarks under various data heterogeneity and model architecture settings, achieving improved accuracy with less than 1.2% of the training communication overhead of FedAvg and no more than 75% of the training communication rounds of FedGKT in all considered scenarios.
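The abstract describes FPKD and LKA only at a high level. The following is a minimal numpy sketch of the two ideas as the abstract states them, assuming (hypothetically) that FPKD injects the client's local label-distribution prior into the server's soft labels, and that LKA adjusts the server-side distillation loss by masking out client logits that contradict the ground truth. Function names, the temperature T, and the masking rule are illustrative assumptions, not the authors' implementation.

```python
# Sketch of FedICT-style bi-directional distillation (illustrative only).
# Clients and the server exchange soft logits rather than model weights,
# consistent with the abstract's claim of far lower communication than FedAvg.
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def fpkd_targets(global_logits, label_prior, T=3.0):
    """FPKD (sketch): re-weight the server's soft labels by the client's
    local label distribution, reinforcing the fit to local (non-IID) data."""
    soft = softmax(global_logits, T)
    weighted = soft * label_prior  # inject the local prior (assumed form)
    return weighted / weighted.sum(axis=-1, keepdims=True)

def lka_loss(server_logits, client_logits, labels, T=3.0):
    """LKA (sketch): correct the server's distillation loss by keeping only
    client logits that agree with the ground truth, so the transferred local
    knowledge better matches the generalized representation."""
    client_soft = softmax(client_logits, T)
    server_soft = softmax(server_logits, T)
    keep = (client_logits.argmax(axis=-1) == labels).astype(float)  # mask
    kl = (client_soft * (np.log(client_soft + 1e-12)
                         - np.log(server_soft + 1e-12))).sum(axis=-1)
    return float((keep * kl).mean())

# Toy usage: 4 samples, 3 classes, a skewed local label prior.
rng = np.random.default_rng(0)
g = rng.normal(size=(4, 3))          # server (global) logits
c = rng.normal(size=(4, 3))          # client (local) logits
y = np.array([0, 1, 2, 0])           # ground-truth labels
prior = np.array([0.6, 0.3, 0.1])    # client's local label distribution
print(fpkd_targets(g, prior))
print(lka_loss(g, c, y))
```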
Keywords: Computational modeling; Data models; Training; Servers; Multitasking; Adaptation models; Optimization; Distributed optimization; federated learning; knowledge distillation; multi-access edge computing; multi-task learning
DOI: 10.1109/TPDS.2023.3289444
Indexed by: SCI
Language: English
Funding project: National Key Research and Development Program of China
WOS research areas: Computer Science; Engineering
WOS categories: Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS record number: WOS:001216308700001
Publisher: IEEE COMPUTER SOC
Citation statistics: Times cited (WOS): 12
Document type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/38997
Collection: Institute of Computing Technology, Chinese Academy of Sciences - Journal Articles (English)
Corresponding author: Wang, Yuwei
Author affiliations:
1.Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China
2.Univ Chinese Acad Sci, Beijing 101408, Peoples R China
3.Zhongguancun Lab, Beijing 100190, Peoples R China
4.Beijing Jiaotong Univ, Engn Res Ctr Network Management Technol High Speed, Sch Comp & Informat Technol, Minist Educ, Beijing 100044, Peoples R China
Recommended citation formats:
GB/T 7714
Wu, Zhiyuan, Sun, Sheng, Wang, Yuwei, et al. FedICT: Federated Multi-Task Distillation for Multi-Access Edge Computing[J]. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2024, 35(6): 952-966.
APA Wu, Zhiyuan, Sun, Sheng, Wang, Yuwei, Liu, Min, Pan, Quyang, ... & Gao, Bo. (2024). FedICT: Federated Multi-Task Distillation for Multi-Access Edge Computing. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 35(6), 952-966.
MLA Wu, Zhiyuan, et al. "FedICT: Federated Multi-Task Distillation for Multi-Access Edge Computing." IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS 35.6 (2024): 952-966.
Files in this item: No files are associated with this item.