Learning Critically: Selective Self-Distillation in Federated Learning on Non-IID Data
He, Yuting1; Chen, Yiqiang2,3; Yang, XiaoDong2,3; Yu, Hanchao4; Huang, Yi-Hua1; Gu, Yang2
Date Issued: 2024-12-01
Journal: IEEE TRANSACTIONS ON BIG DATA
ISSN: 2332-7790
Volume: 10, Issue: 6, Pages: 789-800
Abstract: Federated learning (FL) enables multiple clients to collaboratively train a global model while keeping local data decentralized. Data heterogeneity (non-IID data) across clients poses significant challenges for FL: local models re-optimize towards their own local optima and forget the global knowledge, causing performance degradation and slow convergence. Many existing works attempt to address the non-IID issue by adding an extra global-model-based regularization term to local training, but without an adaptation scheme, which is not effective enough to achieve high performance with deep learning models. In this paper, we propose a Selective Self-Distillation method for Federated learning (FedSSD), which imposes adaptive constraints on local updates by self-distilling the global model's knowledge and selectively weighting it according to its credibility at both the class and sample level. We theoretically analyze the convergence guarantee of FedSSD and conduct extensive experiments on three public benchmark datasets, demonstrating that FedSSD achieves better generalization and robustness in fewer communication rounds than other state-of-the-art FL methods.
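For illustration, the core idea in the abstract, constraining local updates by distilling the global model only where it is credible, can be sketched as a local training loss in PyTorch. This is a minimal, hypothetical sketch, not the authors' implementation: the function name fedssd_local_loss and the use of the global model's softmax confidence on the true label as a per-sample credibility proxy are assumptions made here; the paper's actual weighting scheme operates at both the class and sample level.

import torch.nn.functional as F

def fedssd_local_loss(local_logits, global_logits, targets, temperature=2.0):
    # Standard cross-entropy on the local task, kept per-sample.
    ce = F.cross_entropy(local_logits, targets, reduction="none")

    # Per-sample credibility proxy: the frozen global model's softmax
    # probability on the ground-truth class (an assumption for this sketch;
    # the paper estimates credibility at both class and sample level).
    credibility = F.softmax(global_logits, dim=1).gather(
        1, targets.unsqueeze(1)).squeeze(1)

    # Temperature-softened KL divergence to the global model's predictions,
    # i.e., self-distillation from the global model (standard KD form).
    kd = F.kl_div(
        F.log_softmax(local_logits / temperature, dim=1),
        F.softmax(global_logits / temperature, dim=1),
        reduction="none",
    ).sum(dim=1) * temperature ** 2

    # Distill selectively: constrain local updates only where the global
    # model is credible, and learn freely from local data elsewhere.
    return (ce + credibility.detach() * kd).mean()

In a federated round, each client would minimize a loss of this form during local training against a frozen copy of the global model, and the server would then aggregate the updated local models as in FedAvg.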
Keywords: Data models; Training; Servers; Collaborative work; Adaptation models; Convergence; Feature extraction; Federated learning; knowledge distillation; non-identically distributed; deep learning; catastrophic forgetting
DOI: 10.1109/TBDATA.2022.3189703
Indexed By: SCI
Language: English
Funding Project: National Key Research and Development Plan of China [2021YFC2501202]; Natural Science Foundation of China [61972383]; Natural Science Foundation of China [61902377]; Beijing Municipal Science and Technology Commission [Z211100002121171]; Jinan ST Bureau [2020GXRC030]; Youth Innovation Promotion Association CAS; Science and Technology Service Network Initiative, Chinese Academy of Sciences [KFJ-STS-QYZD-2021-11-001]
WOS Research Area: Computer Science
WOS Subject: Computer Science, Information Systems; Computer Science, Theory & Methods
WOS ID: WOS:001354646300019
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/39488
Collection: Journal Papers of the Institute of Computing Technology, Chinese Academy of Sciences (English)
Corresponding Author: Chen, Yiqiang
Affiliations:
1. Univ Chinese Acad Sci, Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China
2. Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China
3. Shandong Acad Intelligent Comp Technol, Jinan 250101, Peoples R China
4. Chinese Acad Sci, Frontier Sci & Educ, Beijing 100864, Peoples R China
Recommended Citation:
GB/T 7714: He, Yuting, Chen, Yiqiang, Yang, XiaoDong, et al. Learning Critically: Selective Self-Distillation in Federated Learning on Non-IID Data[J]. IEEE TRANSACTIONS ON BIG DATA, 2024, 10(6): 789-800.
APA: He, Yuting, Chen, Yiqiang, Yang, XiaoDong, Yu, Hanchao, Huang, Yi-Hua, & Gu, Yang. (2024). Learning Critically: Selective Self-Distillation in Federated Learning on Non-IID Data. IEEE TRANSACTIONS ON BIG DATA, 10(6), 789-800.
MLA: He, Yuting, et al. "Learning Critically: Selective Self-Distillation in Federated Learning on Non-IID Data". IEEE TRANSACTIONS ON BIG DATA 10.6 (2024): 789-800.
Files in This Item: No files associated with this item.