Federated Data Quality Assessment Approach: Robust Learning With Mixed Label Noise
Zeng, Bixiao1; Yang, Xiaodong1,2; Chen, Yiqiang1,3,4; Yu, Hanchao5; Hu, Chunyu6; Zhang, Yingwei1
2023-08-31
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Pages: 15
Abstract: Federated learning (FL) is an effective way to train a machine learning model in a distributed manner, keeping local data on each client without exchanging it. However, because local data are inaccessible, FL with label noise is more challenging. Most existing methods assume only open-set or closed-set noise and correspondingly propose filtering or correction solutions, ignoring that label noise can be mixed in real-world scenarios. In this article, we propose a novel FL method, named FedMIN, that discriminates the type of noise and makes FL robust to mixed noise. FedMIN employs a composite framework that captures local-global differences in multiparticipant distributions to model generalized noise patterns. By determining adaptive thresholds for identifying mixed label noise in each client and assigning appropriate weights during model aggregation, FedMIN enhances the performance of the global model. Furthermore, FedMIN incorporates a loss alignment mechanism using local and global Gaussian mixture models (GMMs) to mitigate the risk of revealing samplewise loss. Extensive experiments are conducted on several public datasets, including simulated FL testbeds (CIFAR-10, CIFAR-100, and SVHN) and real-world ones (Camelyon17 and the multiorgan nuclei challenge, MoNuSAC). Compared to FL benchmarks, FedMIN improves model accuracy by up to 9.9% due to its superior noise estimation capabilities.
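The GMM-based loss modeling mentioned in the abstract can be illustrated with a small, generic sketch: fit a two-component Gaussian mixture to per-sample training losses and treat samples falling in the high-loss component as likely mislabeled. This is only a hedged illustration of the general technique, not the authors' FedMIN implementation; the function name `flag_noisy_samples` and the threshold choice are assumptions made for the example.

```python
# Illustrative sketch (not the FedMIN implementation): use a 2-component
# Gaussian mixture over per-sample losses to separate likely-clean from
# likely-noisy samples, a common building block in noise-robust training.
import numpy as np
from sklearn.mixture import GaussianMixture

def flag_noisy_samples(per_sample_losses, clean_prob_threshold=0.5):
    """Return a boolean mask marking samples whose loss looks 'noisy'.

    per_sample_losses: 1-D array of training losses, one entry per sample.
    clean_prob_threshold: samples whose posterior probability of belonging
        to the low-loss (clean) component falls below this value are flagged.
    """
    losses = np.asarray(per_sample_losses, dtype=np.float64).reshape(-1, 1)

    # Fit a two-component GMM: one mode for clean (low-loss) samples,
    # one for noisy (high-loss) samples.
    gmm = GaussianMixture(n_components=2, random_state=0).fit(losses)

    # The component with the smaller mean loss is taken as the "clean" mode.
    clean_component = int(np.argmin(gmm.means_.ravel()))

    # Posterior probability of each sample belonging to the clean component.
    clean_prob = gmm.predict_proba(losses)[:, clean_component]

    return clean_prob < clean_prob_threshold  # True = likely noisy label

# Example usage with synthetic losses: most samples low-loss, a few high-loss.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    losses = np.concatenate([rng.normal(0.2, 0.05, 900),   # clean-looking
                             rng.normal(2.0, 0.30, 100)])  # noisy-looking
    noisy_mask = flag_noisy_samples(losses)
    print(f"flagged {noisy_mask.sum()} of {losses.size} samples as noisy")
```

In FedMIN the analogous modeling is done with local and global GMMs and adaptive per-client thresholds; the sketch above only shows the single-site, fixed-threshold version of the idea.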
Keywords: Noise measurement; Servers; Task analysis; Adaptation models; Data models; Data integrity; Computers; Data quality assessment; federated learning (FL); noise-robust algorithm
DOI: 10.1109/TNNLS.2023.3306874
Indexed By: SCI
Language: English
Funding Project: National Key Research and Development Plan of China [2021YFC2501202] ; National Natural Science Foundation of China [62202455] ; National Natural Science Foundation of China [61972383] ; Beijing Municipal Science and Technology Commission [Z211100002121171] ; Beijing Municipal Science and Technology Commission [Z221100002722009] ; China Scholarship Council [202204910370]
WOS Research Area: Computer Science ; Engineering
WOS Subject: Computer Science, Artificial Intelligence ; Computer Science, Hardware & Architecture ; Computer Science, Theory & Methods ; Engineering, Electrical & Electronic
WOS Accession Number: WOS:001060588000001
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Citation Statistics
Times Cited: 1 [WOS]
Document Type: Journal article
Identifier: http://119.78.100.204/handle/2XEOYT63/21388
Collection: Institute of Computing Technology, Chinese Academy of Sciences - Journal Papers (English)
Corresponding Author: Chen, Yiqiang
Affiliations:
1. Chinese Acad Sci, Inst Comp Technol, Beijing 100045, Peoples R China
2. Shandong Acad Intelligent Comp Technol, Inst Comp Technol, Jinan, Peoples R China
3. Univ Chinese Acad Sci, Beijing 101408, Peoples R China
4. Beijing Key Lab Mobile Comp & Pervas Device, Beijing 100190, Peoples R China
5. Chinese Acad Sci, Bur Frontier Sci & Educ, Beijing 100045, Peoples R China
6. Qilu Univ Technol, Shandong Acad Sci, Jinan 250353, Peoples R China
Recommended Citation:
GB/T 7714
Zeng, Bixiao, Yang, Xiaodong, Chen, Yiqiang, et al. Federated Data Quality Assessment Approach: Robust Learning With Mixed Label Noise[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023: 15.
APA: Zeng, Bixiao, Yang, Xiaodong, Chen, Yiqiang, Yu, Hanchao, Hu, Chunyu, & Zhang, Yingwei. (2023). Federated Data Quality Assessment Approach: Robust Learning With Mixed Label Noise. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 15.
MLA: Zeng, Bixiao, et al. "Federated Data Quality Assessment Approach: Robust Learning With Mixed Label Noise". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2023): 15.
Files in This Item:
There are no files associated with this item.