Overhead-free Noise-tolerant Federated Learning: A New Baseline
Shiyi Lin1; Deming Zhai1; Feilong Zhang1; Junjun Jiang1; Xianming Liu1; Xiangyang Ji2
Source Publication: Machine Intelligence Research
ISSN: 2731-538X
Year: 2024
Volume: 21, Issue: 3, Pages: 526-537
Abstract: Federated learning (FL) is a promising decentralized machine learning approach that enables multiple distributed clients to train a model jointly while keeping their data private. However, in real-world scenarios, the supervised training data stored in local clients inevitably suffer from imperfect annotations, resulting in subjective, inconsistent and biased labels. These noisy labels can harm the collaborative aggregation process of FL by inducing inconsistent decision boundaries. Unfortunately, few attempts have been made towards noise-tolerant federated learning, and most of them rely on transmitting overhead messages to assist noisy-label detection and correction, which increases the communication burden as well as privacy risks. In this paper, we propose a simple yet effective method for noise-tolerant FL based on the well-established co-training framework. Our method leverages the inherent discrepancy in the learning ability of the local and global models in FL, which can be regarded as two complementary views. By iteratively exchanging samples with their highly confident predictions, the two models "teach each other" to suppress the influence of noisy labels. The proposed scheme incurs no communication overhead and can serve as a robust and efficient baseline for noise-tolerant federated learning. Experimental results demonstrate that our method outperforms existing approaches, highlighting its superiority.
Keywords: Federated learning, noisy-label learning, privacy-preserving machine learning, edge intelligence, distributed machine learning
DOI: 10.1007/s11633-023-1449-1
Document Type: Journal Article
Identifier: http://ir.ia.ac.cn/handle/173211/56480
Collection: Academic Journals_Machine Intelligence Research
Affiliations:
1. Department of Computer Science and Technology, Harbin Institute of Technology, Harbin 150000, China
2. Department of Automation, Tsinghua University, Beijing 100084, China
Recommended Citation
GB/T 7714: Shiyi Lin, Deming Zhai, Feilong Zhang, et al. Overhead-free Noise-tolerant Federated Learning: A New Baseline[J]. Machine Intelligence Research, 2024, 21(3): 526-537.
APA: Shiyi Lin, Deming Zhai, Feilong Zhang, Junjun Jiang, Xianming Liu, & Xiangyang Ji. (2024). Overhead-free Noise-tolerant Federated Learning: A New Baseline. Machine Intelligence Research, 21(3), 526-537.
MLA: Shiyi Lin, et al. "Overhead-free Noise-tolerant Federated Learning: A New Baseline". Machine Intelligence Research 21.3 (2024): 526-537.
Files in This Item:
MIR-2023-03-027.pdf (1816 KB), Journal Article, Published Version, Open Access, License: CC BY-NC-SA
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.