Federated Learning (FL) has emerged as a means of distributed learning that uses local data stored at clients with a coordinating server. Recent studies have shown that FL can suffer from poor performance and slow convergence when the training data at the clients are not independent and identically distributed (IID). Here, we consider auxiliary server learning as an approach to improving the performance of FL on non-IID data. Our analysis and experiments show that this approach can achieve significant improvements in both model accuracy and convergence time, even when the dataset used by the server is small and its distribution differs from that of the clients' aggregate data. Moreover, experimental results suggest that auxiliary server learning delivers benefits when employed together with other techniques proposed to mitigate the performance degradation of FL on non-IID data.
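The abstract does not specify how the server's auxiliary training is interleaved with aggregation. Below is a minimal sketch of one plausible reading: a standard FedAvg round in which the server, after averaging client models, takes additional SGD steps on its own small dataset. All names (`local_sgd`, `federated_round`, `server_lr_scale`) and the choice of logistic regression are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch: FedAvg plus an auxiliary server-side update step.
# Assumption: the server refines the averaged model on its own small
# dataset each round; the paper's exact procedure may differ.
import numpy as np

def local_sgd(w, X, y, lr=0.1, epochs=1):
    """Plain SGD on one party's data (logistic regression), returning new weights."""
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + np.exp(-xi @ w))  # sigmoid prediction
            w = w - lr * (p - yi) * xi         # logistic-loss gradient step
    return w

def federated_round(w, client_data, server_data, server_lr_scale=1.0):
    # 1) Each client trains locally on its (possibly non-IID) shard.
    client_ws = [local_sgd(w.copy(), X, y) for X, y in client_data]
    # 2) Server averages the client models (FedAvg aggregation).
    w = np.mean(client_ws, axis=0)
    # 3) Auxiliary server learning (assumed form): the server takes SGD
    #    steps on its own small dataset to refine the averaged model.
    X_s, y_s = server_data
    return local_sgd(w, X_s, y_s, lr=0.1 * server_lr_scale)

# Toy usage with synthetic data: 4 clients, a small server dataset, 5 rounds.
rng = np.random.default_rng(0)
d = 5
w = np.zeros(d)
clients = [(rng.normal(size=(20, d)), rng.integers(0, 2, 20).astype(float))
           for _ in range(4)]
server = (rng.normal(size=(10, d)), rng.integers(0, 2, 10).astype(float))
for _ in range(5):
    w = federated_round(w, clients, server)
```

One design note implied by the abstract: the server's dataset may be small and distributed differently from the clients' aggregate data, so in practice the server-side step would likely be kept lightweight (a small learning rate or few epochs) to avoid overfitting the model to the auxiliary distribution.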
Full text: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11523028 (PMC)
DOI: http://dx.doi.org/10.1109/tai.2024.3430250