In:
INFORMS Journal on Computing, Institute for Operations Research and the Management Sciences (INFORMS) (2023-08-29)
Abstract:
Federated learning is a distributed machine learning framework in which numerous heterogeneous clients collaboratively train a model without sharing their training data. In this work, we consider a practical and ubiquitous issue when deploying federated learning in mobile environments: intermittent client availability, where the set of eligible clients may change during the training process. Such intermittent client availability seriously deteriorates the performance of the classical federated averaging algorithm (FedAvg). We therefore propose a simple distributed nonconvex optimization algorithm, called federated latest averaging (FedLaAvg), which leverages the latest gradients of all clients, even when some clients are not available, to jointly update the global model in each iteration. Our theoretical analysis shows that FedLaAvg achieves guaranteed convergence and a sublinear speedup with respect to the total number of clients. We implement FedLaAvg along with several baselines and evaluate them on the benchmark MNIST and Sentiment140 data sets. The evaluation results demonstrate that FedLaAvg achieves more stable training than FedAvg in both convex and nonconvex settings and reaches a sublinear speedup.
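The latest-averaging idea described in the abstract can be sketched compactly: the server caches one gradient per client, refreshes a client's cached gradient whenever that client is available, and updates the global model each round with the average of all cached gradients. Below is a minimal Python illustration under assumed interfaces; the ToyClient class, its is_available and compute_gradient methods, and the quadratic local objectives are hypothetical stand-ins for demonstration, and the paper's full FedLaAvg additionally uses a client-selection rule analyzed in its theory, which this sketch omits.

    import numpy as np

    rng = np.random.default_rng(0)

    class ToyClient:
        """Hypothetical client with a local quadratic objective."""
        def __init__(self, target, avail_prob=0.5):
            self.target = target          # local optimum of this client
            self.avail_prob = avail_prob  # chance of being online each round

        def is_available(self, t):
            # intermittent availability: online with probability avail_prob
            return rng.random() < self.avail_prob

        def compute_gradient(self, model):
            # gradient of the local loss 0.5 * ||model - target||^2
            return model - self.target

    def fedlaavg(model, clients, num_rounds=200, lr=0.1):
        # latest_grads[i] caches the most recent gradient from client i
        latest_grads = [np.zeros_like(model) for _ in clients]
        for t in range(num_rounds):
            for i, c in enumerate(clients):
                if c.is_available(t):     # only online clients refresh their cache
                    latest_grads[i] = c.compute_gradient(model)
            # global update averages the LATEST gradient of every client,
            # including clients that are currently offline
            model = model - lr * np.mean(latest_grads, axis=0)
        return model

    clients = [ToyClient(target=rng.normal(size=3)) for _ in range(10)]
    print(fedlaavg(np.zeros(3), clients))  # converges near the mean of the targets

Because offline clients still contribute their cached gradients, the update direction does not collapse onto whichever subset happens to be online, which is the source of FedAvg's instability under intermittent availability.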
Type of Medium:
Online Resource
ISSN:
1091-9856, 1526-5528
DOI:
10.1287/ijoc.2022.0057.cd
Language:
English
Publisher:
Institute for Operations Research and the Management Sciences (INFORMS)
Publication Date:
2023
ZDB ID:
2070411-2, 2004082-9
SSG:
3,2