FedNova: Normalized Averaging for Heterogeneous Systems
Communication-Efficient Learning of Deep Networks from Decentralized Data, H. Brendan McMahan, Eider Moore, Daniel Ramage, Seth Hampson, and Blaise Aguera y Arcas, 2017. Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS), Vol. 54 (PMLR). DOI: 10.48550/arXiv.1602.05629 - This foundational paper introduced Federated Averaging (FedAvg), a benchmark algorithm for communication-efficient federated learning that serves as the baseline for methods addressing heterogeneity.
Federated Learning: Challenges, Methods, and Future Directions, Tian Li, Anit Kumar Sahu, Ameet Talwalkar, and Virginia Smith, 2020. IEEE Signal Processing Magazine, Vol. 37, No. 3 (IEEE) - A comprehensive survey of federated learning that covers its core challenges, including systems heterogeneity, and discusses current methods and future research directions in the field.
When Does Federated Averaging Not Benefit From More Local Steps?, Jianyu Wang and Gauri Joshi, 2021. Proceedings of the 24th International Conference on Artificial Intelligence and Statistics (AISTATS), Vol. 130 (PMLR) - Provides a theoretical analysis of how the number of local steps affects the convergence of Federated Averaging, shedding light on the issues that FedNova addresses to improve performance in heterogeneous settings.
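To make the normalized averaging concrete, here is a minimal sketch of FedNova's server-side aggregation for the special case where every client runs plain local SGD with the same learning rate. The function and argument names below are illustrative, not taken from any particular library.

```python
import numpy as np

def fednova_aggregate(global_weights, client_weights, client_steps, client_data_sizes, lr):
    """Sketch of FedNova-style normalized averaging (plain local SGD assumed)."""
    p = np.asarray(client_data_sizes, dtype=float)
    p /= p.sum()                                  # data-size weights p_i
    taus = np.asarray(client_steps, dtype=float)  # local step counts tau_i
    tau_eff = float(p @ taus)                     # effective number of local steps

    # Each client's cumulative update is normalized by its own step count tau_i,
    # so clients that ran more local steps do not dominate the average direction.
    d = [(global_weights - w_i) / (lr * tau_i)
         for w_i, tau_i in zip(client_weights, taus)]

    # Server update: x <- x - lr * tau_eff * sum_i p_i * d_i
    avg_d = sum(p_i * d_i for p_i, d_i in zip(p, d))
    return global_weights - lr * tau_eff * avg_d
```

When all clients perform the same number of local steps, this reduces to standard FedAvg weight averaging; the normalization only changes the result when step counts differ across clients.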