Paper Title

Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization

Authors

Jianyu Wang, Qinghua Liu, Hao Liang, Gauri Joshi, H. Vincent Poor

Abstract

In federated optimization, heterogeneity in the clients' local datasets and computation speeds results in large variations in the number of local updates performed by each client in each communication round. Naive weighted aggregation of such models causes objective inconsistency, that is, the global model converges to a stationary point of a mismatched objective function which can be arbitrarily different from the true objective. This paper provides a general framework to analyze the convergence of federated heterogeneous optimization algorithms. It subsumes previously proposed methods such as FedAvg and FedProx and provides the first principled understanding of the solution bias and the convergence slowdown due to objective inconsistency. Using insights from this analysis, we propose FedNova, a normalized averaging method that eliminates objective inconsistency while preserving fast error convergence.
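To make the contrast concrete, below is a minimal sketch of the two aggregation rules the abstract contrasts, specialized to plain local SGD (FedNova's general form uses per-client accumulation vectors; here each client's normalizer is simply its number of local steps). The function names, weights, and toy setup are illustrative, not the paper's reference implementation.

```python
import numpy as np

def naive_aggregate(x, deltas, weights):
    """FedAvg-style aggregation: weight each client's cumulative local
    update delta_i by p_i. Clients that ran more local steps contribute
    proportionally larger deltas, which skews the implicit objective."""
    return x + sum(p * d for p, d in zip(weights, deltas))

def fednova_aggregate(x, deltas, weights, local_steps):
    """FedNova-style normalized averaging (plain local-SGD case):
    divide each client's delta by its number of local steps tau_i,
    average, then rescale by tau_eff = sum_i p_i * tau_i so the
    effective step size is preserved."""
    tau_eff = sum(p * t for p, t in zip(weights, local_steps))
    normalized = sum(p * d / t
                     for p, d, t in zip(weights, deltas, local_steps))
    return x + tau_eff * normalized
```

For example, if two equally weighted clients push in orthogonal directions but one runs ten local steps while the other runs one, naive aggregation is dominated by the faster client, whereas normalized averaging treats both directions symmetrically.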
