Paper Title

FedCluster: Boosting the Convergence of Federated Learning via Cluster-Cycling

Paper Authors

Cheng Chen, Ziyi Chen, Yi Zhou, Bhavya Kailkhura

Paper Abstract

We develop FedCluster, a novel federated learning framework with improved optimization efficiency, and investigate its theoretical convergence properties. FedCluster groups the devices into multiple clusters that perform federated learning cyclically in each learning round. Therefore, each learning round of FedCluster consists of multiple cycles of meta-update that boost the overall convergence. In nonconvex optimization, we show that FedCluster with the devices implementing the local stochastic gradient descent (SGD) algorithm achieves a faster convergence rate than the conventional federated averaging (FedAvg) algorithm in the presence of device-level data heterogeneity. We conduct experiments on deep learning applications and demonstrate that FedCluster converges significantly faster than conventional federated learning under diverse levels of device-level data heterogeneity for a variety of local optimizers.
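
To make the cluster-cycling scheme concrete, below is a minimal sketch of FedCluster with local SGD on a toy linear-regression objective. This is not the authors' implementation: the dataset construction, the uniform cluster split, and all hyperparameters (DIM, NUM_CLUSTERS, LOCAL_STEPS, LR, ROUNDS) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem sizes and hyperparameters (illustrative assumptions).
DIM, NUM_DEVICES, NUM_CLUSTERS = 10, 20, 4
LOCAL_STEPS, LR, ROUNDS = 5, 0.05, 50

# Each device holds its own local dataset; a distinct w_true per device
# models the device-level data heterogeneity discussed in the abstract.
device_data = []
for _ in range(NUM_DEVICES):
    X = rng.normal(size=(32, DIM))
    w_true = rng.normal(size=DIM)
    y = X @ w_true + 0.1 * rng.normal(size=32)
    device_data.append((X, y))

def local_sgd(w, X, y, steps, lr):
    """Run `steps` of local SGD on a device's squared loss."""
    w = w.copy()
    for _ in range(steps):
        i = rng.integers(len(y))
        grad = (X[i] @ w - y[i]) * X[i]  # stochastic gradient of 0.5*(x'w - y)^2
        w -= lr * grad
    return w

# Uniform split of devices into clusters (the split need not be uniform).
clusters = np.array_split(np.arange(NUM_DEVICES), NUM_CLUSTERS)

w_global = np.zeros(DIM)
for rnd in range(ROUNDS):
    # Cluster-cycling: the clusters take turns within one learning round,
    # so a round consists of NUM_CLUSTERS meta-updates instead of one.
    for cluster in clusters:
        local_models = [
            local_sgd(w_global, *device_data[d], LOCAL_STEPS, LR)
            for d in cluster
        ]
        w_global = np.mean(local_models, axis=0)  # meta-update: cluster average
    if rnd % 10 == 0:
        loss = np.mean([np.mean((X @ w_global - y) ** 2) / 2 for X, y in device_data])
        print(f"round {rnd}: global loss {loss:.4f}")
```

For contrast, conventional FedAvg would perform a single averaging step over all devices per round; the inner loop above instead performs NUM_CLUSTERS sequential meta-updates per round, which is the mechanism the abstract credits for the faster convergence.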
