Paper Title

Orchestra: Unsupervised Federated Learning via Globally Consistent Clustering

Paper Authors

Ekdeep Singh Lubana, Chi Ian Tang, Fahim Kawsar, Robert P. Dick, Akhil Mathur

Paper Abstract

Federated learning is generally used in tasks where labels are readily available (e.g., next word prediction). Relaxing this constraint requires design of unsupervised learning techniques that can support desirable properties for federated training: robustness to statistical/systems heterogeneity, scalability with number of participants, and communication efficiency. Prior work on this topic has focused on directly extending centralized self-supervised learning techniques, which are not designed to have the properties listed above. To address this situation, we propose Orchestra, a novel unsupervised federated learning technique that exploits the federation's hierarchy to orchestrate a distributed clustering task and enforce a globally consistent partitioning of clients' data into discriminable clusters. We show the algorithmic pipeline in Orchestra guarantees good generalization performance under a linear probe, allowing it to outperform alternative techniques in a broad range of conditions, including variation in heterogeneity, number of clients, participation ratio, and local epochs.
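The abstract describes Orchestra as exploiting the federation's hierarchy to orchestrate a distributed clustering task whose output is a globally consistent partition of clients' data into discriminable clusters. Below is a minimal, hypothetical sketch of such a two-level clustering step, assuming that each client summarizes its local representations into a small set of local centroids and that the server clusters those summaries into shared global centers broadcast back to clients. The function names, cluster counts, and use of scikit-learn's KMeans are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical two-level (client/server) clustering sketch.
# All names and parameters are illustrative assumptions, not Orchestra's code.

import numpy as np
from sklearn.cluster import KMeans


def client_local_centroids(local_embeddings: np.ndarray, n_local: int = 16) -> np.ndarray:
    """Summarize one client's embeddings into a few local centroids (assumed client-side step)."""
    kmeans = KMeans(n_clusters=n_local, n_init=10, random_state=0)
    kmeans.fit(local_embeddings)
    return kmeans.cluster_centers_


def server_global_centroids(all_local_centroids: list[np.ndarray], n_global: int = 8) -> np.ndarray:
    """Cluster the pooled local centroids into global centers shared by all clients (assumed server-side step)."""
    pooled = np.concatenate(all_local_centroids, axis=0)
    kmeans = KMeans(n_clusters=n_global, n_init=10, random_state=0)
    kmeans.fit(pooled)
    return kmeans.cluster_centers_


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulate embeddings from 4 clients with heterogeneous (non-IID) data.
    client_embeddings = [rng.normal(loc=i, scale=1.0, size=(200, 32)) for i in range(4)]
    local = [client_local_centroids(c) for c in client_embeddings]
    global_centers = server_global_centroids(local)
    print("global centroid matrix shape:", global_centers.shape)  # (8, 32)
```

In this sketch, only centroid summaries leave each client, which is one plausible way a hierarchical design could keep communication cost low while still yielding a single cluster assignment that is consistent across clients.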
