Paper Title

Multi-Level Branched Regularization for Federated Learning

Authors

Jinkyu Kim, Geeho Kim, Bohyung Han

Abstract

A critical challenge of federated learning is data heterogeneity and imbalance across clients, which leads to inconsistency between local networks and unstable convergence of global models. To alleviate these limitations, we propose a novel architectural regularization technique that constructs multiple auxiliary branches in each local model by grafting local and global subnetworks at several different levels, and that learns representations in the main pathway of the local model congruent with the auxiliary hybrid pathways via online knowledge distillation. The proposed technique effectively robustifies the global model even in the non-iid setting, and can be conveniently applied to various federated learning frameworks without incurring extra communication costs. We perform comprehensive empirical studies and demonstrate remarkable performance gains in terms of accuracy and efficiency compared to existing methods. The source code is available at our project page.
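The core idea of the hybrid pathways can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the blocks, shapes, and the mean-squared distillation loss below are illustrative stand-ins (the paper's online knowledge distillation would operate on the real network's features or logits). The sketch only shows the structure: a main pathway runs entirely through local blocks, while each auxiliary branch runs the first `k` blocks locally and the remaining blocks through the frozen global model received from the server.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_blocks = 8, 4  # illustrative feature width and number of network stages

# Stand-in "blocks": each stage is a random linear map followed by ReLU.
local_blocks = [rng.standard_normal((dim, dim)) * 0.1 for _ in range(n_blocks)]
# Global model from the server, kept frozen; perturbed here to mimic the
# local/global divergence that arises under non-iid client data.
global_blocks = [w + 0.01 * rng.standard_normal((dim, dim)) for w in local_blocks]

def forward(x, blocks, start=0, end=None):
    """Run x through blocks[start:end]."""
    for w in blocks[start:end]:
        x = np.maximum(x @ w, 0.0)
    return x

x = rng.standard_normal((2, dim))           # a toy mini-batch of 2 samples
main_out = forward(x, local_blocks)          # main (fully local) pathway

# Auxiliary hybrid pathways, one per grafting level k:
# first k blocks are local, the remaining blocks come from the global model.
kd_loss = 0.0
for k in range(1, n_blocks):
    h = forward(x, local_blocks, 0, k)
    hybrid_out = forward(h, global_blocks, k, None)
    # Distillation term pulling the main pathway toward each hybrid pathway
    # (MSE used here as a simple proxy for the distillation objective).
    kd_loss += np.mean((main_out - hybrid_out) ** 2)
```

In actual training, `kd_loss` would be added to the local task loss and minimized by gradient descent over the local blocks only; because the global blocks are already on the client, the auxiliary branches add no communication cost, consistent with the claim in the abstract.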
