Paper Title

Think Locally, Act Globally: Federated Learning with Local and Global Representations

Authors

Paul Pu Liang, Terrance Liu, Liu Ziyin, Nicholas B. Allen, Randy P. Auerbach, David Brent, Ruslan Salakhutdinov, Louis-Philippe Morency

Abstract

Federated learning is a method of training models on private data distributed over multiple devices. To keep device data private, the global model is trained by only communicating parameters and updates, which poses scalability challenges for large models. To this end, we propose a new federated learning algorithm that jointly learns compact local representations on each device and a global model across all devices. As a result, the global model can be smaller since it only operates on local representations, reducing the number of communicated parameters. Theoretically, we provide a generalization analysis showing that a combination of local and global models reduces both variance in the data and variance across device distributions. Empirically, we demonstrate that local models enable communication-efficient training while retaining performance. We also evaluate our approach on the task of personalized mood prediction from real-world mobile data, where privacy is key. Finally, local models handle heterogeneous data from new devices and learn fair representations that obfuscate protected attributes such as race, age, and gender.
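The core idea in the abstract — a private local encoder per device whose compact representation feeds a small, shared global model, so that only the global model's parameters are ever communicated — can be illustrated with a minimal sketch. This is an assumption-laden toy (linear models with a tanh encoder, plain FedAvg-style averaging, hypothetical `Device` and `federated_round` names), not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

class Device:
    """One client: a private local encoder plus a copy of the shared global head."""

    def __init__(self, in_dim, rep_dim, out_dim):
        # Local representation model: stays on-device, never communicated.
        self.W_local = rng.normal(scale=0.1, size=(in_dim, rep_dim))
        # Global model: operates only on the compact local representation,
        # so it has rep_dim * out_dim parameters instead of in_dim * out_dim.
        self.W_global = np.zeros((rep_dim, out_dim))

    def encode(self, x):
        return np.tanh(x @ self.W_local)

    def local_step(self, x, y, lr=0.1):
        """One SGD step on both models; only the global parameters leave the device."""
        h = self.encode(x)
        err = h @ self.W_global - y
        # Gradient for the shared global model.
        grad_global = h.T @ err / len(x)
        # Backprop through the tanh encoder; this update stays private.
        grad_h = err @ self.W_global.T
        grad_local = x.T @ (grad_h * (1 - h ** 2)) / len(x)
        self.W_local -= lr * grad_local
        self.W_global -= lr * grad_global
        return self.W_global  # the only parameters communicated to the server

def federated_round(devices, data):
    """Each device trains locally; the server averages only the small global heads."""
    heads = [d.local_step(x, y) for d, (x, y) in zip(devices, data)]
    avg = np.mean(heads, axis=0)
    for d in devices:
        d.W_global = avg.copy()  # broadcast the averaged global model back
    return avg
```

The communication saving is visible in the shapes: each round sends a `rep_dim x out_dim` head per device rather than the full `in_dim`-sized model, while each `W_local` (and hence whatever device-specific structure it captures) never leaves the client.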
