Paper Title


On the Design of Communication-Efficient Federated Learning for Health Monitoring

Authors

Chu, Dong, Jaafar, Wael, Yanikomeroglu, Halim

Abstract


With the booming deployment of the Internet of Things, health monitoring applications have gradually prospered. During the recent COVID-19 pandemic, interest in permanent remote health monitoring solutions has risen, aiming to reduce contact and preserve limited medical resources. Among the technological methods for realizing efficient remote health monitoring, federated learning (FL) has drawn particular attention due to its robustness in preserving data privacy. However, FL can incur high communication costs, due to frequent transmissions between the FL server and its clients. To tackle this problem, we propose in this paper a communication-efficient federated learning (CEFL) framework that involves client clustering and transfer learning. First, we propose to group clients by computing similarity factors based on their neural network characteristics. Then, a representative client in each cluster is selected as the cluster leader. Unlike conventional FL, our method performs FL training only among the cluster leaders. Subsequently, each leader uses transfer learning to update its cluster members with the trained FL model. Finally, each member fine-tunes the received model on its own data. To further reduce communication costs, we opt for a partial-layer FL aggregation approach, which updates the neural network model only partially rather than fully. Through experiments, we show that CEFL can save up to 98.45% in communication costs while conceding less than 3% in accuracy, compared to conventional FL. Finally, CEFL demonstrates high accuracy for clients with small or unbalanced datasets.
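The two communication-saving ideas in the abstract — grouping clients by a model-similarity factor and averaging only a subset of layers among cluster leaders — can be illustrated with a minimal sketch. The paper does not publish its exact similarity metric or aggregation code here, so the cosine similarity over flattened weights, the greedy clustering rule, and the `partial_layer_fedavg` helper below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def similarity(w_a, w_b):
    """Cosine similarity between two flattened model weight vectors."""
    a, b = np.ravel(w_a), np.ravel(w_b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def cluster_clients(client_weights, threshold=0.9):
    """Greedy grouping: each client joins the first cluster whose leader
    (the cluster's first member) is sufficiently similar; otherwise it
    starts a new cluster and becomes its leader."""
    clusters = []
    for cid, w in enumerate(client_weights):
        for cluster in clusters:
            if similarity(client_weights[cluster[0]], w) >= threshold:
                cluster.append(cid)
                break
        else:
            clusters.append([cid])
    return clusters

def partial_layer_fedavg(leader_models, shared_layers):
    """Average only the named layers across the leaders' models.
    Layers outside `shared_layers` are never transmitted, which is
    where the communication saving comes from."""
    return {
        name: np.mean([m[name] for m in leader_models], axis=0)
        for name in shared_layers
    }

if __name__ == "__main__":
    # Three toy "clients": the first two have nearly parallel weights,
    # the third points the opposite way and forms its own cluster.
    weights = [np.ones(4), 1.01 * np.ones(4), -np.ones(4)]
    print(cluster_clients(weights))  # two clusters: [0, 1] and [2]

    # Leaders exchange and average only the "conv1" layer; "fc" stays local.
    leaders = [
        {"conv1": np.array([1.0, 2.0]), "fc": np.array([0.0])},
        {"conv1": np.array([3.0, 4.0]), "fc": np.array([10.0])},
    ]
    print(partial_layer_fedavg(leaders, ["conv1"]))
```

After clustering, only the leaders would run FL rounds with the server; members receive the leader's model via transfer learning and fine-tune locally, as described in the abstract.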
