Paper Title

Federated Split BERT for Heterogeneous Text Classification

Authors

Zhengyang Li, Shijing Si, Jianzong Wang, Jing Xiao

Abstract

Pre-trained BERT models have achieved impressive performance on many natural language processing (NLP) tasks. However, in many real-world situations, textual data are usually decentralized over many clients and cannot be uploaded to a central server due to privacy protection and regulations. Federated learning (FL) enables multiple clients to collaboratively train a global model while keeping local data private. A few studies have investigated BERT in the federated learning setting, but the problem of performance loss caused by heterogeneous (e.g., non-IID) data over clients remains under-explored. To address this issue, we propose a framework, FedSplitBERT, which handles heterogeneous data and decreases the communication cost by splitting the BERT encoder layers into a local part and a global part. The local part parameters are trained by the local client only, while the global part parameters are trained by aggregating the gradients of multiple clients. Due to the sheer size of BERT, we explore a quantization method to further reduce the communication cost with minimal performance loss. Our framework is ready-to-use and compatible with many existing federated learning algorithms, including FedAvg, FedProx and FedAdam. Our experiments verify the effectiveness of the proposed framework, which outperforms baseline methods by a significant margin, while FedSplitBERT with quantization can reduce the communication cost by $11.9\times$.
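To make the local/global split concrete, below is a minimal sketch (not the authors' implementation) of how the partition and the global-only aggregation could look in PyTorch with Hugging Face `transformers`. The split index `SPLIT_LAYER`, the function names, and the choice to keep embeddings local while sharing the pooler and classifier head are all illustrative assumptions; the paper chooses the split layer as a hyperparameter and additionally quantizes the transmitted global parameters, which this sketch omits.

```python
# Hedged sketch of the FedSplitBERT idea: encoder layers below SPLIT_LAYER
# stay on each client (local part); layers at or above it are averaged
# across clients FedAvg-style (global part). Not the authors' code.
import copy

import torch
from transformers import BertForSequenceClassification

SPLIT_LAYER = 6  # hypothetical split: layers 0-5 local, layers 6-11 global


def is_global_param(name: str, split_layer: int = SPLIT_LAYER) -> bool:
    """Treat encoder layers >= split_layer as global."""
    if name.startswith("bert.encoder.layer."):
        layer_idx = int(name.split(".")[3])
        return layer_idx >= split_layer
    # Embeddings stay local; pooler and classifier head are shared here
    # (an assumption -- the paper may assign them differently).
    return not name.startswith("bert.embeddings.")


def aggregate_global(client_models, server_model):
    """Uniform FedAvg-style averaging over the global part only."""
    client_states = [dict(m.named_parameters()) for m in client_models]
    with torch.no_grad():
        for name, server_param in server_model.named_parameters():
            if is_global_param(name):
                stacked = torch.stack([cs[name].data for cs in client_states])
                server_param.data.copy_(stacked.mean(dim=0))


def broadcast_global(server_model, client_models):
    """Push the aggregated global part back; local parts stay client-specific."""
    server_params = dict(server_model.named_parameters())
    with torch.no_grad():
        for model in client_models:
            for name, param in model.named_parameters():
                if is_global_param(name):
                    param.data.copy_(server_params[name].data)


if __name__ == "__main__":
    server = BertForSequenceClassification.from_pretrained("bert-base-uncased")
    clients = [copy.deepcopy(server) for _ in range(3)]
    # ... each client trains locally on its own (possibly non-IID) data ...
    aggregate_global(clients, server)
    broadcast_global(server, clients)
```

Because only the global part crosses the network each round, the per-round communication cost shrinks roughly in proportion to the fraction of layers synchronized; quantizing those transmitted parameters, as the paper does, compresses the traffic further.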
