Paper Title

Back to the Future: Bidirectional Information Decoupling Network for Multi-turn Dialogue Modeling

Paper Authors

Yiyang Li, Hai Zhao, Zhuosheng Zhang

Paper Abstract

Multi-turn dialogue modeling, as a challenging branch of natural language understanding (NLU), aims to build representations for machines to understand human dialogues, which provides a solid foundation for multiple downstream tasks. Recent studies of dialogue modeling commonly employ pre-trained language models (PrLMs) to encode the dialogue history as successive tokens, which is insufficient in capturing the temporal characteristics of dialogues. Therefore, we propose Bidirectional Information Decoupling Network (BiDeN) as a universal dialogue encoder, which explicitly incorporates both the past and future contexts and can be generalized to a wide range of dialogue-related tasks. Experimental results on datasets of different downstream tasks demonstrate the universality and effectiveness of our BiDeN.
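The abstract only states that BiDeN decouples the past and future contexts of a dialogue instead of encoding the whole history as one flat token sequence; it does not describe the architecture itself. The sketch below is a minimal, hypothetical illustration of that decoupling idea in plain Python: the function name `decouple_contexts` and the segmentation scheme are assumptions for illustration, not the paper's actual method.

```python
# Illustrative sketch only: splits a multi-turn dialogue into past-,
# current-, and future-context segments relative to a target utterance.
# All names here are hypothetical and not taken from the BiDeN paper.

from typing import List, Tuple


def decouple_contexts(
    utterances: List[str], target_idx: int
) -> Tuple[List[str], str, List[str]]:
    """Return (past, current, future) segments around the target utterance."""
    if not 0 <= target_idx < len(utterances):
        raise IndexError("target_idx is out of range")
    past = utterances[:target_idx]          # turns spoken before the target
    current = utterances[target_idx]        # the target utterance itself
    future = utterances[target_idx + 1:]    # turns spoken after the target
    return past, current, future


# Example: decouple the 2nd turn of a 4-turn dialogue.
dialogue = ["Hi, how are you?", "Fine, thanks.", "Any plans today?", "A short hike."]
past, current, future = decouple_contexts(dialogue, target_idx=1)
print(past)     # ['Hi, how are you?']
print(current)  # Fine, thanks.
print(future)   # ['Any plans today?', 'A short hike.']
```

Under this reading, a dialogue encoder would build separate representations for the past and future segments of each turn and then fuse them, rather than relying on a single left-to-right pass over the concatenated history.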
