Paper Title

Language Generation with Multi-Hop Reasoning on Commonsense Knowledge Graph

Authors

Haozhe Ji, Pei Ke, Shaohan Huang, Furu Wei, Xiaoyan Zhu, Minlie Huang

Abstract

Despite the success of generative pre-trained language models on a series of text generation tasks, they still suffer in cases where reasoning over underlying commonsense knowledge is required during generation. Existing approaches that integrate commonsense knowledge into generative pre-trained language models simply transfer relational knowledge by post-training on individual knowledge triples while ignoring rich connections within the knowledge graph. We argue that exploiting both the structural and semantic information of the knowledge graph facilitates commonsense-aware text generation. In this paper, we propose Generation with Multi-Hop Reasoning Flow (GRF) that enables pre-trained models with dynamic multi-hop reasoning on multi-relational paths extracted from the external commonsense knowledge graph. We empirically show that our model outperforms existing baselines on three text generation tasks that require reasoning over commonsense knowledge. We also demonstrate the effectiveness of the dynamic multi-hop reasoning module with reasoning paths inferred by the model that provide rationale to the generation.
