Paper Title

Evidence-Aware Inferential Text Generation with Vector Quantised Variational AutoEncoder

Authors

Daya Guo, Duyu Tang, Nan Duan, Jian Yin, Daxin Jiang, Ming Zhou

Abstract

Generating inferential texts about an event from different perspectives requires reasoning over the different contexts in which the event occurs. Existing works usually ignore context that is not explicitly provided, resulting in a context-independent semantic representation that struggles to support the generation. To address this, we propose an approach that automatically finds evidence for an event from a large text corpus and leverages the evidence to guide the generation of inferential texts. Our approach works in an encoder-decoder manner and is equipped with a Vector Quantised-Variational AutoEncoder, where the encoder outputs representations from a distribution over discrete variables. Such discrete representations enable automatically selecting relevant evidence, which not only facilitates evidence-aware generation, but also provides a natural way to uncover rationales behind the generation. Our approach provides state-of-the-art performance on both the Event2Mind and ATOMIC datasets. More importantly, we find that with discrete representations, our model selectively uses evidence to generate different inferential texts.
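The abstract's key mechanism is the VQ-VAE quantisation step: each continuous encoder output is snapped to the nearest entry of a learned discrete codebook, giving the discrete latent variables used to select evidence. Below is a minimal NumPy sketch of that nearest-neighbour lookup only; the names `quantise` and `codebook` are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def quantise(z_e, codebook):
    """Map each encoder output vector to its nearest codebook entry.

    z_e:      (batch, d) continuous encoder outputs
    codebook: (K, d) learned discrete latent embeddings
    Returns the quantised vectors (batch, d) and the chosen code indices (batch,).
    """
    # Squared Euclidean distance between every encoder output and every code.
    dists = ((z_e[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    indices = dists.argmin(axis=1)   # discrete latent assignment per example
    return codebook[indices], indices

rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 4))   # K = 8 codes, each of dimension 4
z_e = rng.normal(size=(3, 4))        # 3 encoder outputs
z_q, idx = quantise(z_e, codebook)   # z_q rows are exact codebook entries
```

In a full VQ-VAE the codebook is trained jointly with the encoder and decoder (with a straight-through gradient estimator); here only the inference-time lookup is shown.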
