Paper Title

Best Practices for Data-Efficient Modeling in NLG: How to Train Production-Ready Neural Models with Less Data

Paper Authors

Ankit Arun, Soumya Batra, Vikas Bhardwaj, Ashwini Challa, Pinar Donmez, Peyman Heidari, Hakan Inan, Shashank Jain, Anuj Kumar, Shawn Mei, Karthik Mohan, Michael White

Paper Abstract

Natural language generation (NLG) is a critical component in conversational systems, owing to its role of formulating a correct and natural text response. Traditionally, NLG components have been deployed using template-based solutions. Although neural network solutions recently developed in the research community have been shown to provide several benefits, deployment of such model-based solutions has been challenging due to high latency, correctness issues, and high data needs. In this paper, we present approaches that have helped us deploy data-efficient neural solutions for NLG in conversational systems to production. We describe a family of sampling and modeling techniques to attain production quality with light-weight neural network models using only a fraction of the data that would be necessary otherwise, and show a thorough comparison between each. Our results show that domain complexity dictates the appropriate approach to achieve high data efficiency. Finally, we distill the lessons from our experimental findings into a list of best practices for production-level NLG model development, and present them in a brief runbook. Importantly, the end products of all of the techniques are small sequence-to-sequence models (2Mb) that we can reliably deploy in production.
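
The abstract refers to a family of sampling techniques for training NLG models on only a fraction of the data. The paper itself does not appear in this excerpt, so the sketch below is only a generic, hypothetical illustration of one such idea: subsampling (meaning representation, response) pairs so that distinct structural scenarios stay covered. All function names and the toy data are invented and are not taken from the paper.

```python
# Hypothetical sketch: scenario-based subsampling of NLG training pairs.
# Not the authors' method; it only illustrates keeping a small,
# structurally diverse subset of (meaning representation, response) data.

import random
from collections import defaultdict


def scenario_key(meaning_representation):
    """Key an example by its dialogue act and the set of slot names,
    ignoring slot values, so sampling covers distinct structures."""
    act = meaning_representation["act"]
    slots = tuple(sorted(meaning_representation["slots"].keys()))
    return (act, slots)


def subsample_by_scenario(examples, per_scenario=2, seed=0):
    """Keep at most `per_scenario` examples per (act, slot-set) scenario."""
    rng = random.Random(seed)
    buckets = defaultdict(list)
    for mr, text in examples:
        buckets[scenario_key(mr)].append((mr, text))
    sampled = []
    for bucket in buckets.values():
        rng.shuffle(bucket)
        sampled.extend(bucket[:per_scenario])
    return sampled


if __name__ == "__main__":
    # Toy weather-domain examples, invented for illustration only.
    data = [
        ({"act": "inform", "slots": {"temp": "75F", "condition": "sunny"}},
         "It is 75F and sunny."),
        ({"act": "inform", "slots": {"temp": "40F", "condition": "rainy"}},
         "Expect rain and a chilly 40F."),
        ({"act": "inform", "slots": {"condition": "cloudy"}},
         "It looks cloudy out."),
    ]
    small_train_set = subsample_by_scenario(data, per_scenario=1)
    print(f"kept {len(small_train_set)} of {len(data)} examples")
```

A small, scenario-diverse subset like this could then be used to train a lightweight sequence-to-sequence model; the actual sampling and modeling choices in the paper should be taken from the paper itself.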
