Paper Title
Stepwise Extractive Summarization and Planning with Structured Transformers
Paper Authors
Paper Abstract
We propose encoder-centric stepwise models for extractive summarization using structured transformers -- HiBERT and Extended Transformers. We enable stepwise summarization by injecting the previously generated summary into the structured transformer as an auxiliary sub-structure. Our models are not only efficient in modeling the structure of long inputs, but they also do not rely on task-specific redundancy-aware modeling, making them general-purpose extractive content planners for different tasks. When evaluated on CNN/DailyMail extractive summarization, stepwise models achieve state-of-the-art performance in terms of ROUGE without any redundancy-aware modeling or sentence filtering. The same holds for Rotowire table-to-text generation, where our models surpass previously reported metrics for content selection, planning, and ordering, highlighting the strength of stepwise modeling. Of the two structured transformers we test, stepwise Extended Transformers provide the best performance across both datasets and set a new standard for these challenges.
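The sketch below illustrates the stepwise idea the abstract describes: at each step, the sentences selected so far are fed back to the encoder as an auxiliary sub-structure, and the model re-scores the remaining document sentences. This is a minimal illustration, not the authors' code; `StepwiseEncoder` is a hypothetical toy stand-in for a structured transformer such as HiBERT or Extended Transformers, and the greedy selection loop is an assumption about how stepwise decoding could be realized.

```python
# Minimal sketch of stepwise extractive selection (assumed, not the paper's code).
import torch
import torch.nn as nn

class StepwiseEncoder(nn.Module):
    """Toy encoder: scores each document sentence conditioned on the
    previously selected summary (the auxiliary sub-structure)."""

    def __init__(self, dim: int = 64):
        super().__init__()
        self.doc_proj = nn.Linear(dim, dim)
        self.sum_proj = nn.Linear(dim, dim)
        self.scorer = nn.Linear(dim, 1)

    def forward(self, doc: torch.Tensor, summary: torch.Tensor) -> torch.Tensor:
        # doc: (num_sentences, dim) sentence representations
        # summary: (num_selected, dim); empty on the first step
        ctx = self.sum_proj(summary).mean(dim=0) if summary.shape[0] else 0.0
        h = torch.tanh(self.doc_proj(doc) + ctx)  # condition on the summary so far
        return self.scorer(h).squeeze(-1)         # one score per sentence

@torch.no_grad()
def stepwise_extract(model: StepwiseEncoder, doc: torch.Tensor, budget: int = 3):
    """Greedy stepwise loop: pick one sentence, inject it, re-score, repeat."""
    selected: list[int] = []
    for _ in range(budget):
        summary = doc[selected] if selected else doc.new_zeros((0, doc.shape[1]))
        scores = model(doc, summary)
        scores[selected] = float("-inf")          # never pick a sentence twice
        selected.append(int(scores.argmax()))
    return selected

doc = torch.randn(10, 64)                         # 10 sentences, toy embeddings
print(stepwise_extract(StepwiseEncoder(), doc))   # e.g. [4, 1, 7]
```

Because the summary-so-far is re-encoded at every step, redundancy can be handled implicitly by the encoder rather than by a task-specific redundancy module, which is the property the abstract highlights.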