Paper Title

Transformers are Short Text Classifiers: A Study of Inductive Short Text Classifiers on Benchmarks and Real-world Datasets

Paper Authors

Fabian Karl, Ansgar Scherp

Paper Abstract

Short text classification is a crucial and challenging aspect of Natural Language Processing. For this reason, there are numerous highly specialized short text classifiers. However, in recent short text research, State of the Art (SOTA) methods for traditional text classification, particularly the pure use of Transformers, have been unexploited. In this work, we examine the performance of a variety of short text classifiers as well as the top-performing traditional text classifier. We further investigate the effects on two new real-world short text datasets in an effort to address the issue of becoming overly dependent on benchmark datasets with a limited number of characteristics. Our experiments unambiguously demonstrate that Transformers achieve SOTA accuracy on short text classification tasks, raising the question of whether specialized short text techniques are necessary.
