Paper Title
Towards Minimal Supervision BERT-based Grammar Error Correction
Paper Authors
Paper Abstract
Current grammatical error correction (GEC) models typically treat the task as sequence generation, which requires large amounts of annotated data and limits their applicability in data-constrained settings. We try to incorporate contextual information from a pre-trained language model to make better use of available annotations and to benefit multilingual scenarios. Results show the strong potential of Bidirectional Encoder Representations from Transformers (BERT) for the grammatical error correction task.
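The core idea the abstract points to, using a pre-trained masked language model's contextual predictions as a correction signal, can be sketched as follows. This is a minimal illustration only, not the paper's actual method; the Hugging Face `transformers` library, the `bert-base-cased` checkpoint, the example sentence, and the hand-picked suspected error word are all assumptions.

```python
# Minimal sketch: use BERT's masked-LM head to propose a correction for a
# suspected error word. Illustration under stated assumptions, not the
# authors' implementation.
import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
model = BertForMaskedLM.from_pretrained("bert-base-cased")
model.eval()

# Mask the suspected error word ("go" in "She go to school every day.")
# and let BERT rank contextually appropriate replacements.
text = f"She {tokenizer.mask_token} to school every day."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the top-5 candidate tokens.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_pos[0]].topk(5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))  # likely includes "goes"
```

Comparing the original word's score against such candidates yields an error-detection and correction signal that requires no task-specific annotation, which is the kind of minimally supervised use of contextual information the abstract describes.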