Paper Title

Contextualized Attention-based Knowledge Transfer for Spoken Conversational Question Answering

Paper Authors

Chenyu You, Nuo Chen, Yuexian Zou

Paper Abstract

Spoken conversational question answering (SCQA) requires machines to model complex dialogue flow given the speech utterances and text corpora. Different from traditional text question answering (QA) tasks, SCQA involves audio signal processing, passage comprehension, and contextual understanding. However, ASR systems introduce unexpected noisy signals to the transcriptions, which result in performance degradation on SCQA. To overcome the problem, we propose CADNet, a novel contextualized attention-based distillation approach, which applies both cross-attention and self-attention to obtain ASR-robust contextualized embedding representations of the passage and dialogue history for performance improvements. We also introduce the spoken conventional knowledge distillation framework to distill the ASR-robust knowledge from the estimated probabilities of the teacher model to the student. We conduct extensive experiments on the Spoken-CoQA dataset and demonstrate that our approach achieves remarkable performance in this task.
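The distillation component pairs a teacher trained on clean text with a student that reads noisy ASR transcripts. As a rough illustration only (the exact loss, temperature, and weighting used in CADNet are assumptions here, not taken from the paper), the following PyTorch sketch combines a temperature-softened KL term against the teacher's estimated probabilities with the standard cross-entropy term against the gold answer positions:

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Hypothetical knowledge-distillation objective: a temperature-softened
    # KL term against the teacher's probabilities plus the usual hard
    # cross-entropy term on the student's own predictions.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so the soft-target gradients keep a comparable magnitude.
    kd_term = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term

# Example: distill answer-span start logits from a teacher trained on clean
# transcripts to a student reading noisy ASR output (batch of 4, 384 tokens).
student_logits = torch.randn(4, 384)
teacher_logits = torch.randn(4, 384)
labels = torch.randint(0, 384, (4,))
loss = distillation_loss(student_logits, teacher_logits, labels)

Here alpha trades off how much the student follows the teacher's soft targets versus the ground-truth supervision; the value 0.5 is an illustrative default, not a setting reported in the paper.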
