Paper Title


Weakly-supervised Domain Adaption for Aspect Extraction via Multi-level Interaction Transfer

Authors

Tao Liang, Wenya Wang, Fengmao Lv

Abstract


Fine-grained aspect extraction is an essential sub-task in aspect-based opinion analysis. It aims to identify the aspect terms (a.k.a. opinion targets) of a product or service in each sentence. However, an expensive annotation process is usually required to acquire sufficient token-level labels for each domain. To address this limitation, some previous works propose domain adaptation strategies to transfer knowledge from a sufficiently labeled source domain to unlabeled target domains. However, due to both the difficulty of fine-grained prediction and the large gap between domains, the performance remains unsatisfactory. This work conducts a pioneering study on leveraging sentence-level aspect category labels, which are usually available in commercial services such as review sites, to promote token-level transfer for the extraction task. Specifically, the aspect category information is used to construct pivot knowledge for transfer, under the assumption that the interactions between sentence-level aspect categories and token-level aspect terms are invariant across domains. To this end, we propose a novel multi-level reconstruction mechanism that aligns both the fine-grained and coarse-grained information at multiple levels of abstraction. Comprehensive experiments demonstrate that our approach can fully utilize sentence-level aspect category labels to improve cross-domain aspect extraction with a large performance gain.
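As a rough illustration only (not the paper's actual architecture), the coupling the abstract describes can be sketched as a model with shared token representations feeding two heads: a token-level aspect-term tagger and a sentence-level category classifier whose attention weights tie the coarse-grained label back to individual tokens. All dimensions and weight names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical sizes: 6 tokens, hidden size 8, 3 BIO tags, 4 aspect categories.
T, d, n_tags, n_cats = 6, 8, 3, 4

H = rng.standard_normal((T, d))        # shared token representations

# Token-level head: per-token aspect-term tagging (e.g. BIO scheme).
W_tok = rng.standard_normal((d, n_tags))
tag_probs = softmax(H @ W_tok)         # (T, n_tags)

# Sentence-level head: attention pools tokens into one sentence vector,
# which predicts the aspect category. The attention weights `alpha` are
# the interaction linking the sentence-level label to individual tokens.
w_att = rng.standard_normal(d)
alpha = softmax(H @ w_att)             # (T,) token importance weights
s = alpha @ H                          # (d,) attention-pooled sentence vector
W_cat = rng.standard_normal((d, n_cats))
cat_probs = softmax(s @ W_cat)         # (n_cats,)

print(tag_probs.shape, alpha.shape, cat_probs.shape)
```

Under the paper's invariance assumption, it is this token-to-category interaction (here, the attention pattern) that is expected to transfer across domains even when the token-level labels of the target domain are unavailable.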
