Paper Title
N-LTP: An Open-source Neural Language Technology Platform for Chinese
Paper Authors
Paper Abstract
We introduce \texttt{N-LTP}, an open-source neural language technology platform supporting six fundamental Chinese NLP tasks: {lexical analysis} (Chinese word segmentation, part-of-speech tagging, and named entity recognition), {syntactic parsing} (dependency parsing), and {semantic parsing} (semantic dependency parsing and semantic role labeling). Unlike existing state-of-the-art toolkits such as \texttt{Stanza}, which adopt an independent model for each task, \texttt{N-LTP} adopts a multi-task framework with a shared pre-trained model, which has the advantage of capturing knowledge shared across related Chinese tasks. In addition, a knowledge distillation method \cite{DBLP:journals/corr/abs-1907-04829}, in which the single-task model teaches the multi-task model, is further introduced to encourage the multi-task model to surpass its single-task teachers. Finally, we provide a collection of easy-to-use APIs and a visualization tool that let users access and view the processing results more easily and directly. To the best of our knowledge, this is the first toolkit to support these six fundamental Chinese NLP tasks. Source code, documentation, and pre-trained models are available at \url{https://github.com/HIT-SCIR/ltp}.
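As an illustration of the easy-to-use API and the shared multi-task design described in the abstract, below is a minimal usage sketch. It assumes the LTP 4.x Python interface documented in the project repository (method names such as `seg`, `pos`, `ner`, `dep`, `sdp`, and `srl` follow that version's README and may differ in later releases); it is not an excerpt from the paper.

```python
# Minimal usage sketch of the N-LTP (LTP 4.x) Python API.
# Assumes `pip install ltp`; method names follow the LTP 4.x README
# and may have changed in newer releases of the toolkit.
from ltp import LTP

ltp = LTP()  # loads the default shared pre-trained model

sentences = ["他叫汤姆去拿外衣。"]

# Chinese word segmentation also returns the shared hidden states,
# which the remaining tasks reuse (the multi-task design above).
segments, hidden = ltp.seg(sentences)

pos = ltp.pos(hidden)   # part-of-speech tagging
ner = ltp.ner(hidden)   # named entity recognition
dep = ltp.dep(hidden)   # dependency parsing
sdp = ltp.sdp(hidden)   # semantic dependency parsing
srl = ltp.srl(hidden)   # semantic role labeling

print(segments, pos, ner, dep, sdp, srl)
```

Because all six tasks decode from the same encoder output (`hidden`), a sentence is encoded once and reused, which is the practical benefit of the shared pre-trained model compared with running six independent pipelines.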