Paper Title


Combine Convolution with Recurrent Networks for Text Classification

Authors

Shengfei Lyu, Jiaqi Liu

Abstract


Convolutional neural network (CNN) and recurrent neural network (RNN) are two popular architectures used in text classification. Traditional methods to combine the strengths of the two networks rely on streamlining them or concatenating features extracted from them. In this paper, we propose a novel method to keep the strengths of the two networks to a great extent. In the proposed model, a convolutional neural network is applied to learn a 2D weight matrix where each row reflects the importance of each word from different aspects. Meanwhile, we use a bi-directional RNN to process each word and employ a neural tensor layer that fuses forward and backward hidden states to get word representations. In the end, the weight matrix and word representations are combined to obtain the representation in a 2D matrix form for the text. We carry out experiments on a number of datasets for text classification. The experimental results confirm the effectiveness of the proposed method.
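The abstract describes three pieces: a bi-directional RNN whose forward and backward hidden states are fused by a neural tensor layer into word representations, a CNN that learns a 2D weight matrix whose rows score word importance from different aspects, and a final combination yielding a 2D matrix representation of the text. The following is a minimal NumPy sketch of that pipeline, not the authors' code: parameters are random, all dimensions are toy assumptions, a width-1 convolution (a per-word linear map) stands in for the CNN, and the tensor layer follows the common form e_i = tanh(h_f^T T h_b + V[h_f; h_b] + b).

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, h, r = 6, 8, 5, 3   # words, embedding dim, RNN hidden dim, weight-matrix rows (aspects)

X = rng.standard_normal((n, d))          # word embeddings for one toy text

# --- Bi-directional RNN (vanilla tanh cell) over the word sequence ---
def rnn(seq, Wx, Wh):
    states, state = [], np.zeros(h)
    for x in seq:
        state = np.tanh(x @ Wx + state @ Wh)
        states.append(state)
    return np.stack(states)

Wx_f, Wh_f = rng.standard_normal((d, h)), rng.standard_normal((h, h))
Wx_b, Wh_b = rng.standard_normal((d, h)), rng.standard_normal((h, h))
H_f = rnn(X, Wx_f, Wh_f)                 # (n, h) forward hidden states
H_b = rnn(X[::-1], Wx_b, Wh_b)[::-1]     # (n, h) backward states, re-aligned to word order

# --- Neural tensor layer fuses forward/backward states into word representations ---
k = 4                                    # assumed number of tensor slices / representation dim
T = rng.standard_normal((k, h, h))
V = rng.standard_normal((k, 2 * h))
b = rng.standard_normal(k)
E = np.stack([
    np.tanh(np.einsum('khj,h,j->k', T, hf, hb)       # bilinear slice scores h_f^T T h_b
            + V @ np.concatenate([hf, hb]) + b)
    for hf, hb in zip(H_f, H_b)
])                                        # (n, k) word representations

# --- CNN stand-in: width-1 convolution scores each word per aspect, softmax over words ---
Wc = rng.standard_normal((d, r))
scores = X @ Wc                           # (n, r)
A = np.exp(scores) / np.exp(scores).sum(axis=0, keepdims=True)  # columns sum to 1

# --- 2D text representation: each aspect's weighted sum of word representations ---
M = A.T @ E                               # (r, k) matrix representation of the text
print(M.shape)                            # (3, 4)
```

Each of the r rows of M is one aspect-weighted pooling of the word representations, which is how the model keeps a 2D (rather than flat vector) representation of the text.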
