Paper Title

Encoding-based Memory Modules for Recurrent Neural Networks

Paper Authors

Antonio Carta, Alessandro Sperduti, Davide Bacciu

Paper Abstract

Learning to solve sequential tasks with recurrent models requires the ability to memorize long sequences and to extract task-relevant features from them. In this paper, we study the memorization subtask from the point of view of the design and training of recurrent neural networks. We propose a new model, the Linear Memory Network, which features an encoding-based memorization component built with a linear autoencoder for sequences. We extend the memorization component with a modular memory that encodes the hidden state sequence at different sampling frequencies. Additionally, we provide a specialized training algorithm that initializes the memory to efficiently encode the hidden activations of the network. The experimental results on synthetic and real-world datasets show that specializing the training algorithm to train the memorization component always improves the final performance whenever the memorization of long sequences is necessary to solve the problem.
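
To make the architecture concrete, below is a minimal sketch of one step of a Linear Memory Network cell, based on the functional/memory split described in the abstract: a nonlinear functional component computes the hidden activation, and a strictly linear memory component accumulates the hidden-state sequence (which is what allows it to be initialized from a linear autoencoder). The weight names, the tanh nonlinearity, and the dimensions are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def lmn_step(x_t, m_prev, params):
    """One step of a Linear Memory Network cell (illustrative sketch).

    The functional component is nonlinear; the memory component is
    purely linear, so its weights can be initialized from a linear
    autoencoder trained on the hidden activations.
    """
    W_xh, W_mh, W_hm, W_mm = params
    # Functional component: reads the input and the previous memory state.
    h_t = np.tanh(W_xh @ x_t + W_mh @ m_prev)
    # Memory component: linear update that encodes the hidden-state history.
    m_t = W_hm @ h_t + W_mm @ m_prev
    return h_t, m_t

# Toy usage: random weights, a short input sequence.
rng = np.random.default_rng(0)
n_in, n_h, n_m = 4, 8, 8  # input, hidden, and memory sizes (arbitrary)
params = (rng.standard_normal((n_h, n_in)) * 0.1,
          rng.standard_normal((n_h, n_m)) * 0.1,
          rng.standard_normal((n_m, n_h)) * 0.1,
          rng.standard_normal((n_m, n_m)) * 0.1)
m = np.zeros(n_m)
for t in range(5):
    x = rng.standard_normal(n_in)
    h, m = lmn_step(x, m, params)
```

In a trained model, the output would be read from the memory state `m`, and the modular variant described in the abstract would run several such memory components over the hidden states at different sampling frequencies.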
