Paper Title

aSTDP: A More Biologically Plausible Learning

Paper Authors

Li, Shiyuan

Paper Abstract

Spike-timing dependent plasticity (STDP) in biological neural networks has been shown to be important during the biological learning process. Artificial neural networks, on the other hand, learn in different ways, such as back-propagation or contrastive Hebbian learning. In this work we introduce approximate STDP (aSTDP), a new neural network learning framework that is more similar to the biological learning process. It uses only STDP rules for both supervised and unsupervised learning; each neuron learns patterns in a distributed fashion and does not need a global loss or other supervisory information. We also use a numerical method to approximate the derivative of each neuron in order to make better use of STDP learning, and we use these derivatives to set targets for neurons to accelerate the training and testing process. The framework can make predictions or generate patterns in a single model without additional configuration. Finally, we verified our framework on the MNIST dataset for classification and generation tasks.
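
For readers unfamiliar with the underlying mechanism, the snippet below is a minimal sketch of a standard pairwise STDP weight update in Python. The exponential time window, the parameter values, and the stdp_update helper are illustrative assumptions only; they are not the paper's exact aSTDP rule or its numerical derivative approximation, which are described in the full text.

import numpy as np

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    # Pairwise STDP (assumed textbook form, not the paper's aSTDP rule):
    # potentiate when the pre-synaptic spike precedes the post-synaptic
    # spike (dt = t_post - t_pre > 0), depress otherwise.
    if dt > 0:
        dw = a_plus * np.exp(-dt / tau)     # long-term potentiation
    else:
        dw = -a_minus * np.exp(dt / tau)    # long-term depression
    return float(np.clip(w + dw, 0.0, 1.0))  # keep the weight bounded

# Pre-before-post pairing strengthens the synapse; post-before-pre weakens it.
print(stdp_update(0.5, dt=5.0))   # slightly above 0.5
print(stdp_update(0.5, dt=-5.0))  # slightly below 0.5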
