Paper Title

Improving Surrogate Gradient Learning in Spiking Neural Networks via Regularization and Normalization

Authors

Meda, Nandan

Abstract

Spiking neural networks (SNNs) differ from the classical networks used in deep learning: their neurons communicate using electrical impulses called spikes, just like biological neurons. SNNs are appealing for AI because they could be implemented on low-power neuromorphic chips. However, SNNs generally remain less accurate than their analog counterparts. In this report, we examine various regularization and normalization techniques with the goal of improving surrogate gradient learning in SNNs.
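The core difficulty that surrogate gradient learning addresses is that the spike-emission step function has a derivative of zero almost everywhere, so backpropagation substitutes a smooth surrogate derivative. The following is a minimal NumPy sketch of that idea, using the common "fast sigmoid" surrogate; the function names, the threshold value, and this particular surrogate choice are illustrative assumptions, not necessarily the exact setup used in the paper.

```python
import numpy as np

def spike_forward(v, threshold=1.0):
    # Forward pass: Heaviside step. A neuron emits a spike (1.0)
    # when its membrane potential v reaches the threshold.
    return (v >= threshold).astype(float)

def surrogate_grad(v, threshold=1.0):
    # Backward pass: the step's true derivative is zero almost
    # everywhere, which blocks gradient flow. A surrogate gradient
    # replaces it with a smooth pseudo-derivative; here the
    # "fast sigmoid" derivative 1 / (1 + |v - threshold|)^2,
    # one common choice (an illustrative assumption).
    return 1.0 / (1.0 + np.abs(v - threshold)) ** 2

v = np.array([0.5, 1.0, 1.5])
print(spike_forward(v))   # spikes where v >= threshold
print(surrogate_grad(v))  # largest near the threshold, decaying away from it
```

During training, the forward pass uses `spike_forward` while backpropagation swaps in `surrogate_grad`, so the network stays differentiable end to end even though the emitted spikes are binary.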
