Paper Title
Weakly-correlated synapses promote dimension reduction in deep neural networks
Paper Authors
Paper Abstract
By controlling synaptic and neural correlations, deep learning has achieved empirical success in improving classification performance. How synaptic correlations affect neural correlations to produce disentangled hidden representations remains elusive. Here we propose a simplified model of dimension reduction, taking into account pairwise correlations among synapses, to reveal the mechanism by which synaptic correlations affect dimension reduction. Our theory determines the scaling form of the synaptic correlations, requiring only mathematical self-consistency, for both binary and continuous synapses. The theory also predicts that weakly-correlated synapses encourage dimension reduction compared to their orthogonal counterparts. In addition, these synapses slow down the decorrelation process along the network depth. These two computational roles are explained by the proposed mean-field equation. The theoretical predictions are in excellent agreement with numerical simulations, and the key features are also captured by deep learning with Hebbian rules.
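To make the abstract's setup concrete, here is a minimal sketch, assuming a feed-forward tanh network whose layer weights carry a weak pairwise correlation introduced through a shared Gaussian component, with the participation ratio used as one common proxy for the effective dimension of hidden representations. The correlation strength `rho`, the widths, the depth, the nonlinearity, and the weight construction are illustrative assumptions, not the paper's exact model or its mean-field equations.

```python
# Illustrative sketch: propagate random inputs through a deep network whose
# weights are either independent (rho = 0) or weakly correlated (small rho),
# and track the participation ratio of the hidden representations per layer.
import numpy as np

rng = np.random.default_rng(0)

def correlated_weights(n_out, n_in, rho, rng):
    """Weights with a weak pairwise correlation via a shared component.
    W_ij = sqrt(rho)*c_j + sqrt(1-rho)*p_ij, so that for i != k
    Cov(W_ij, W_kj) = rho / n_in after the 1/sqrt(n_in) scaling."""
    common = rng.standard_normal((1, n_in))       # component shared across rows
    private = rng.standard_normal((n_out, n_in))  # independent component
    w = np.sqrt(rho) * common + np.sqrt(1.0 - rho) * private
    return w / np.sqrt(n_in)                      # entry variance ~ 1/n_in

def participation_ratio(h):
    """PR = (sum_i lambda_i)^2 / sum_i lambda_i^2 over the covariance spectrum;
    a standard proxy for the effective dimension of the representation."""
    lam = np.linalg.eigvalsh(np.cov(h, rowvar=False))
    return lam.sum() ** 2 / (lam ** 2).sum()

n, depth, n_samples = 500, 10, 1000
for rho in (0.0, 0.05):  # independent weights vs weakly-correlated weights
    h = rng.standard_normal((n_samples, n))       # random input ensemble
    prs = []
    for _ in range(depth):
        w = correlated_weights(n, n, rho, rng)
        h = np.tanh(h @ w.T)                      # one feed-forward layer
        prs.append(participation_ratio(h))
    print(f"rho={rho}: PR per layer = {np.round(prs, 1)}")
```

Under this hypothetical construction, comparing `rho = 0` with a small `rho > 0` gives a way to probe the abstract's claim that weak synaptic correlations change how the dimension of hidden representations evolves with depth.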