Paper Title

Lyapunov spectra of chaotic recurrent neural networks

Paper Authors

Rainer Engelken, Fred Wolf, L. F. Abbott

Paper Abstract

Brains process information through the collective dynamics of large neural networks. Collective chaos was suggested to underlie the complex ongoing dynamics observed in cerebral cortical circuits and to determine the impact and processing of incoming information streams. In dissipative systems, chaotic dynamics takes place on a subset of phase space of reduced dimensionality and is organized by a complex tangle of stable, neutral, and unstable manifolds. Key topological invariants of this phase space structure, such as attractor dimension and Kolmogorov-Sinai entropy, have so far remained elusive. Here we calculate the complete Lyapunov spectrum of recurrent neural networks. We show that chaos in these networks is extensive, with a size-invariant Lyapunov spectrum, and is characterized by attractor dimensions much smaller than the number of phase space dimensions. We find that near the onset of chaos, for very intense chaos, and for discrete-time dynamics, random matrix theory provides analytical approximations to the full Lyapunov spectrum. We show that a generalized time-reversal symmetry of the network dynamics induces a point symmetry of the Lyapunov spectrum reminiscent of the symplectic structure of chaotic Hamiltonian systems. Fluctuating input reduces both the entropy rate and the attractor dimension. For trained recurrent networks, we find that Lyapunov spectrum analysis quantifies the error propagation and stability achieved. Our methods apply to systems of arbitrary connectivity, and we describe a comprehensive set of controls for the accuracy and convergence of Lyapunov exponents. Our results open a novel avenue for characterizing the complex dynamics of recurrent neural networks and the geometry of the corresponding chaotic attractors. They also highlight the potential of Lyapunov spectrum analysis as a diagnostic for machine learning applications of recurrent networks.
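Computing a full Lyapunov spectrum of the kind described above is typically done with the Benettin QR algorithm: evolve an orthonormal set of tangent vectors alongside the trajectory, reorthonormalize them periodically with a QR decomposition, and accumulate the logarithms of the diagonal of R. The following is a minimal sketch, not the authors' code: it assumes the classic random rate network dh/dt = -h + J tanh(h) with i.i.d. Gaussian coupling of variance g²/N, simple Euler integration, and illustrative parameter values (N, g, step sizes, reorthonormalization interval are all assumptions).

```python
import numpy as np

def lyapunov_spectrum(N=100, g=2.0, dt=0.1, t_sim=200.0, t_ons=1.0, seed=0):
    """Benettin-style QR algorithm for dh/dt = -h + J tanh(h).

    Minimal sketch: J_ij ~ N(0, g^2/N), Euler integration, and tangent
    vectors reorthonormalized every t_ons time units via QR.
    Returns the N Lyapunov exponents (approximately in descending order).
    """
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), (N, N))
    h = rng.normal(0.0, 1.0, N)
    Q = np.linalg.qr(rng.normal(size=(N, N)))[0]  # orthonormal tangent basis
    lyap_sum = np.zeros(N)
    steps = int(t_sim / dt)
    ons_every = max(1, int(t_ons / dt))
    for step in range(steps):
        r = np.tanh(h)
        # Jacobian of the Euler map: (1 - dt) I + dt * J * diag(phi'(h)),
        # with phi'(h) = 1 - tanh(h)^2 scaling the columns of J.
        D = (1.0 - dt) * np.eye(N) + dt * J * (1.0 - r**2)
        h = h + dt * (-h + J @ r)   # advance the trajectory
        Q = D @ Q                   # advance the tangent vectors
        if (step + 1) % ons_every == 0:
            Q, R = np.linalg.qr(Q)
            # Log growth rates accumulate on the diagonal of R.
            lyap_sum += np.log(np.abs(np.diag(R)))
    return lyap_sum / t_sim

spectrum = lyapunov_spectrum()
```

With g well above the onset of chaos, the leading exponent of `spectrum` is positive while the sum of all exponents is negative, reflecting a chaotic but dissipative flow whose attractor dimension (e.g. via the Kaplan-Yorke formula) is far below N. In practice one would also discard an initial transient and check convergence in t_sim, dt, and the reorthonormalization interval, as the abstract's controls suggest.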
