https://zoom.us/j/475819702
Meeting ID: 475-819-702
If you haven't registered for previous QLS webinars, please contact qls@ictp.it to obtain the PASSWORD for this Zoom meeting.
Abstract: Brains process information through the collective dynamics of large neural networks. Collective chaos has been suggested to underlie the complex ongoing dynamics observed in cerebral cortical circuits and to determine the impact and processing of incoming information streams. While dynamic mean-field theory has uncovered key properties of recurrent network models, such as the onset of chaos and their largest Lyapunov exponent, fundamental features of their dynamics remain unknown. In particular, chaotic dynamics in dissipative high-dimensional systems takes place on a subset of phase space of reduced dimension and is organized by a complex tangle of stable, neutral, and unstable manifolds. Key topological invariants of this phase-space structure, such as the attractor dimension and the Kolmogorov-Sinai entropy, have so far remained elusive.
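For reference, the two invariants mentioned above are understood here in their standard sense; the following is a brief reminder of the usual definitions (standard conventions, not specific to the speaker's model):

```latex
% Lyapunov exponents \lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_N of an N-dimensional system.
% Kaplan-Yorke (Lyapunov) estimate of the attractor dimension:
D_{\mathrm{KY}} = k + \frac{\sum_{i=1}^{k} \lambda_i}{|\lambda_{k+1}|},
\qquad k = \max\Big\{ m : \sum_{i=1}^{m} \lambda_i \ge 0 \Big\}.
% Kolmogorov-Sinai entropy rate, bounded by the sum of positive exponents (Pesin):
H_{\mathrm{KS}} \le \sum_{\lambda_i > 0} \lambda_i .
```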
Here we calculate the complete Lyapunov spectrum of recurrent neural networks. We show that chaos in these networks is extensive, with a size-invariant Lyapunov spectrum, and is characterized by attractor dimensions much smaller than the number of phase-space dimensions. The attractor dimension and entropy rate increase with coupling strength near the onset of chaos but decrease far from onset, reflecting a reduction in the number of unstable directions. We find that near the onset of chaos, for very intense chaos, and for discrete-time dynamics, random matrix theory provides good analytical approximations to the full Lyapunov spectrum. We show that a generalized time-reversal symmetry of the network dynamics induces a point symmetry of the Lyapunov spectrum reminiscent of the symplectic structure of chaotic Hamiltonian systems. Temporally fluctuating input can drastically reduce both the entropy rate and the attractor dimension. For trained recurrent networks, we find that Lyapunov spectrum analysis provides a quantification of the error propagation and stability achieved by distinct learning algorithms. Our methods apply to systems of arbitrary connectivity, and we describe a comprehensive set of controls for the accuracy and convergence of the Lyapunov exponents.
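To illustrate the kind of calculation described above, the sketch below computes a full Lyapunov spectrum by the standard Benettin/QR re-orthonormalization scheme for a simple discrete-time random rate network (h_{t+1} = J tanh(h_t), with Gaussian couplings of strength g). The model, the parameters N, g, steps, and the Kaplan-Yorke/entropy-rate post-processing are illustrative assumptions; this is a minimal sketch of a generic full-spectrum computation, not the speaker's actual method or code.

```python
import numpy as np

def lyapunov_spectrum(N=200, g=2.0, steps=5000, transient=500, seed=0):
    """Sketch: full Lyapunov spectrum of a discrete-time random rate network
    h_{t+1} = J @ tanh(h_t) with J_ij ~ N(0, g^2/N), via repeated QR
    re-orthonormalization of Jacobian products (Benettin-type algorithm).
    All model details here are assumptions for illustration only."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    h = rng.normal(size=N)
    Q = np.eye(N)                       # orthonormal frame carried along the trajectory
    log_r = np.zeros(N)                 # accumulated log stretching factors

    for t in range(steps):
        r = np.tanh(h)
        D = J * (1.0 - r**2)            # Jacobian at the current state: J @ diag(1 - tanh(h)^2)
        h = J @ r                       # advance the trajectory one step
        Q, R = np.linalg.qr(D @ Q)      # evolve and re-orthonormalize the frame
        if t >= transient:              # discard transient before accumulating
            log_r += np.log(np.abs(np.diag(R)))

    lam = np.sort(log_r / (steps - transient))[::-1]   # exponents per time step, descending

    # Kaplan-Yorke attractor dimension and entropy-rate (Pesin) upper bound
    csum = np.cumsum(lam)
    k = int(np.sum(csum >= 0))          # largest k with sum of first k exponents >= 0
    if 0 < k < N:
        D_KY = k + csum[k - 1] / abs(lam[k])
    else:
        D_KY = float(k)
    H_KS = lam[lam > 0].sum()
    return lam, D_KY, H_KS

if __name__ == "__main__":
    lam, D_KY, H_KS = lyapunov_spectrum()
    print(f"largest exponent {lam[0]:.3f}, D_KY ~ {D_KY:.1f}, H_KS bound ~ {H_KS:.3f}")
```

Increasing N with the other parameters fixed lets one check the size-invariance of the spectrum and the scaling of the attractor dimension mentioned in the abstract; longer runs and smaller transients are among the convergence controls one would vary in practice.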
Our results open a novel avenue for characterizing the complex dynamics of recurrent neural networks and the geometry of the corresponding high-dimensional chaotic attractor. They also highlight the potential of Lyapunov spectrum analysis as a diagnostic for machine learning applications of recurrent networks.