15–17 Sept 2025
Centro Polifunzionale Studenti Università di Bari
Europe/Rome timezone
CSS/ITALY 2025

Incorporating Structure of Interactions and Effective Learning in Random Neural Networks

17 Sept 2025, 12:20
20m
Centro Polifunzionale Studenti Università di Bari

Speaker

L. Taffarello

Description

The collective dynamics of complex systems such as neural circuits are shaped by how their units interact. Theoretical models of recurrent neural networks (RNNs) with random interactions have been investigated with dynamical mean field theory (DMFT) to derive fundamental dynamical properties such as the onset of chaos. However, the fully-interacting systems considered in previous studies depart from biological realism, as neural circuits exhibit significant structural heterogeneity. In this work, we introduce a degree-dependent connectivity structure in RNNs by drawing node degrees from a lognormal distribution, consistent with a large body of empirical data [1]. We extend DMFT to account for both randomness in interaction strength and structured connectivity, deriving an effective dynamical equation for a neuron of given degree. Linear stability analysis reveals that the critical coupling strength for the transition to chaos depends inversely on the coefficient of variation of the degree distribution; hence, greater degree heterogeneity lowers network stability. Moreover, the eigenvalue spectrum of the stability matrix reveals that heterogeneous networks exhibit a nonuniform spectral density, with unstable modes localized on high-degree nodes that drive the network dynamics. Importantly, incorporating learning-induced positive symmetric weight correlations introduces a non-Markovian memory term into the dynamical equations. Its effect on the dynamics is shown to be equivalent to effective recurrent self-interactions [2], with magnitude proportional to the number of connections of each node. Remarkably, a partially symmetric and heterogeneous structure of interactions gives rise to a long-tailed distribution of intrinsic timescales, as observed in cortical circuits [3], with the temporal activity of high-degree nodes fluctuating over slower timescales.
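For context, the classical DMFT baseline that this work extends is the fully random Sompolinsky-Crisanti-Sommers rate network, summarized below for reference (the degree-dependent effective equation is the talk's contribution and is not reproduced here):

```latex
% Classical fully random rate network (not the degree-structured model):
\dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{N} J_{ij}\,\phi\big(x_j(t)\big),
\qquad J_{ij} \overset{\text{i.i.d.}}{\sim} \mathcal{N}\!\left(0,\; g^2/N\right).
% DMFT replaces the recurrent input with a self-consistent Gaussian field \eta_i(t):
\dot{x}_i(t) = -x_i(t) + \eta_i(t),
\qquad \langle \eta_i(t)\,\eta_i(t+\tau)\rangle = g^2 C(\tau),
\quad C(\tau) = \langle \phi(x(t))\,\phi(x(t+\tau))\rangle .
% For \phi = \tanh, the fixed point x = 0 loses stability at g_c = 1 (onset of chaos).
```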
Our findings suggest that a heterogeneous structure of interactions combined with effective learning provides a biologically plausible microscopic mechanism for complex temporal dynamics spanning multiple timescales, while offering a theoretical framework to predict the structure-function relation in real neural circuits.
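As a minimal numerical sketch of the stability claim (not the authors' derivation or exact model), one can build a Chung-Lu-style random graph with lognormal expected degrees and compare the spectral radius of the coupling matrix for homogeneous versus heterogeneous degrees; the construction, weight scaling, and parameter values below are illustrative assumptions:

```python
import numpy as np

def spectral_radius_rnn(N=800, mean_k=20, sigma_ln=0.0, g=1.0, seed=0):
    """Spectral radius of a random RNN coupling matrix J on a Chung-Lu
    graph with lognormal expected degrees. For phi = tanh, the fixed
    point x = 0 of dx/dt = -x + J phi(x) is linearly stable iff this
    radius is below 1."""
    rng = np.random.default_rng(seed)
    if sigma_ln == 0:
        k = np.full(N, float(mean_k))          # homogeneous degrees
    else:
        k = rng.lognormal(0.0, sigma_ln, N)    # lognormal degrees,
        k *= mean_k / k.mean()                 # rescaled to target mean
    # Chung-Lu: edge i<-j with probability k_i k_j / (N <k>), capped at 1
    p = np.clip(np.outer(k, k) / (N * k.mean()), 0.0, 1.0)
    A = rng.random((N, N)) < p
    np.fill_diagonal(A, False)
    # Gaussian weights scaled so the homogeneous network is critical at g = 1
    J = A * rng.normal(0.0, g / np.sqrt(mean_k), size=(N, N))
    return np.abs(np.linalg.eigvals(J)).max()

r_homog = spectral_radius_rnn(sigma_ln=0.0)   # radius close to g = 1
r_hetero = spectral_radius_rnn(sigma_ln=1.0)  # larger radius: less stable
```

Note that the Chung-Lu construction correlates in- and out-degrees, which is what inflates the leading eigenvalue with the degree coefficient of variation; at fixed coupling g, the heterogeneous network is therefore closer to (or beyond) the chaotic transition, consistent with the abstract's claim.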

References:
[1] Ben Piazza et al. “Physical Network Constraints Define the Lognormal Architecture of the Brain’s Connectome”. In: bioRxiv (2025), pp. 2025–02.
[2] David G. Clark, L. F. Abbott, and Ashok Litwin-Kumar. “Dimension of activity in random neural networks”. In: Physical Review Letters 131.11 (2023), p. 118401.
[3] Merav Stern, Nicolae Istrate, and Luca Mazzucato. “A reservoir of timescales emerges in recurrent circuits with heterogeneous neural assemblies”. In: eLife 12 (2023), e86552.
