The impact of sparsity in low-rank recurrent neural networks.

PLoS Comput Biol

Laboratoire de Neurosciences Cognitives et Computationnelles, Département d'Études Cognitives, INSERM U960, École Normale Supérieure - PSL University, Paris, France.

Published: August 2022

Neural population dynamics are often highly coordinated, allowing task-related computations to be understood as neural trajectories through low-dimensional subspaces. How the network connectivity and input structure give rise to such activity can be investigated with the aid of low-rank recurrent neural networks, a recently developed class of computational models that offers a rich theoretical framework linking the underlying connectivity structure to emergent low-dimensional dynamics. This framework has so far relied on the assumption of all-to-all connectivity, yet cortical networks are known to be highly sparse. Here we investigate the dynamics of low-rank recurrent networks in which the connections are randomly sparsified, which makes the network connectivity formally full-rank. We first analyse the impact of sparsity on the eigenvalue spectrum of low-rank connectivity matrices, and use this to examine the implications for the dynamics. We find that in the presence of sparsity, the eigenspectra in the complex plane consist of a continuous bulk and isolated outliers, a form analogous to the eigenspectra of connectivity matrices composed of a low-rank and a full-rank random component. This analogy allows us to characterise distinct dynamical regimes of the sparsified low-rank network as a function of key network parameters. Altogether, we find that the low-dimensional dynamics induced by low-rank connectivity structure are preserved even at high levels of sparsity, and can therefore support rich and robust computations even in networks sparsified to a biologically realistic extent.
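
To make the setup above concrete, the following minimal NumPy sketch (not code from the paper) builds a rank-one connectivity matrix, sparsifies it with an independent Bernoulli mask, and inspects the resulting eigenvalue spectrum. The network size, connection probability, 1/p rescaling, and the correlation between the two connectivity vectors are illustrative assumptions.

    # Hedged sketch: eigenspectrum of a randomly sparsified rank-one connectivity matrix.
    # All scalings and parameter values are illustrative assumptions.
    import numpy as np

    N, p = 1000, 0.1                       # network size, connection density
    rng = np.random.default_rng(0)

    m = rng.standard_normal(N)             # output direction of the rank-one structure
    n = 2.0 * m + rng.standard_normal(N)   # input direction, correlated so that n.m/N is order 1
    J_full = np.outer(m, n) / N            # dense rank-one connectivity

    mask = rng.random((N, N)) < p          # keep each connection independently with probability p
    J_sparse = J_full * mask / p           # rescale so the mean (rank-one) structure is preserved

    eigs = np.linalg.eigvals(J_sparse)
    order = np.argsort(np.abs(eigs))
    print("dense rank-one outlier :", m @ n / N)
    print("sparse outlier         :", eigs[order[-1]])
    print("bulk radius (2nd |eig|):", np.abs(eigs[order[-2]]))

In this toy example the isolated outlier stays close to the dense rank-one eigenvalue, while the remaining eigenvalues form a continuous bulk whose radius grows as p decreases, mirroring the bulk-plus-outlier structure described in the abstract.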

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9390915 (PMC)
http://dx.doi.org/10.1371/journal.pcbi.1010426 (DOI)

Publication Analysis

Top Keywords

low-rank recurrent (12); impact sparsity (8); recurrent neural (8); neural networks (8); network connectivity (8); connectivity structure (8); low-dimensional dynamics (8); low-rank connectivity (8); connectivity matrices (8); low-rank (7)

Similar Publications

Networks of excitatory and inhibitory (EI) neurons form a canonical circuit in the brain. Seminal theoretical results on the dynamics of such networks are based on the assumption that synaptic strengths depend on the types of neurons they connect but are otherwise statistically independent. Recent synaptic physiology datasets, however, highlight the prominence of specific connectivity patterns that go well beyond what is expected from independent connections.
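
For concreteness, here is a hedged NumPy sketch of the baseline assumption mentioned above: weights whose mean depends only on the presynaptic cell type, with individual synapses otherwise drawn independently (a simplification of the type-dependent statistics described in the snippet). The population sizes, mean strengths, and jitter are illustrative values, not parameters from the cited study.

    # Hedged sketch: an E-I connectivity matrix with type-dependent means and
    # otherwise independent synapses. All numbers are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    N_E, N_I = 800, 200
    N = N_E + N_I
    pre_is_E = np.array([True] * N_E + [False] * N_I)  # presynaptic cell types (columns)

    mean_w = np.where(pre_is_E, 1.0, -4.0) / N          # E columns excitatory, I columns inhibitory
    J = mean_w + 0.3 * rng.standard_normal((N, N)) / np.sqrt(N)  # independent jitter per synapse

    # With independent synapses the spectrum is a random bulk plus, at most, a few
    # outliers determined by the type-dependent means.
    eigs = np.linalg.eigvals(J)
    print("largest |eigenvalue|:", np.abs(eigs).max())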

The goal of theoretical neuroscience is to develop models that help us better understand biological intelligence. Such models range broadly in complexity and biological detail. For example, task-optimized recurrent neural networks (RNNs) have generated hypotheses about how the brain may perform various computations, but these models typically assume a fixed weight matrix representing the synaptic connectivity between neurons.

Uterine adenosarcoma: Clinical significance of histological classification and SNP array analysis.

Hum Pathol

June 2024

Department of Pathology and Laboratory Medicine, Gustave Roussy, Villejuif, France.

Mullerian adenosarcoma is a rare malignant biphasic tumor. The mesenchymal component may be low or high grade, with or without sarcomatous overgrowth (SO). Little is known about the molecular heterogeneity of these tumors.

Approximating Nonlinear Functions With Latent Boundaries in Low-Rank Excitatory-Inhibitory Spiking Networks.

Neural Comput

April 2024

Champalimaud Neuroscience Programme, Champalimaud Foundation, 1400-038 Lisbon, Portugal

Deep feedforward and recurrent neural networks have become successful functional models of the brain, but they neglect obvious biological details such as spikes and Dale's law. Here we argue that these details are crucial in order to understand how real neural circuits operate. Towards this aim, we put forth a new framework for spike-based computation in low-rank excitatory-inhibitory spiking networks.

Neural circuits are composed of multiple regions, each with rich dynamics and engaging in communication with other regions. The combination of local, within-region dynamics and global, network-level dynamics is thought to provide computational flexibility. However, the nature of such multiregion dynamics and the underlying synaptic connectivity patterns remain poorly understood.
