Continuous attractor neural networks generate a set of smoothly connected attractor states. In memory systems of the brain, these attractor states may represent continuous pieces of information such as spatial locations and head directions of animals. However, during the replay of previous experiences, hippocampal neurons show a discontinuous sequence in which discrete transitions of the neural state are phase locked with the slow-gamma (∼30-50 Hz) oscillation. Here, we explore the underlying mechanisms of this discontinuous sequence generation. We find that a continuous attractor neural network has several phases depending on the interactions between external input and local inhibitory feedback. Discrete-attractor-like behavior naturally emerges in one of these phases without any built-in discreteness assumption. We propose that the dynamics of continuous attractor neural networks are the key to generating discontinuous state changes phase locked to the brain rhythm.
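The smoothly connected attractor states described above can be illustrated with a minimal 1D ring network. This is a hedged sketch with illustrative parameters (network size, interaction width, and inhibition strength are my choices, not the paper's model): local excitation plus divisive global inhibitory feedback sustains a self-maintained activity bump that can sit at any position on the ring.

```python
import numpy as np

# Minimal 1D continuous attractor sketch (illustrative parameters,
# not the specific model analyzed in the paper): neurons on a ring
# with translation-invariant local excitation and divisive global
# inhibition maintain a localized activity bump without input.
N = 100
x = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = 2 * np.pi / N
a = 0.5                                              # excitation width
d = (x[:, None] - x[None, :] + np.pi) % (2 * np.pi) - np.pi
J = np.exp(-d**2 / (2 * a**2)) / (np.sqrt(2 * np.pi) * a)

k = 0.05                                             # divisive inhibition strength
tau, dt = 1.0, 0.05
u = 4.0 * np.exp(-x**2 / (4 * a**2))                 # initial bump at x = 0
for _ in range(2000):
    r = np.maximum(u, 0.0) ** 2
    r = r / (1.0 + k * dx * r.sum())                 # global inhibitory feedback
    u += dt / tau * (-u + dx * (J @ r))
```

Because the connectivity depends only on the distance between neurons, shifting the initial condition moves the final bump continuously, which is the defining property of a continuous attractor.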
DOI: http://dx.doi.org/10.1103/PhysRevLett.122.018102
Sci Rep
January 2025
Department of General Psychology and Padova Neuroscience Center, University of Padova, Padova, Italy.
Hierarchical generative models can produce data samples based on the statistical structure of their training distribution. This capability can be linked to current theories in computational neuroscience, which propose that spontaneous brain activity at rest is the manifestation of top-down dynamics of generative models detached from action-perception cycles. A popular class of hierarchical generative models is that of Deep Belief Networks (DBNs), which are energy-based deep learning architectures that can learn multiple levels of representations in a completely unsupervised way by exploiting Hebbian-like learning mechanisms.
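The Hebbian-like, unsupervised learning in a DBN layer can be sketched as a Restricted Boltzmann Machine trained with 1-step contrastive divergence, whose weight update is a positive (data) phase minus a negative (model) phase. This is an illustrative toy sketch; the sizes, learning rate, and training patterns are my own assumptions, not from the article.

```python
import numpy as np

# Toy RBM (one DBN layer) trained with CD-1, a Hebbian-like rule:
# weights change by <v h>_data - <v h>_model. All hyperparameters
# and data here are illustrative assumptions.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_vis, n_hid = 6, 3
W = 0.01 * rng.standard_normal((n_vis, n_hid))
b = np.zeros(n_vis)                                  # visible biases
c = np.zeros(n_hid)                                  # hidden biases

data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]], dtype=float)

lr = 0.05
for _ in range(2000):
    v0 = data
    ph0 = sigmoid(v0 @ W + c)                        # positive phase
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + b)                      # reconstruction
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)                        # negative phase
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(data)  # Hebbian-like update
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)

def free_energy(v):
    # Lower free energy = higher probability under the model.
    return -(v @ b) - np.log1p(np.exp(v @ W + c)).sum(axis=-1)
```

Stacking such layers, each trained on the hidden activations of the one below, yields the hierarchical generative model; sampling top-down from the trained stack is the analogue of the spontaneous activity the abstract refers to.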
Biological memory networks are thought to store information by experience-dependent changes in the synaptic connectivity between assemblies of neurons. Recent models suggest that these assemblies contain both excitatory and inhibitory neurons (E/I assemblies), resulting in co-tuning and precise balance of excitation and inhibition. To understand computational consequences of E/I assemblies under biologically realistic constraints we built a spiking network model based on experimental data from telencephalic area Dp of adult zebrafish, a precisely balanced recurrent network homologous to piriform cortex.
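The co-tuning and precise balance mentioned above can be illustrated with a deliberately simple rate-based toy (not the zebrafish Dp spiking model): if inhibitory weights onto a cell copy the tuning of its excitatory weights, excitatory and inhibitory currents cancel stimulus by stimulus, leaving only a small residual drive.

```python
import numpy as np

# Toy illustration of "precise balance" via co-tuned inhibition
# (hypothetical sizes and weights; not the published spiking model).
rng = np.random.default_rng(1)
n_stimuli, n_exc = 50, 200
rates_E = rng.random((n_stimuli, n_exc))    # presynaptic excitatory rates
w_E = rng.random(n_exc)                     # excitatory weights onto one cell
w_I = 0.9 * w_E                             # inhibition co-tuned with excitation

exc_current = rates_E @ w_E
inh_current = rates_E @ w_I
net = exc_current - inh_current             # residual drive after cancellation

ratio = net.mean() / exc_current.mean()     # fraction of excitation surviving
```

Because `w_I` is proportional to `w_E`, inhibition tracks excitation across all stimuli (correlation near 1), so the net input is only a fixed small fraction of the excitatory current; without co-tuning the cancellation would vary from stimulus to stimulus.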
Int J Bipolar Disord
December 2024
Department for Psychiatry, Psychosomatic Medicine and Psychotherapy, University Hospital Frankfurt-Goethe University, Frankfurt am Main, Germany.
Background: Attention-deficit/hyperactivity disorder (ADHD) is a common neurodevelopmental disorder that often persists into adulthood. Moreover, it is frequently accompanied by bipolar disorder (BD) as well as borderline personality disorder (BPD). It is unclear whether these disorders share underlying pathomechanisms, given that all three are characterized by alterations in affective states, whether long- or short-term.
Neurosci Biobehav Rev
December 2024
BabyDevLab, School of Psychology, University of East London, Water Lane, London E15 4LZ, UK.
During early life, we develop the ability to choose what we focus on and what we ignore, allowing us to regulate perception and action in complex environments. But how does this developing ability influence how we spontaneously allocate attention to real-world objects during free behaviour? In this narrative review, we examine this question by considering the time dynamics of spontaneous overt visual attention and how these develop through early life. Even in early childhood, visual attention shifts occur both periodically and aperiodically.
Cogn Neurodyn
December 2024
Research Centre of Mathematics, University of Minho, Guimarães, Portugal.
Continuous bump attractor networks (CANs) have been widely used in the past to explain the phenomenology of working memory (WM) tasks in which continuous-valued information has to be maintained to guide future behavior. Standard CAN models suffer from two major limitations: the stereotyped shape of the bump attractor does not reflect differences in the representational quality of WM items, and the recurrent connections within the network require a biologically unrealistic level of fine-tuning. We address both challenges in a two-dimensional (2D) network model formalized by two coupled neural field equations of Amari type.
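The working-memory mechanism in Amari-type field models can be sketched in one dimension (the article couples two 2D fields; everything here, kernel, threshold, and cue, is an illustrative assumption): a lateral-inhibition ("Mexican hat") kernel lets a localized bump of activity persist after the transient cue that created it is removed.

```python
import numpy as np

# 1D Amari neural field sketch (illustrative parameters, not the
# article's coupled 2D model): tau du/dt = -u + w * H(u - theta) + I.
# A transient Gaussian cue creates a bump that self-sustains after
# the cue is switched off, modeling working-memory maintenance.
N = 200
x = np.linspace(-10.0, 10.0, N)
dx = x[1] - x[0]
d = x[:, None] - x[None, :]
w = np.exp(-d**2 / 2.0) - 0.4 * np.exp(-d**2 / 18.0)   # Mexican-hat kernel

theta = 0.3                                # firing threshold
tau, dt = 1.0, 0.05
u = np.zeros(N)
stim = np.exp(-x**2 / 2.0)                 # transient cue centered at x = 0
for step in range(2000):
    f = (u > theta).astype(float)          # Heaviside firing rate
    inp = stim if step < 200 else 0.0      # cue removed after 10 tau
    u += dt / tau * (-u + dx * (w @ f) + inp)

width = (u > theta).sum() * dx             # bump width long after cue offset
```

With this kernel the bump settles to a fixed stereotyped width regardless of cue strength, which is exactly the first limitation of standard CANs that the abstract points out.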