The synapses of real neural systems appear to have delays, so it is worthwhile to analyze associative memory models with delayed synapses. We discuss a sequential associative memory model with delayed synapses that employs a discrete synchronous updating rule and a correlation learning rule, and analyze its dynamic properties by statistical neurodynamics. In this paper, we first re-derive the Yanai-Kim theory, which gives the macrodynamical equations for the dynamics of a network with serial delay elements. Because their theory requires O(L^4 t) computation to obtain the macroscopic state at time step t, where L is the length of delay, it is intractable for discussing the macroscopic properties in the large-L limit. We therefore derive steady-state equations using the discrete Fourier transformation, whose computational complexity does not formally depend on L. We show that the storage capacity α_C is proportional to the delay length L in the large-L limit, with proportionality constant 0.195, i.e. α_C = 0.195L. These results are supported by computer simulations.
DOI: http://dx.doi.org/10.1016/S0893-6080(03)00207-7
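The abstract describes the model only at a high level. The following is a minimal sketch, in Python/NumPy, of a sequential associative memory with L serial delay elements, a correlation (Hebbian) learning rule, and a synchronous sign-update rule. The names (N, L, p), the weight convention J[l], and the cyclic sequence-storage scheme are illustrative assumptions for this general class of model, not the paper's exact formulation or its statistical-neurodynamics analysis.

```python
import numpy as np

# Minimal sketch (assumed formulation, not the paper's exact model):
# a sequential associative memory with L serial delay taps, correlation
# (Hebbian) learning, and a discrete synchronous sign-update rule.

rng = np.random.default_rng(0)

N = 500   # number of neurons
L = 3     # delay length (number of delay taps)
p = 20    # number of patterns in the stored cyclic sequence

# Random +/-1 patterns forming a cyclic sequence xi[0] -> xi[1] -> ... -> xi[0]
xi = rng.choice([-1, 1], size=(p, N))

# Correlation learning: the l-th delayed weight matrix associates the pattern
# shown l steps in the past with the next pattern in the sequence.
J = np.zeros((L, N, N))
for l in range(L):
    for mu in range(p):
        J[l] += np.outer(xi[(mu + 1) % p], xi[(mu - l) % p])
J /= N

# Synchronous retrieval: the history holds the last L network states, newest first.
history = [xi[(0 - l) % p].copy() for l in range(L)]
for t in range(p):
    h = sum(J[l] @ history[l] for l in range(L))   # local field from all delay taps
    s_new = np.where(h >= 0, 1, -1)                # synchronous sign update
    history = [s_new] + history[:-1]
    overlap = s_new @ xi[(t + 1) % p] / N          # overlap with the expected next pattern
    print(f"step {t + 1}: overlap with expected pattern = {overlap:+.3f}")
```

With a loading well below capacity, as in this toy setting, the printed overlaps should stay close to +1; increasing p toward the network's capacity lets one probe the breakdown of retrieval that the abstract's capacity result concerns.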
Front Comput Neurosci
January 2025
Center for Synaptic Brain Dysfunctions, Institute for Basic Science, Daejeon, Republic of Korea.
Memory consolidation refers to the process of converting temporary memories into long-lasting ones. It is widely accepted that new experiences are initially stored in the hippocampus as rapid associative memories, which then undergo a consolidation process to establish more permanent traces in other regions of the brain. Over the past two decades, studies in humans and animals have demonstrated that the hippocampus is crucial not only for memory but also for imagination and future planning, with the CA3 region playing a pivotal role in generating novel activity patterns.
Humans excel at applying learned behavior to unlearned situations. A crucial component of this generalization behavior is our ability to compose/decompose a whole into reusable parts, an attribute known as compositionality. One of the fundamental questions in robotics concerns this characteristic: How can linguistic compositionality be developed concomitantly with sensorimotor skills through associative learning, particularly when individuals only learn partial linguistic compositions and their corresponding sensorimotor patterns? To address this question, we propose a brain-inspired neural network model that integrates vision, proprioception, and language into a framework of predictive coding and active inference on the basis of the free-energy principle.
Front Psychol
January 2025
Sorbonne University, CNRS, INSERM, Institute of Biology Paris Seine, Neurosciences Paris Seine, Paris, France.
Transitive inference, the ability to establish hierarchical relationships between stimuli, is typically tested by training with premise pairs (e.g., A+B-, B+C-, C+D-, D+E-), which establishes a stimulus hierarchy (A > B > C > D > E).
Front Behav Neurosci
January 2025
Department of Neuroscience, Erasmus MC, Rotterdam, Netherlands.
Introduction: Physical exercise has repeatedly been reported to have advantageous effects on brain functions, including learning and memory formation. However, objective tools to measure such effects are often lacking. Eyeblink conditioning is a well-characterized method for studying the neural basis of associative learning.
Nature
January 2025
Department of Brain and Cognitive Sciences and McGovern Institute, MIT, Cambridge, MA, USA.
Hippocampal circuits in the brain enable two distinct cognitive functions: the construction of spatial maps for navigation, and the storage of sequential episodic memories. Although there have been advances in modelling spatial representations in the hippocampus, we lack good models of its role in episodic memory. Here we present a neocortical-entorhinal-hippocampal network model that implements a high-capacity general associative memory, spatial memory and episodic memory.