The aim of the present paper is to study the effects of Hebbian learning in random recurrent neural networks with biological connectivity, i.e. sparse connections and separate populations of excitatory and inhibitory neurons. We further allow the neuron dynamics to occur on a shorter time scale than synaptic plasticity and consider learning rules with passive forgetting. We show that the application of such Hebbian learning leads to drastic changes in the network dynamics and structure. In particular, the learning rule contracts the norm of the weight matrix, yielding a rapid decay of the complexity and entropy of the dynamics. In other words, Hebbian learning rewires the network into a new synaptic structure that emerges from the correlations progressively built up between neurons. We also observe that, within this emerging structure, the strongest synapses organize as a small-world network. A second effect of the decay of the weight matrix spectral radius is a rapid contraction of the spectral radius of the Jacobian matrix, which drives the system through the "edge of chaos", where sensitivity to the input pattern is maximal. Taken together, these observations match a scenario that is remarkably well predicted by theoretical arguments derived from dynamical systems and graph theory.
DOI: http://dx.doi.org/10.1016/j.jphysparis.2007.10.003
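The mechanism the abstract describes can be made concrete with a minimal sketch, assuming a discrete-time tanh rate network with sparse signed weights and a Hebbian update with passive forgetting, W ← λW + (α/N)·x xᵀ. All parameter values, the 80/20 excitatory/inhibitory split, and the gain are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100        # network size (illustrative)
p = 0.1        # connection probability: sparse connectivity
g = 3.0        # gain chosen so the initial regime is chaotic (radius > 1)
lam = 0.99     # passive-forgetting factor, lambda < 1 (illustrative)
alpha = 0.01   # Hebbian learning rate (illustrative)
T_fast = 50    # fast neuron-dynamics steps per slow learning step
epochs = 200   # number of learning steps

# Sparse random weights; each presynaptic neuron is excitatory (+) or
# inhibitory (-) as a rough stand-in for separate E/I populations.
sign = np.where(rng.random(N) < 0.8, 1.0, -1.0)
mask = rng.random((N, N)) < p
W = mask * rng.random((N, N)) * sign[None, :] * g / np.sqrt(p * N)

x = rng.random(N)
radii = []
for _ in range(epochs):
    for _ in range(T_fast):              # fast time scale: rate dynamics
        x = np.tanh(W @ x)
    # slow time scale: Hebbian term plus passive forgetting (lam < 1);
    # note this sketch does not constrain the update to preserve E/I signs
    W = lam * W + (alpha / N) * np.outer(x, x)
    radii.append(np.max(np.abs(np.linalg.eigvals(W))))

print(f"spectral radius of W: start {radii[0]:.3f} -> end {radii[-1]:.3f}")
```

Because lam < 1, the rank-one Hebbian term cannot indefinitely offset the decay, so the spectral radius of W contracts over learning, and with it the spectral radius of the Jacobian, which is the route to the "edge of chaos" described above.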
bioRxiv
December 2024
Cold Spring Harbor Laboratory, Cold Spring Harbor, NY, USA.
Song acquisition in songbirds provides a notable example of learning through trial and error that parallels human speech acquisition. Studying songbird vocal learning can therefore offer insights into the mechanisms underlying human language. We present a computational model of song learning that integrates reinforcement learning (RL) and Hebbian learning and is consistent with known songbird circuitry.
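The abstract does not give the model's equations; as a purely hypothetical illustration of one standard way to combine RL with Hebbian learning, the sketch below uses a three-factor (reward-modulated Hebbian) rule in which a scalar song evaluation gates a noise-driven eligibility term. All names, sizes, and parameters are invented for illustration and may differ from the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

n_pre, n_post = 20, 10                    # layer sizes (illustrative)
W = 0.1 * rng.standard_normal((n_post, n_pre))
eta = 0.05                                # learning rate (illustrative)
target = rng.random(n_post)               # stand-in for the tutor-song template
baseline = 0.0                            # running estimate of expected reward
rewards = []

for trial in range(500):
    pre = rng.random(n_pre)                      # premotor drive (HVC-like)
    noise = 0.1 * rng.standard_normal(n_post)    # exploratory motor variability
    post = np.tanh(W @ pre) + noise              # vocal output with exploration
    reward = -np.mean((post - target) ** 2)      # scalar evaluation of the song
    if trial == 0:
        baseline = reward
    # three-factor update: the reward prediction error gates a Hebbian
    # eligibility term built from the exploratory noise and the input
    W += eta * (reward - baseline) * np.outer(noise, pre)
    baseline += 0.1 * (reward - baseline)        # slowly track expected reward
    rewards.append(reward)

print(f"mean reward: first 50 trials {np.mean(rewards[:50]):.4f}, "
      f"last 50 trials {np.mean(rewards[-50:]):.4f}")
```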
Small
January 2025
eNDR Laboratory, School of Physics, IISER Thiruvananthapuram, Trivandrum, Kerala, 695551, India.
Iontronic memtransistors have emerged as technologically superior to conventional memristors for neuromorphic applications due to their low operating voltage, additional gate control, and enhanced energy efficiency. In this study, a side-gated iontronic organic memtransistor (SG-IOMT) device is explored as a potential energy-efficient hardware building block for fast neuromorphic computing. Its operational flexibility, which encompasses the complex integration of redox activities, ion dynamics, and polaron generation, makes this device intriguing for simultaneous information storage and processing, as it effectively overcomes the von Neumann bottleneck of conventional computing.
Mol Cell Proteomics
December 2024
Department of Pharmacology and Toxicology, University of Texas Medical Branch.
Mater Horiz
January 2025
School of Chemical Sciences, National Institute of Science Education and Research (NISER), An OCC of HBNI, Bhubaneswar, 752050, Odisha, India.
Neuromorphic and fully analog in-memory computations are promising for handling vast amounts of data with minimal energy consumption. We have synthesized and studied a series of homo-bimetallic silver purine MOFs (1D and 2D) having direct metal-metal bonding. The N7-derivatized purine ligands are designed to form bi-metallic complexes under ambient conditions, extending to a 1D or 2D metal-organic framework.
Cogn Neurodyn
December 2024
Department of Computing, Goldsmiths, University of London, London, UK.
The ability to coactivate (or "superpose") multiple conceptual representations is a fundamental function that we constantly rely upon; this is crucial in complex cognitive tasks requiring multi-item working memory, such as mental arithmetic, abstract reasoning, and language comprehension. As such, an artificial system aspiring to implement any of these aspects of general intelligence should be able to support this operation. I argue here that standard, feed-forward deep neural networks (DNNs) are unable to implement this function, whereas an alternative, fully brain-constrained class of neural architectures spontaneously exhibits it.
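A minimal sketch of what coactivation means here, assuming a Hopfield-style attractor network as one representative of the brain-constrained class contrasted with feed-forward DNNs (illustrative only; not the author's actual architecture): the network stores three patterns via a Hebbian rule, and a mixed cue settles into a state that retains substantial overlap with all of them at once.

```python
import numpy as np

rng = np.random.default_rng(2)

# Store three random binary patterns with a Hebbian rule, cue the network
# with a noisy blend of them, and let the recurrent dynamics settle.
N = 300
patterns = rng.choice([-1, 1], size=(3, N))
W = (patterns.T @ patterns) / N          # Hebbian storage of all three patterns
np.fill_diagonal(W, 0)

state = np.where(patterns.sum(axis=0) + 0.5 * rng.standard_normal(N) >= 0, 1, -1)
for _ in range(20):                      # synchronous recurrent updates
    state = np.where(W @ state >= 0, 1, -1)

overlaps = patterns @ state / N          # ~0.5 with each pattern: a sustained blend
print("overlaps:", np.round(overlaps, 2))
```

By contrast, a single feed-forward pass has no persistent state in which several such representations could be held simultaneously, which is the gap the article's brain-constrained architectures are meant to close.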