Stability in recurrent neural models poses a significant challenge, particularly in developing biologically plausible neurodynamical models that can be seamlessly trained. Traditional cortical circuit models are notoriously difficult to train due to expansive nonlinearities in the dynamical system, leading to an optimization problem with nonlinear stability constraints that are difficult to impose. Conversely, recurrent neural networks (RNNs) excel in tasks involving sequential data but lack biological plausibility and interpretability. In this work, we address these challenges by linking dynamic divisive normalization (DN) to the stability of "oscillatory recurrent gated neural integrator circuits" (ORGaNICs), a biologically plausible recurrent cortical circuit model that dynamically achieves DN and that has been shown to simulate a wide range of neurophysiological phenomena. By using the indirect method of Lyapunov, we prove the remarkable property of unconditional local stability for an arbitrary-dimensional ORGaNICs circuit when the recurrent weight matrix is the identity. We thus connect ORGaNICs to a system of coupled damped harmonic oscillators, which enables us to derive the circuit's energy function, providing a normative principle of what the circuit, and individual neurons, aim to accomplish. Further, for a generic recurrent weight matrix, we prove the stability of the 2D model and demonstrate empirically that stability holds in higher dimensions. Finally, we show that ORGaNICs can be trained by backpropagation through time without gradient clipping/scaling, thanks to its intrinsic stability property and adaptive time constants, which address the problems of exploding, vanishing, and oscillating gradients. By evaluating the model's performance on RNN benchmarks, we find that ORGaNICs outperform alternative neurodynamical models on static image classification tasks and perform comparably to LSTMs on sequential tasks.
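As context for the abstract, the steady-state divisive normalization computation that ORGaNICs achieves dynamically has the standard form from the normalization-model literature, and "local stability via the indirect method of Lyapunov" reduces to checking that the Jacobian at a fixed point has eigenvalues with strictly negative real parts. The sketch below illustrates both ideas with generic, illustrative parameters (the exponent, semi-saturation constant, and damping ratio are assumptions for demonstration, not the paper's exact parameterization or circuit equations):

```python
import numpy as np

def divisive_normalization(z, sigma=1.0, n=2.0):
    """Standard divisive normalization: each unit's drive is raised to an
    exponent and divided by a pooled norm plus a semi-saturation constant."""
    zn = np.abs(z) ** n
    return zn / (sigma ** n + zn.sum())

def is_locally_stable(jacobian):
    """Indirect method of Lyapunov: a fixed point is locally stable if all
    eigenvalues of the linearization have negative real parts."""
    return bool(np.all(np.linalg.eigvals(jacobian).real < 0))

# Damped harmonic oscillator x'' + 2*zeta*x' + x = 0, written as a
# first-order system in (x, v); this is the analogy the abstract draws,
# not the ORGaNICs circuit itself.
zeta = 0.5
J = np.array([[0.0, 1.0],
              [-1.0, -2.0 * zeta]])

drive = np.array([1.0, 2.0, 3.0])
resp = divisive_normalization(drive)  # responses saturate as pooled drive grows
stable = is_locally_stable(J)         # True for any positive damping
```

Because the normalization pool appears in every unit's denominator, the normalized responses always sum to less than one here, and stability of the oscillator holds for any zeta > 0 — a toy analogue of the unconditional stability result the abstract proves for the identity recurrent weight matrix.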


Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11469413


Similar Publications

Objective: Creating an intracortical brain-computer interface (iBCI) capable of seamless transitions between tasks and contexts would greatly enhance user experience. However, the nonlinearity in neural activity presents challenges to computing a global iBCI decoder. We aimed to develop a method that differs from a globally optimized decoder to address this issue.


Adaptation optimizes sensory encoding for future stimuli.

PLoS Comput Biol

January 2025

Department of Psychology, University of Pennsylvania, Philadelphia, Pennsylvania, United States of America.

Sensory neurons continually adapt their response characteristics according to recent stimulus history. However, it is unclear how such a reactive process can benefit the organism. Here, we test the hypothesis that adaptation actually acts proactively in the sense that it optimally adjusts sensory encoding for future stimuli.


Control of striatal circuit development by the chromatin regulator .

Sci Adv

January 2025

Department of Neuroscience, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA 19104, USA.

The pathophysiology of neurodevelopmental disorders involves vulnerable neural populations, including striatal circuitry, and convergent molecular nodes, including chromatin regulation and synapse function. Despite this, how epigenetic regulation shapes striatal development is understudied. Recurrent de novo mutations in are associated with intellectual disability and autism.


Time series are a data structure prevalent in a wide range of fields such as healthcare, finance, and meteorology, and analyzing time series data holds the key to gaining insight into our day-to-day observations. Among the vast spectrum of time series analysis, time series classification offers the unique opportunity to classify sequences into their respective categories for the sake of automated detection.


Recurrent neural networks (RNNs) based on model neurons that communicate via continuous signals have been widely used to study how cortical neural circuits perform cognitive tasks. Training such networks to perform tasks that require information maintenance over a brief period (i.e.

