In computational neuroscience, recurrent neural networks are widely used to model neural activity and learning. In many studies, fixed points of recurrent neural networks are used to model neural responses to static or slowly changing stimuli, such as visual cortical responses to static visual stimuli. These applications raise the question of how to train the weights in a recurrent neural network to minimize a loss function evaluated on fixed points.
In December 2021 the U.S. Government announced a new, whole-of-government $1.
To produce adaptive behavior, neural networks must balance between plasticity and stability. Computational work has demonstrated that network stability requires plasticity mechanisms to be counterbalanced by rapid compensatory processes. However, such processes have yet to be experimentally observed.
Backpropagation is widely used to train artificial neural networks, but its relationship to synaptic plasticity in the brain is unknown. Some biological models of backpropagation rely on feedback projections that are symmetric with feedforward connections, but experiments do not corroborate the existence of such symmetric backward connectivity. Random feedback alignment offers an alternative model in which errors are propagated backward through fixed, random backward connections.
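The core of feedback alignment can be sketched in a few lines (the toy regression task, layer sizes, and learning rate are illustrative assumptions): a two-layer network in which the backward pass routes output errors through a fixed random matrix B rather than the transposed forward weights.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid, n_out = 10, 64, 2

W1 = rng.normal(0, 1 / np.sqrt(n_in), (n_hid, n_in))
W2 = rng.normal(0, 1 / np.sqrt(n_hid), (n_out, n_hid))
B = rng.normal(0, 1 / np.sqrt(n_hid), (n_hid, n_out))  # fixed random feedback, replaces W2.T

# Toy task: learn a random linear map (illustrative)
M = rng.normal(0, 1, (n_out, n_in))
X = rng.normal(0, 1, (500, n_in))
Y = X @ M.T

def mse(W1, W2):
    return np.mean((np.tanh(X @ W1.T) @ W2.T - Y) ** 2)

mse0, lr = mse(W1, W2), 0.05
for _ in range(500):
    H = np.tanh(X @ W1.T)                 # hidden layer
    E = H @ W2.T - Y                      # output error
    dH = (E @ B.T) * (1 - H**2)           # error routed through fixed B, not W2.T
    W2 -= lr * E.T @ H / len(X)
    W1 -= lr * dH.T @ X / len(X)
```

The signature of the method is that, during learning, the forward weights come to align with the fixed feedback matrix, so the random backward pathway ends up carrying useful gradient information.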
The brain is believed to operate in part by making predictions about sensory stimuli and encoding deviations from these predictions in the activity of "prediction error neurons." This principle defines the widely influential theory of predictive coding. The precise circuitry and plasticity mechanisms through which animals learn to compute and update their predictions are unknown.
Artificial neural networks are often interpreted as abstract models of biological neuronal networks, but they are typically trained using the biologically unrealistic backpropagation algorithm and its variants. Predictive coding has been proposed as a potentially more biologically realistic alternative to backpropagation for training neural networks. This manuscript reviews and extends recent work on the mathematical relationship between predictive coding and backpropagation for training feedforward artificial neural networks on supervised learning tasks.
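The relationship can be checked numerically in a small sketch (layer sizes and the tanh nonlinearity are illustrative assumptions): relax the hidden activities of a predictive-coding network to equilibrium with input and output clamped, then compare the resulting local prediction-error weight updates against backpropagation's gradients. When the target is close to the network's prediction, the two nearly coincide.

```python
import numpy as np

rng = np.random.default_rng(2)
sizes = [4, 8, 3]
Ws = [rng.normal(0, 1 / np.sqrt(m), (n, m)) for m, n in zip(sizes[:-1], sizes[1:])]
f = np.tanh
fp = lambda v: 1 - np.tanh(v) ** 2

# Forward pass: a_l = W_l f(a_{l-1})
a = [rng.normal(0, 1, sizes[0])]
for W in Ws:
    a.append(W @ f(a[-1]))
y = a[-1] + 0.01 * rng.normal(0, 1, sizes[-1])   # target near the prediction

# Backpropagation for L = 0.5 * ||a_L - y||^2
deltas = [None] * len(a)
deltas[-1] = a[-1] - y
for l in range(len(Ws) - 1, 0, -1):
    deltas[l] = fp(a[l]) * (Ws[l].T @ deltas[l + 1])
bp_grads = [np.outer(deltas[l + 1], f(a[l])) for l in range(len(Ws))]

# Predictive coding: clamp input and output, relax hidden activities by
# gradient descent on the energy E = sum_l 0.5*||x_{l+1} - W_l f(x_l)||^2
x = [ai.copy() for ai in a]
x[-1] = y
for _ in range(500):
    eps = [x[l + 1] - Ws[l] @ f(x[l]) for l in range(len(Ws))]
    for l in range(1, len(x) - 1):
        x[l] -= 0.1 * (eps[l - 1] - fp(x[l]) * (Ws[l].T @ eps[l]))
eps = [x[l + 1] - Ws[l] @ f(x[l]) for l in range(len(Ws))]

# Local PC weight updates vs. (negative) backprop gradients
pc_grads = [np.outer(eps[l], f(x[l])) for l in range(len(Ws))]
cos = [float(np.sum(g * -b) / (np.linalg.norm(g) * np.linalg.norm(b)))
       for g, b in zip(pc_grads, bp_grads)]
```

Each cosine similarity is close to 1: the equilibrium prediction errors play the role of backprop's deltas, but every update is computed from locally available quantities.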
The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory-inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity-induced changes in the network affect balance, and in turn, how correlated, balanced activity impacts learning.
Background: The proclivity to atlantoaxial instability (AAI) has been widely reported for conditions such as rheumatoid arthritis and Down syndrome. Similarly, we have found a higher than expected incidence of AAI in hereditary connective tissue disorders. We demonstrate a strong association of AAI with manifestations of dysautonomia, in particular syncope and lightheadedness, and make preliminary observations as to the salutary effect of surgical stabilization of the atlantoaxial motion segment.
Task-related activity in the ventral thalamus, a major target of basal ganglia output, is often assumed to be permitted or triggered by changes in basal ganglia activity through gating- or rebound-like mechanisms. To test those hypotheses, we sampled single-unit activity from connected basal ganglia output and thalamic nuclei (globus pallidus-internus [GPi] and ventrolateral anterior nucleus [VLa]) in monkeys performing a reaching task. Rate increases were the most common peri-movement change in both nuclei.
PLoS Comput Biol
September 2020
Balanced excitation and inhibition is widely observed in cortex. How does this balance shape neural computations and stimulus representations? This question is often studied using computational models of neuronal networks in a dynamically balanced state. But balanced network models predict a linear relationship between stimuli and population responses.
Atlanto-axial instability (AAI) is common in the connective tissue disorders, such as rheumatoid arthritis, and increasingly recognized in the heritable disorders of Stickler, Loeys-Dietz, Marfan, Morquio, and Ehlers-Danlos (EDS) syndromes, where it typically presents as a rotary subluxation due to incompetence of the alar ligament. This retrospective, IRB-approved study examines 20 subjects with Fielding type 1 rotary subluxation, characterized by anterior subluxation of the facet on one side, with a normal atlanto-dental interval. Subjects diagnosed with a heritable connective tissue disorder and AAI had failed non-operative treatment and presented with severe headache, neck pain, and characteristic neurological findings.
Networks of neurons in the cerebral cortex exhibit a balance between excitation (positive input current) and inhibition (negative input current). Balanced network theory provides a parsimonious mathematical model of this excitatory-inhibitory balance using randomly connected networks of model neurons in which balance is realized as a stable fixed point of network dynamics in the limit of large network size. Balanced network theory reproduces many salient features of cortical network dynamics such as asynchronous-irregular spiking activity.
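The balance fixed point can be illustrated with a two-population (excitatory, inhibitory) rate model (the connection strengths, inputs, and threshold-linear dynamics below are illustrative choices, not the paper's model): recurrent and external inputs scale as sqrt(N), so finite rates in the large-N limit require the net input to vanish, W r + X = 0, giving the balanced rates r = -W^{-1} X.

```python
import numpy as np

# Mean connection strengths (E, I) and external input, all O(1); values illustrative
W = np.array([[1.0, -2.0],
              [1.0, -1.5]])
X = np.array([15.0, 10.0])

# Balance condition W r + X = 0  =>  r = -W^{-1} X
r_balanced = -np.linalg.solve(W, X)

def simulate(N, T=50.0, dt=0.002):
    """Rate dynamics dr/dt = -r + [sqrt(N) (W r + X)]_+ for network size N."""
    r = np.zeros(2)
    for _ in range(int(T / dt)):
        r += dt * (-r + np.maximum(np.sqrt(N) * (W @ r + X), 0.0))
    return r
```

For large N the simulated fixed point approaches r_balanced, and the residual net input W r + X becomes small relative to X, which is the sense in which balance emerges as a stable fixed point in the large-network limit.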
Many sectors within healthcare have adapted checklists to improve quality control. Notwithstanding the reported successful implementation of surgical checklists in the operating theater, a dearth of literature addresses the specific challenges posed by complex surgery in the craniocervical junction and spine. The authors devised an intraoperative checklist to address the common errors and verify the completion of objectives unique to these surgeries.
Proper craniocervical alignment during craniocervical reduction, stabilization, and fusion optimizes cerebrospinal fluid (CSF) flow through the foramen magnum, establishes the appropriate "gaze angle", avoids dysphagia and dyspnea, and, most importantly, normalizes the clival-axial angle (CXA) to reduce ventral brainstem compression. To illustrate the metrics of reduction, which include the CXA, posterior occipital cervical angle, orbital-axial or "gaze angle", and mandible-axial angle, we present a video illustration of a patient with signs and symptoms of the cervical medullary syndrome and concordant radiographic findings of craniocervical instability, identified on dynamic imaging and through assessment of the CXA, Harris, and Grabb-Oakes measurements.
A major goal in neuroscience is to estimate neural connectivity from large-scale extracellular recordings of neural activity in vivo. This is challenging in part because any such activity is modulated by the unmeasured external synaptic input to the network, known as the common input problem. Many different measures of functional connectivity have been proposed in the literature, but their direct relationship to synaptic connectivity is often assumed or ignored.
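The common input problem can be seen in a deliberately simple sketch (the linear "neurons" and the coupling value 0.8 are illustrative assumptions): two units with no synapse between them, both driven by an unmeasured shared input, show a large activity correlation that a naive functional-connectivity estimate would read as a connection.

```python
import numpy as np

rng = np.random.default_rng(5)
T = 50000
shared = rng.normal(0, 1, T)                  # unmeasured common input

# Two neurons with NO synapse between them, both driven by the shared input
x1 = 0.8 * shared + rng.normal(0, 1, T)
x2 = 0.8 * shared + rng.normal(0, 1, T)

# Large "functional connection" despite zero synaptic weight
c = np.corrcoef(x1, x2)[0, 1]

# Conditioning on the (normally unobservable) common input removes it
c_cond = np.corrcoef(x1 - 0.8 * shared, x2 - 0.8 * shared)[0, 1]
```

Because the shared drive is not recorded in real experiments, the second, corrected estimate is unavailable in practice, which is exactly why relating functional to synaptic connectivity is hard.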
Understanding the magnitude and structure of interneuronal correlations and their relationship to synaptic connectivity structure is an important and difficult problem in computational neuroscience. Early studies showed that neuronal network models with excitatory-inhibitory balance naturally create very weak spike train correlations, defining the "asynchronous state." Later work showed that, under some connectivity structures, balanced networks can produce larger correlations between some neuron pairs, even when the average correlation is very small.
Reservoir computing is a biologically inspired class of learning algorithms in which the intrinsic dynamics of a recurrent neural network are mined to produce target time series. Most existing reservoir computing algorithms rely on fully supervised learning rules, which require access to an exact copy of the target response, greatly reducing the utility of the system. Reinforcement learning rules have been developed for reservoir computing, but we find that they fail to converge on complex motor tasks.
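A minimal echo-state-style example of the fully supervised case described here (the reservoir size, spectral radius, toy task, and ridge penalty are all illustrative assumptions): the recurrent weights stay fixed, and only a linear readout is fit, by ridge regression, to an exact copy of the target time series.

```python
import numpy as np

rng = np.random.default_rng(3)
N, T = 300, 3000

# Random reservoir scaled to spectral radius 0.9 (echo state property)
J = rng.normal(0, 1 / np.sqrt(N), (N, N))
J *= 0.9 / np.max(np.abs(np.linalg.eigvals(J)))
w_in = rng.normal(0, 1, N)

t = 0.05 * np.arange(T)
u = np.sin(t)                # input drive (illustrative)
z = 0.8 * np.sin(t - 0.5)    # target: a delayed, scaled copy of the input

# Run the reservoir; its intrinsic dynamics provide the feature basis
x = np.zeros(N)
states = np.empty((T, N))
for k in range(T):
    x = np.tanh(J @ x + w_in * u[k])
    states[k] = x

# Fully supervised readout: ridge regression onto the exact target
washout, split = 100, 2000
Xtr, ztr = states[washout:split], z[washout:split]
w_out = np.linalg.solve(Xtr.T @ Xtr + 1e-4 * np.eye(N), Xtr.T @ ztr)

pred = states[split:] @ w_out
nmse = np.mean((pred - z[split:]) ** 2) / np.var(z[split:])
```

The readout needs the exact target z at every training step; this is the strong supervision requirement that the abstract contrasts with reinforcement-style rules.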
The ongoing acquisition of large and multifaceted data sets in neuroscience requires new mathematical tools for quantitatively grounding these experimental findings. Since 2015, the International Conference on Mathematical Neuroscience (ICMNS) has provided a forum for researchers to discuss current mathematical innovations emerging in neuroscience. This special issue assembles current research and tutorials that were presented at the 2017 ICMNS held in Boulder, Colorado from May 30 to June 2.
Trial-to-trial variability is a reflection of the circuitry and cellular physiology that make up a neuronal network. A pervasive yet puzzling feature of cortical circuits is that despite their complex wiring, population-wide shared spiking variability is low dimensional. Previous models of cortical networks cannot explain this global variability and instead assume that it arises from external sources.
On a Sunday morning at 06:22 on October 23, 1983, in Beirut, Lebanon, a semitrailer filled with TNT sped through the guarded barrier into the ground floor of the Civilian Aviation Authority and exploded, killing and wounding US Marines from the 1st Battalion 8th Regiment (2nd Division), as well as the battalion surgeon and deployed corpsmen. The truck bomb explosion, estimated to be the equivalent of 21,000 lbs of TNT, and regarded as the largest nonnuclear explosion since World War II, caused what was then the most lethal single-day death toll for the US Marine Corps since the Battle of Iwo Jima in World War II. Considerable neurological injury resulted from the bombing.
Many studies have investigated the benefits of androgen therapy and neurosteroids in aging men, while concerns remain about the potential association of exogenous steroids with the incidence of cerebrovascular events and ischemic stroke (IS). Testosterone is neuroprotective, neurotrophic and a potent stimulator of neuroplasticity. These benefits are mediated primarily through conversion of a small amount of testosterone to estradiol by the catalytic activity of estrogen synthetase (aromatase cytochrome P450 enzyme).
Understanding the relationship between external stimuli and the spiking activity of cortical populations is a central problem in neuroscience. Dense recurrent connectivity in local cortical circuits can lead to counterintuitive response properties, raising the question of whether there are simple arithmetical rules for relating circuits' connectivity structure to their response properties. One such arithmetic is provided by the mean field theory of balanced networks, which is derived in a limit where excitatory and inhibitory synaptic currents precisely balance on average.
An essential step toward understanding neural circuits is linking their structure and their dynamics. In general, this relationship can be almost arbitrarily complex. Recent theoretical work has, however, begun to identify some broad principles underlying collective spiking activity in neural circuits.
Phys Rev Lett
January 2017
Randomly connected networks of excitatory and inhibitory spiking neurons provide a parsimonious model of neural variability, but are notoriously unreliable for performing computations. We show that this difficulty is overcome by incorporating the well-documented dependence of connection probability on distance. Spatially extended spiking networks exhibit symmetry-breaking bifurcations and generate spatiotemporal patterns that can be trained to perform dynamical computations under a reservoir computing framework.
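The key ingredient, distance-dependent connection probability, can be sketched as follows (the ring geometry, Gaussian profile, and all parameter values are illustrative assumptions): neurons placed on a ring connect with a probability that decays with distance, giving the adjacency matrix the spatial structure that supports pattern-forming dynamics.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 400
pos = np.arange(N) / N                         # neuron positions on a ring [0, 1)

# Pairwise ring distance, then a Gaussian connection-probability profile
d = np.abs(pos[:, None] - pos[None, :])
d = np.minimum(d, 1 - d)                       # wrap around the ring
p_conn = 0.5 * np.exp(-d**2 / (2 * 0.05**2))   # width sigma = 0.05 (illustrative)
np.fill_diagonal(p_conn, 0)                    # no self-connections

A = rng.random((N, N)) < p_conn                # sampled adjacency matrix
near = A[d < 0.05].mean()                      # connection rate among nearby pairs
far = A[d > 0.25].mean()                       # ... and among distant pairs
```

Nearby pairs connect at a high rate while distant pairs almost never do; replacing this profile with a uniform probability recovers the unstructured random networks described as unreliable for computation.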