Neural circuits in the brain perform a variety of essential functions, including input classification, pattern completion, and the generation of rhythms and oscillations that support processes such as breathing and locomotion [51]. There is also substantial evidence that the brain encodes memories and processes information via sequences of neural activity. In this dissertation, we focus on the general problem of how neural circuits encode rhythmic activity, as in central pattern generators (CPGs), as well as the encoding of sequences. Traditionally, rhythmic activity and CPGs have been modeled using coupled oscillators. Here we take a different approach and present models for several distinct neural functions using threshold-linear networks. Our approach aims to unify attractor-based models (e.g., Hopfield networks) with pattern generation, encoding both static and dynamic patterns as attractors of the network. In the first half of this dissertation, we present several attractor-based models. These include: a network that can count the number of external inputs it receives; two models for locomotion, one encoding five different quadruped gaits and another encoding the orientation system of a swimming mollusk; and, finally, a model that connects fixed-point sequences with the locomotion attractors to obtain a network that steps through a sequence of dynamic attractors. In the second half of the thesis, we present new theoretical results, some of which have already been published in [59]. There, we established conditions on network architectures that produce sequential attractors. Here we also include several new theorems relating the fixed points of composite networks to those of their component subnetworks, as well as a new architecture for layering networks that produces "fusion" attractors by minimizing interference between the attractors of individual layers.
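To make the modeling framework concrete, the following is a minimal sketch of threshold-linear network dynamics, assuming the standard form dx/dt = -x + [Wx + b]_+ (the specific network, parameter values eps and delta, and the simulation code below are illustrative choices, not taken from the dissertation itself). The example uses a 3-neuron cyclic inhibition network, a simple architecture whose attractor is a limit cycle of sequential activation rather than a stable fixed point.

```python
import numpy as np

def simulate_tln(W, b, x0, dt=0.01, steps=5000):
    """Euler-integrate the threshold-linear dynamics dx/dt = -x + [Wx + b]_+."""
    x = np.array(x0, dtype=float)
    traj = np.empty((steps, len(x)))
    for t in range(steps):
        # [.]_+ is the rectification (ReLU) nonlinearity
        x = x + dt * (-x + np.maximum(0.0, W @ x + b))
        traj[t] = x
    return traj

# Hypothetical 3-neuron cyclic network: inhibition around the cycle is
# slightly weaker (-1 + eps) than against it (-1 - delta), so activity
# flows sequentially around the ring instead of settling at a fixed point.
eps, delta = 0.25, 0.5  # illustrative parameter values
W = np.array([[0.0,        -1 + eps,  -1 - delta],
              [-1 - delta,  0.0,      -1 + eps],
              [-1 + eps,   -1 - delta, 0.0]])
b = np.ones(3)  # constant external drive to every neuron

traj = simulate_tln(W, b, x0=[0.2, 0.0, 0.0])
```

Because all interactions are inhibitory and the drive is constant, activity stays bounded and nonnegative, and for this parameter choice the trajectory approaches a periodic (dynamic) attractor, the kind of rhythmic output the abstract associates with CPG models.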