Combinatorial threshold-linear networks (CTLNs) are a special class of recurrent neural networks whose dynamics are tightly controlled by an underlying directed graph. Recurrent networks have long been used as models for associative memory and pattern completion, with stable fixed points playing the role of stored memory patterns in the network. In prior work, we showed that target-free cliques of the graph correspond to stable fixed points of the dynamics, and we conjectured that these are the only stable fixed points possible [1, 2]. In this paper, we prove that the conjecture holds in a variety of special cases, including for networks with very strong inhibition and graphs of size n ≤ 4. We also provide further evidence for the conjecture by showing that sparse graphs and graphs that are nearly cliques can never support stable fixed points. Finally, we translate some results from extremal combinatorics to obtain an upper bound on the number of stable fixed points of CTLNs in cases where the conjecture holds.
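The abstract does not spell out the model equations. For readers unfamiliar with CTLNs, the following is a minimal sketch of the standard CTLN setup used in the cited prior work [1, 2]: the weight matrix W is built from the directed graph using two parameters eps and delta, and the dynamics are threshold-linear, dx/dt = -x + [Wx + theta]_+. The parameter values, function names, and the 3-clique example below are illustrative choices and assumptions, not taken from this paper's text.

```python
import numpy as np

def ctln_matrix(adj, eps=0.25, delta=0.5):
    """Build a CTLN weight matrix W from a binary adjacency matrix.

    Convention assumed here: adj[i, j] == 1 means there is an edge j -> i.
    W_ii = 0; W_ij = -1 + eps if j -> i in the graph, else -1 - delta.
    """
    W = np.where(adj == 1, -1.0 + eps, -1.0 - delta)
    np.fill_diagonal(W, 0.0)
    return W

def simulate(W, theta=1.0, x0=None, dt=0.01, steps=5000):
    """Forward-Euler integration of dx/dt = -x + [W x + theta]_+ ."""
    n = W.shape[0]
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + np.maximum(W @ x + theta, 0.0))
    return x

# Example: a 3-clique (all-to-all directed graph). As the whole graph it is
# target-free, so its fixed point is expected to be stable; the trajectory
# converges to x_i = theta / (3 - 2*eps) = 0.4 for each neuron.
adj = np.ones((3, 3), dtype=int)
np.fill_diagonal(adj, 0)
W = ctln_matrix(adj)
print(simulate(W))
```

This sketch only illustrates the fixed-point behavior numerically; the paper's results are analytic statements about which graphs can support stable fixed points.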

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10795766
DOI: http://dx.doi.org/10.1016/j.aam.2023.102652

Publication Analysis

Top Keywords

stable fixed: 24
fixed points: 24
combinatorial threshold-linear: 8
threshold-linear networks: 8
conjecture holds: 8
stable: 6
points: 6
networks: 5
fixed: 5
points combinatorial: 4
