In this letter, the computational power of a class of random access memory (RAM)-based neural networks, called general single-layer sequential weightless neural networks (GSSWNNs), is analyzed. The theoretical results presented, besides aiding the understanding of the temporal behavior of these networks, could also provide useful insights for the development of new learning algorithms.
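For readers unfamiliar with weightless models, the sketch below illustrates the basic unit of a RAM-based weightless neural network: a single RAM node whose binary inputs are treated as a memory address, so that learning amounts to writing values into addressed cells rather than adjusting numerical weights. The `RAMNode` class and its methods are an illustrative assumption for exposition only, not the construction analyzed in the paper.

```python
# Minimal sketch of a single RAM node, the building block of weightless
# (RAM-based) neural networks. Assumed for illustration; not the paper's
# formal GSSWNN model.

class RAMNode:
    def __init__(self, n_inputs: int):
        # One memory cell per possible binary input pattern (2**n cells).
        self.memory = [0] * (2 ** n_inputs)

    def _address(self, bits):
        # Interpret the binary input tuple as a memory address.
        addr = 0
        for b in bits:
            addr = (addr << 1) | (1 if b else 0)
        return addr

    def train(self, bits):
        # Training writes a 1 into the addressed cell (no weight arithmetic).
        self.memory[self._address(bits)] = 1

    def recall(self, bits):
        # Recall reads back whatever is stored at the addressed cell.
        return self.memory[self._address(bits)]


if __name__ == "__main__":
    node = RAMNode(3)
    node.train((1, 0, 1))
    print(node.recall((1, 0, 1)))  # 1: previously seen pattern
    print(node.recall((0, 1, 1)))  # 0: unseen pattern
```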
DOI: http://dx.doi.org/10.1109/TNN.2005.849838