A statistical method is applied to explore the unique characteristics of a certain class of neural network autoassociative memory with N neurons and first-order synaptic interconnections. The memory matrix is constructed to store M = αN vectors based on the outer-product learning algorithm. We theoretically prove that, by setting all the diagonal terms of the memory matrix to M (the nonzero-diagonal case) and letting the input error ratio ρ = 0, the probability P of successful recall decreases steadily as α increases, but begins to increase slowly once α exceeds 1.0. When 0 < ρ ≤ 0.5, the network exhibits strong error-correction capability if α ≤ 0.15, and this capability rapidly decreases as α increases. The network essentially loses all its error-correction capability at α = 2, regardless of the value of ρ. When 0 < ρ ≤ 0.5, and under the constraint P > 0.99, the tradeoff between the number of stable states and their attraction force is analyzed, and the memory capacity is shown to be 0.15N at best.
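The construction described above can be sketched numerically. The following is a minimal illustration, not the authors' analysis: it assumes bipolar (±1) patterns, builds the outer-product memory matrix with its diagonal terms kept (so each diagonal entry equals M), and recalls via synchronous sign-threshold updates. The values of N, α, and ρ are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200          # number of neurons (illustrative value)
alpha = 0.10     # loading ratio; the memory stores M = alpha * N vectors
M = int(alpha * N)
rho = 0.1        # input error ratio: fraction of flipped input bits

# Outer-product (Hebbian) learning. The diagonal is NOT zeroed,
# so W[i, i] = M for every i (the nonzero-diagonal case).
patterns = rng.choice([-1, 1], size=(M, N))
W = patterns.T @ patterns          # sum of outer products x x^T

def recall(x, steps=20):
    """Synchronous sign-threshold updates until a fixed point is reached."""
    for _ in range(steps):
        y = np.sign(W @ x)
        y[y == 0] = 1              # break ties toward +1
        if np.array_equal(y, x):
            break
        x = y
    return x

# Probe the network with a stored vector corrupted at error ratio rho.
probe = patterns[0].copy()
flip = rng.choice(N, size=int(rho * N), replace=False)
probe[flip] *= -1
recovered = np.array_equal(recall(probe), patterns[0])
```

Estimating the probability of successful recall for given α and ρ amounts to repeating this probe over many random pattern sets and corruptions and averaging the success rate.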


Source: http://dx.doi.org/10.1162/neco.1991.3.3.428


