Among the many k-winners-take-all (kWTA) models, the dual neural network (DNN-kWTA) model requires significantly fewer connections. However, in analog realizations, noise is inevitable and affects the operational correctness of the kWTA process. Most existing results focus on the effect of additive noise. This brief studies the effect of time-varying multiplicative input noise. Two scenarios are considered. The first is the bounded noise case, in which only the noise range is known. The second is the general noise distribution case, in which we either know the noise distribution or have noise samples. For each scenario, we first prove the convergence of the DNN-kWTA model under multiplicative input noise and then provide an efficient method to determine whether a noise-affected DNN-kWTA network performs the correct kWTA process for a given set of inputs. With these two methods, we can efficiently measure the probability that the network performs the correct kWTA process. In addition, for uniformly distributed inputs, we derive two closed-form expressions, one for each scenario, that estimate the probability of correct operation. Finally, we conduct simulations to verify our theoretical results.
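The brief's efficient verification methods are not reproduced in the abstract, but the quantity being measured can be illustrated directly. The sketch below is an assumption-laden Monte Carlo baseline, not the paper's method: it models bounded multiplicative input noise as u_i drawn uniformly from [-delta, +delta], perturbs each input to x_i(1 + u_i), and estimates the probability that the winner set of the k largest inputs is unchanged.

```python
import numpy as np

def kwta_winners(x, k):
    # Indices of the k largest inputs: the ideal kWTA winner set.
    return set(np.argsort(x)[-k:])

def prob_correct_kwta(x, k, delta, trials=10_000, seed=0):
    # Monte Carlo estimate (illustrative only; the paper derives
    # efficient and closed-form alternatives) of the probability that
    # bounded multiplicative input noise leaves the winner set intact.
    # Assumed noise model: x_i -> x_i * (1 + u_i), u_i ~ U(-delta, delta).
    rng = np.random.default_rng(seed)
    ideal = kwta_winners(x, k)
    hits = 0
    for _ in range(trials):
        u = rng.uniform(-delta, delta, size=len(x))
        if kwta_winners(x * (1.0 + u), k) == ideal:
            hits += 1
    return hits / trials

x = np.array([0.1, 0.5, 0.55, 0.9])
print(prob_correct_kwta(x, k=2, delta=0.01))  # well-separated gap: -> 1.0
print(prob_correct_kwta(x, k=2, delta=0.30))  # large noise: probability < 1
```

Note how correctness hinges on the gap between the k-th and (k+1)-th largest inputs relative to the noise range; the paper's bounded-noise method exploits exactly this kind of ordering condition analytically rather than by sampling.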

Source: http://dx.doi.org/10.1109/TNNLS.2023.3317135
