A note on computing with Kolmogorov Superpositions without iterations.

Neural Networks

Department of Mathematics, University of California at Santa Barbara, New York, NY 10019, United States of America.

Published: December 2021

We extend Kolmogorov's Superpositions to the approximation of arbitrary continuous functions via a noniterative approach that can be used by any neural network built on these superpositions. Our approximation algorithm uses a modified dimension-reducing function that allows the number of summands to be increased so as to achieve an error bound commensurate with that of r iterations, for any r. This new variant of Kolmogorov's Superpositions improves on the parallelism already inherent in them by performing highly distributed parallel computations without synchronization. This makes implementation substantially easier and more efficient on modern parallel hardware, and thus makes the approach a more practical tool.
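For context, the classical Kolmogorov superposition theorem represents any continuous f on [0,1]^n as a sum of outer functions applied to sums of inner, dimension-reducing functions: f(x_1, ..., x_n) = sum_q Phi_q(sum_p phi_{q,p}(x_p)). The toy sketch below is our own illustration of that two-level structure, not the paper's construction; the concrete inner and outer functions are hypothetical placeholders chosen so the example is exact. It also shows why each summand is independent of the others, which is the property the noniterative, synchronization-free evaluation exploits.

```python
# Illustrative structure of a Kolmogorov superposition:
#   f(x_1, ..., x_n) ~= sum_q Phi_q( sum_p phi_{q,p}(x_p) )
# The inner functions phi_{q,p} reduce dimension; the outer functions
# Phi_q encode the target f. These particular functions are hypothetical
# placeholders for illustration, not the paper's construction.

def superposition(x, inner, outer):
    """Evaluate sum over q of outer[q]( sum over p of inner[q][p](x[p]) ).

    Each q-th summand depends only on its own inner/outer functions,
    so all summands can be computed in parallel with no synchronization.
    """
    return sum(
        outer_q(sum(inner_qp(xp) for inner_qp, xp in zip(inner_q, x)))
        for outer_q, inner_q in zip(outer, inner)
    )

# Toy example: f(x, y) = x * y, represented exactly with 2 summands
# via the polarization identity x*y = ((x + y)^2 - (x - y)^2) / 4.
inner = [
    [lambda t: t, lambda t: t],   # q = 0: inner sum is x + y
    [lambda t: t, lambda t: -t],  # q = 1: inner sum is x - y
]
outer = [
    lambda s: s * s / 4.0,        # Phi_0(s) =  s^2 / 4
    lambda s: -s * s / 4.0,       # Phi_1(s) = -s^2 / 4
]

print(superposition([3.0, 5.0], inner, outer))  # prints 15.0 (= 3 * 5)
```

In the theorem proper the inner functions are fixed and only the outer functions depend on f; the abstract's modification trades extra summands of this form for the accuracy that iterative refinement would otherwise provide.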

DOI: http://dx.doi.org/10.1016/j.neunet.2021.07.006

