Rosenblatt's First Theorem and Frugality of Deep Learning.

Entropy (Basel)

Institute of Information Technologies, Mathematics and Mechanics, Lobachevsky State University, 603022 Nizhni Novgorod, Russia.

Published: November 2022

Rosenblatt's first theorem about the omnipotence of shallow networks states that elementary perceptrons can solve any classification problem if there are no discrepancies in the training set. Minsky and Papert considered elementary perceptrons with restrictions on the neural inputs: a bounded number of connections or a relatively small diameter of the receptive field for each neuron in the hidden layer. They proved that under these constraints, an elementary perceptron cannot solve some problems, such as the connectivity of input images or the parity of pixels in them. In this note, we demonstrated Rosenblatt's first theorem at work, showed how an elementary perceptron can solve a version of the travel maze problem, and analysed the complexity of that solution. We also constructed a deep network algorithm for the same problem; it is much more efficient. The shallow network uses an exponentially large number of neurons in the hidden layer (Rosenblatt's A-elements), whereas for the deep network, second-order polynomial complexity is sufficient. We demonstrated that, for the same complex problem, the deep network can be much smaller, and we revealed a heuristic behind this effect.
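As a hedged illustration of the setting (a minimal sketch, not the authors' construction), the Python snippet below implements a Rosenblatt-style elementary perceptron: a fixed random hidden layer of threshold A-elements feeding one trainable R-element updated by the classical perceptron rule. The toy task (4-bit parity), the number of A-elements, and all function names are assumptions chosen for illustration; with enough A-elements, any consistent dichotomy of a finite input set becomes linearly separable in the hidden representation, which is the content of Rosenblatt's first theorem.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_a_layer(n_inputs, n_hidden):
    """Fixed random A-elements: sign thresholds of random projections."""
    W = rng.normal(size=(n_hidden, n_inputs))
    b = rng.normal(size=n_hidden)
    return lambda x: np.sign(W @ x + b)

def train_perceptron(features, labels, epochs=100):
    """Classical perceptron rule on the fixed hidden representation."""
    w = np.zeros(features.shape[1])
    for _ in range(epochs):
        errors = 0
        for phi, y in zip(features, labels):
            if np.sign(w @ phi) != y:
                w += y * phi          # perceptron update
                errors += 1
        if errors == 0:               # converged: training set separated
            break
    return w

# Toy dichotomy: 4-bit parity, not linearly separable in the raw inputs
# but (generically) separable after enough random A-elements.
X = np.array([[int(b) * 2 - 1 for b in f"{i:04b}"] for i in range(16)])
y = np.array([1 if f"{i:04b}".count("1") % 2 else -1 for i in range(16)])

a_layer = make_a_layer(n_inputs=4, n_hidden=64)  # 64 is an arbitrary choice
H = np.array([a_layer(x) for x in X])
w = train_perceptron(H, y)
print("training errors:", int(sum(np.sign(H @ w) != y)))
```

If the run does not reach zero training errors, increasing n_hidden or epochs should suffice; the exponential blow-up discussed in the abstract concerns how many A-elements such a shallow construction needs as the problem scales, not whether it works on a fixed toy instance.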


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9689667
DOI: http://dx.doi.org/10.3390/e24111635

Publication Analysis

Top Keywords

rosenblatt's theorem: 12
deep network: 12
elementary perceptrons: 8
hidden layer: 8
elementary perceptron: 8
perceptron solve: 8
rosenblatt's: 4
theorem frugality: 4
deep: 4
frugality deep: 4

