Good old online backpropagation for plain multilayer perceptrons yields a very low 0.35% error rate on the MNIST handwritten digits benchmark. All we need to achieve this best result so far are many hidden layers, many neurons per layer, numerous deformed training images to avoid overfitting, and graphics cards to greatly speed up learning.
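The recipe in the abstract (a plain deep multilayer perceptron, per-example "online" backpropagation, deformed training images, GPU acceleration) can be illustrated with a minimal sketch. The one below assumes PyTorch and torchvision; the layer sizes, learning rate, affine-deformation parameters, and single training epoch are illustrative placeholders, not the paper's exact settings, and the affine transform merely stands in for the elastic/affine deformations the authors regenerate each epoch.

```python
# Minimal sketch of the approach described in the abstract, using PyTorch/torchvision.
# Hyperparameters and layer sizes are assumptions for illustration only.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Deformed training images: random affine distortions stand in for the
# paper's elastic/affine deformations (parameters are assumptions).
train_tf = transforms.Compose([
    transforms.RandomAffine(degrees=15, translate=(0.1, 0.1), scale=(0.9, 1.1)),
    transforms.ToTensor(),
])
test_tf = transforms.ToTensor()

train_set = datasets.MNIST("data", train=True, download=True, transform=train_tf)
test_set = datasets.MNIST("data", train=False, download=True, transform=test_tf)

# "Online" backpropagation: one training example per weight update.
train_loader = DataLoader(train_set, batch_size=1, shuffle=True)
test_loader = DataLoader(test_set, batch_size=256)

# A plain multilayer perceptron with several wide hidden layers.
layers, sizes = [], [28 * 28, 2500, 2000, 1500, 1000, 500, 10]
for i in range(len(sizes) - 2):
    layers += [nn.Linear(sizes[i], sizes[i + 1]), nn.Tanh()]
layers += [nn.Linear(sizes[-2], sizes[-1])]
model = nn.Sequential(nn.Flatten(), *layers)

device = "cuda" if torch.cuda.is_available() else "cpu"  # graphics card speeds up training
model.to(device)

opt = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(1):  # the paper trains for many more epochs
    model.train()
    for x, y in train_loader:
        x, y = x.to(device), y.to(device)
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

# Report the test error rate.
model.eval()
correct = 0
with torch.no_grad():
    for x, y in test_loader:
        pred = model(x.to(device)).argmax(dim=1).cpu()
        correct += (pred == y).sum().item()
print(f"test error: {100 * (1 - correct / len(test_set)):.2f}%")
```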


Source

DOI: http://dx.doi.org/10.1162/NECO_a_00052

Publication Analysis

Top Keywords

deep big (4), big simple (4), simple neural (4), neural nets (4), nets handwritten (4), handwritten digit (4), digit recognition (4), recognition good (4), good online (4), online backpropagation (4)

