Connections between physics, mathematics, and deep learning.

Lett High Energy Phys

NCBI, National Library of Medicine, National Institutes of Health, 8600 Rockville Pike, Bethesda, MD 20894, USA.

Published: August 2019

Starting from Fermat's principle of least time and its mechanical counterpart, the principle of least action, which govern classical and quantum mechanics, and from the theory of exterior differential forms, which governs the geometry of curved manifolds, we show how to derive the equations governing neural networks in an intrinsic, coordinate-invariant way, in which the loss function plays the role of the Hamiltonian. Covariance of these equations implies a layer metric, which is instrumental in pretraining and explains the role of conjugation when complex numbers are used. The differential formalism also clarifies the relation of the gradient descent optimizer to Aristotelian and Newtonian mechanics. The Bayesian paradigm is then analyzed as a renormalizable theory, yielding a new derivation of the Bayesian information criterion. We hope that this formal presentation of the differential geometry of neural networks will encourage some physicists to dive into deep learning and, reciprocally, that specialists in deep learning will better appreciate the close interconnection of their subject with the foundations of classical and quantum field theory.
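The optimizer analogy in the abstract can be illustrated with a small numerical sketch (ours, not the paper's): in Aristotelian dynamics velocity is proportional to force, which corresponds to plain gradient descent, while in Newtonian dynamics force changes the momentum, which corresponds to heavy-ball (momentum) gradient descent. The quadratic loss, step sizes, and function names below are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch: two optimizer "dynamics" on the 1-D quadratic loss
# L(w) = 0.5 * k * w**2, whose negative gradient -k*w plays the role of a force.

def aristotelian_step(w, k, lr):
    # Aristotelian dynamics: velocity proportional to force,
    # i.e. plain gradient descent  w <- w - lr * dL/dw.
    return w - lr * k * w

def newtonian_step(w, v, k, lr, mu):
    # Newtonian dynamics: force changes the velocity (momentum),
    # i.e. heavy-ball gradient descent with friction coefficient mu.
    v = mu * v - lr * k * w
    return w + v, v

w_a = 1.0            # gradient-descent trajectory
w_n, v = 1.0, 0.0    # momentum trajectory
for _ in range(100):
    w_a = aristotelian_step(w_a, k=1.0, lr=0.1)
    w_n, v = newtonian_step(w_n, v, k=1.0, lr=0.1, mu=0.9)

print(abs(w_a), abs(w_n))  # both decay toward the minimum at w = 0
```

The first-order update relaxes monotonically toward the minimum, whereas the momentum update oscillates around it like a damped particle before settling, which is the mechanical distinction the abstract alludes to.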
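For readers unfamiliar with the Bayesian information criterion that the paper rederives, its standard closed form is BIC = k ln n - 2 ln L-hat. The helper below is a generic sketch of that textbook formula with made-up log-likelihood values, not code or numbers from the paper.

```python
import math

def bic(max_log_likelihood, k, n):
    """Standard Bayesian information criterion.

    k: number of free parameters, n: number of data points,
    max_log_likelihood: ln L-hat at the maximum-likelihood fit.
    Lower BIC is better; the k * ln(n) term penalizes model size.
    """
    return k * math.log(n) - 2.0 * max_log_likelihood

# Illustrative comparison (hypothetical log-likelihoods): each extra
# parameter must buy more than 0.5 * ln(n) in log-likelihood to pay off.
print(bic(-100.0, 2, 50))  # smaller model
print(bic(-98.0, 3, 50))   # larger, better-fitting model
```

With n = 50, the ln(n)/2 threshold is about 1.96, so the two-point log-likelihood gain in this toy comparison just barely favors the larger model.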

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8462849
DOI: http://dx.doi.org/10.31526/lhep.3.2019.110
