Mirror Descent of Hopfield Model.

Neural Computation

Department of Physics Education, Department of Physics and Astronomy, and Center for Theoretical Physics and Artificial Intelligence Institute.

Published: August 2023

Mirror descent is an elegant optimization technique that leverages a dual space of parametric models to perform gradient descent. While originally developed for convex optimization, it has increasingly been applied in the field of machine learning. In this study, we propose a novel approach for using mirror descent to initialize the parameters of neural networks. Specifically, we demonstrate that by using the Hopfield model as a prototype for neural networks, mirror descent can effectively train the model with significantly improved performance compared to traditional gradient descent methods that rely on random parameter initialization. Our findings highlight the potential of mirror descent as a promising initialization technique for enhancing the optimization of machine learning models.
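The update behind the abstract's claim is the standard mirror descent step: map the parameters into a dual space through the gradient of a convex mirror potential, take an ordinary gradient step there, and map back. Below is a minimal sketch of that loop on Hopfield couplings, using an entrywise artanh/tanh mirror pair and a toy regularized Hebbian energy loss; the loss, the mirror map, and all sizes and hyperparameters are illustrative assumptions, not the paper's actual formulation.

```python
# Minimal mirror descent sketch for Hopfield couplings.
# Assumptions (not from the paper): the toy loss, the artanh/tanh mirror map,
# and all sizes/hyperparameters below are chosen for illustration only.
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 8                                   # spins and stored patterns (assumed)
xi = rng.choice([-1.0, 1.0], size=(P, N))      # random binary patterns to memorize

hebb = xi.T @ xi / (N * P)                     # Hebbian correlation matrix (target direction)
lam, eta, steps = 0.5, 0.1, 500                # regularization, step size, iterations (assumed)

def grad(J):
    """Gradient of the toy loss L(J) = -0.5 * tr(J @ hebb) + 0.5 * lam * ||J||^2."""
    return -0.5 * hebb + lam * J

# Mirror map psi'(w) = artanh(w) applied entrywise; its inverse tanh keeps
# every coupling inside (-1, 1) without any explicit projection.
J = np.zeros((N, N))                           # start at the center of the mirror map's domain
for _ in range(steps):
    theta = np.arctanh(J)                      # primal -> dual
    theta -= eta * grad(J)                     # ordinary gradient step in the dual space
    J = np.tanh(theta)                         # dual -> primal
    np.fill_diagonal(J, 0.0)                   # Hopfield convention: no self-couplings

# The learned couplings align with the Hebbian rule while staying bounded entrywise.
upper = np.triu_indices(N, 1)
print("correlation with Hebbian couplings:", np.corrcoef(J[upper], hebb[upper])[0, 1])
```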

Source: http://dx.doi.org/10.1162/neco_a_01602 (DOI: 10.1162/neco_a_01602)

