SIAM J Imaging Sci
November 2020
Department of Electrical and Computer Engineering, Georgia Institute of Technology 30332 USA.
Following the seminal work of Nesterov, accelerated optimization methods have been used to substantially boost the performance of first-order, gradient-based parameter estimation in scenarios where second-order optimization strategies are either inapplicable or impractical. Not only does accelerated gradient descent converge considerably faster than traditional gradient descent, but it also performs a more robust local search of the parameter space by initially overshooting and then oscillating back as it settles into a final configuration, thereby selecting only local minimizers with a basin of attraction large enough to contain the initial overshoot. This behavior has made accelerated and stochastic gradient search methods particularly popular within the machine learning community.
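The overshoot-and-oscillate behavior described above comes from the momentum (extrapolation) step in Nesterov's scheme. Below is a minimal illustrative sketch, not the paper's method: the ill-conditioned quadratic objective, the step size 1/L, and the standard (k-1)/(k+2) momentum schedule are all assumptions chosen only to make the contrast with plain gradient descent visible.

```python
import numpy as np

# Hypothetical ill-conditioned quadratic f(x) = 0.5 * x^T A x,
# chosen only so the convergence gap is easy to see.
A = np.diag([1.0, 100.0])
grad = lambda x: A @ x
lr = 1.0 / 100.0              # step size ~ 1/L, L = largest eigenvalue of A

x_gd = np.array([1.0, 1.0])   # plain gradient descent iterate
x_ag = x_gd.copy()            # accelerated (Nesterov) iterate
x_ag_prev = x_ag.copy()

for k in range(1, 201):
    # Plain gradient descent: step directly downhill.
    x_gd = x_gd - lr * grad(x_gd)

    # Nesterov acceleration: extrapolate along the previous step
    # (momentum), then take a gradient step from the look-ahead point.
    # The momentum term produces the characteristic overshoot and
    # oscillation as the iterate settles into a minimizer.
    momentum = (k - 1) / (k + 2)
    y = x_ag + momentum * (x_ag - x_ag_prev)
    x_ag_prev = x_ag
    x_ag = y - lr * grad(y)

# Distance to the minimizer at the origin; the accelerated iterate
# is typically far closer after the same number of gradient evaluations.
print("gradient descent:", np.linalg.norm(x_gd))
print("accelerated:    ", np.linalg.norm(x_ag))
```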