AI Article Synopsis

  • Modified Shepard interpolation based on second-order Taylor series expansions is a flexible, effective tool for constructing potential energy surfaces in a range of situations.
  • Extending the method to gas-surface dynamics, where surface atoms are allowed to move, sharply increases the dimensionality of the problem and the computational cost of the required Hessian evaluations.
  • This work shows that approximate Hessians built from well-known update formulae, together with a single accurate Hessian, can substantially reduce the overall computational expense.

Article Abstract

Modified Shepard interpolation based on second-order Taylor series expansions has proven to be a flexible tool for constructing potential energy surfaces in a range of situations. Extending this to gas-surface dynamics, where surface atoms are allowed to move, represents a substantial increase in the dimensionality of the problem, reflected in a dramatic increase in the computational cost of the required Hessian (matrix of second derivatives) evaluations. This work demonstrates that using approximate Hessians derived from well-known Hessian update formulae and a single accurate Hessian can provide an effective way to avoid this expensive accurate Hessian determination.
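The "Hessian update formulae" the abstract refers to are the standard quasi-Newton recipes that refine an approximate Hessian from successive gradients rather than recomputing second derivatives. As a minimal sketch (the paper may use a different variant, e.g. SR1 or Bofill; the BFGS form below is chosen only for illustration):

```python
import numpy as np

def bfgs_update(B, s, y):
    """BFGS update of an approximate Hessian B.

    s = x_new - x_old (geometry step), y = g_new - g_old (gradient change).
    The updated matrix satisfies the secant condition B_new @ s == y,
    so curvature along the step is captured without a new Hessian evaluation.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Toy quadratic surface V(x) = 0.5 x^T A x, whose exact Hessian is A
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
grad = lambda x: A @ x

B = np.eye(2)                       # crude initial guess for the Hessian
x_old, x_new = np.zeros(2), np.array([1.0, 0.5])
s = x_new - x_old
y = grad(x_new) - grad(x_old)

B = bfgs_update(B, s, y)
# B now reproduces the true curvature along the step direction s
```

Each such update costs only a gradient difference, which is why seeding a sequence of updates with one accurate Hessian can replace many expensive second-derivative calculations.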

Source
http://dx.doi.org/10.1063/1.4868637

Publication Analysis

Top Keywords

hessian update (8)
update formulae (8)
modified shepard (8)
potential energy (8)
energy surfaces (8)
surface atoms (8)
accurate hessian (8)
hessian (5)
formulae construct (4)
construct modified (4)

Similar Publications

The overlooked burden of persistent physical symptoms: a call for action in European healthcare.

Lancet Reg Health Eur

January 2025

Department of Psychosomatic Medicine and Psychotherapy, Centre for Internal Medicine, University Medical Centre Hamburg-Eppendorf, Martinistraße 52, Hamburg 20246, Germany.

Regardless of their cause, persistent physical symptoms are distressing somatic complaints that occur on most days for at least several months. They are common in patients with somatic diseases, functional somatic disorders, mental disorders, and undiagnosed medical conditions and are often associated with significant impairment and medical costs. Despite their prevalence and impact, persistent physical symptoms are often overlooked in medical care.

How to improve reward sensitivity - Predictors of long-term effects of a randomized controlled online intervention trial.

J Affect Disord

December 2024

Clinical Psychology and Psychotherapy, Department of Psychology, Philipps-University of Marburg, Gutenbergstr. 18, D-35032 Marburg, Germany.

Background: Reward sensitivity is a central maintaining factor of depression. Current treatments fail at sufficiently and reliably modifying reward processing. Therefore, we employed interventions targeting reward sensitivity and evaluated the long-term efficacy of different online interventions, additionally exploring predictors of changes in reward sensitivity.

Whole-body electromyostimulation (WB-EMS) has proven to be a highly effective alternative to conventional resistance-type exercise training. However, owing to adverse effects in the past, very extensive contraindications have been put in place for the commercial, non-medical WB-EMS market. Considering recent positive innovations e.

Exploiting the Hessian for a Better Convergence of the SCF-RDMFT Procedure.

J Chem Theory Comput

May 2024

Department of Chemistry & Pharmaceutical Sciences and Amsterdam Institute of Molecular and Life Sciences (AIMMS), Faculty of Science, Vrije Universiteit, De Boelelaan 1083, 1081 HV Amsterdam, The Netherlands.

One-body reduced density matrix functional theory provides an alternative to density functional theory, which is able to treat static correlation while keeping a relatively low computation scaling. Its disadvantageous cost comes mainly from a slow convergence of the self-consistent energy optimization. To improve on that problem, we propose in this work the use of the Hessian of the energy, including the coupling term.
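The convergence gain described above comes from using second-derivative information: with the Hessian available, each self-consistent iteration can take a Newton step instead of a gradient step. A minimal sketch on a toy quadratic energy (a stand-in, not the SCF-RDMFT equations themselves):

```python
import numpy as np

def newton_step(x, grad, hess):
    """One Newton step x -> x - H^{-1} g.

    Solving H p = g rather than inverting H is the standard, cheaper route;
    near a minimum this converges quadratically, versus linearly for
    plain gradient descent.
    """
    g = grad(x)
    H = hess(x)
    return x - np.linalg.solve(H, g)

# Toy quadratic energy E(x) = 0.5 x^T A x - b^T x, minimized at A^{-1} b
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
hess = lambda x: A

x = newton_step(np.zeros(2), grad, hess)
# For an exactly quadratic energy, a single Newton step reaches the minimizer
```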

In this work, we explore the limiting dynamics of deep neural networks trained with stochastic gradient descent (SGD). As observed previously, long after performance has converged, networks continue to move through parameter space by a process of anomalous diffusion in which distance traveled grows as a power law in the number of gradient updates with a nontrivial exponent. We reveal an intricate interaction among the hyperparameters of optimization, the structure in the gradient noise, and the Hessian matrix at the end of training that explains this anomalous diffusion.
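The power-law claim above is the kind of thing one checks by fitting a slope in log-log space to the parameter-space distance traveled. A minimal sketch, with synthetic data standing in for real training logs (the exponent 0.75 below is an arbitrary illustrative choice, not a value from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic distance-traveled curve d(t) ~ t^c with c = 0.75 plus mild noise,
# mimicking distances measured between parameter snapshots during training
t = np.arange(1, 1001)
d = t ** 0.75 * (1 + 0.01 * rng.standard_normal(t.size))

# A power law d = a * t^c is a straight line in log-log coordinates,
# so a linear fit to (log t, log d) recovers the exponent as the slope
c, _ = np.polyfit(np.log(t), np.log(d), 1)
# c is close to the true exponent 0.75
```

An exponent of 0.5 would indicate ordinary diffusion; anything else is the "anomalous diffusion" the snippet describes.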
