A simple remedy for failure modes in physics informed neural networks.

Neural Networks

Western University, Department of Computer Science, 1151 Richmond St, Middlesex College, London, ON N6A 5B7, Canada; Vector Institute, 661 University Ave, Suite 710, Toronto, ON M5G 1M1, Canada.

Published: December 2024

Physics-informed neural networks (PINNs) have shown promising results in solving a wide range of problems involving partial differential equations (PDEs). Nevertheless, PINNs are known to fail as PDEs become more complex. In particular, when PDE coefficients grow large or the PDEs become increasingly nonlinear, PINNs struggle to converge to the true solution. A noticeable discrepancy emerges between the convergence speed of the PDE loss and that of the initial/boundary condition loss, preventing PINNs from effectively learning the true solutions of these PDEs. In the present work, we investigate the training dynamics of PINNs through the lens of the neural tangent kernel (NTK). Our theoretical analysis reveals that when PINNs are trained with gradient descent with momentum (GDM), the gap in convergence rates between the two loss terms shrinks significantly, enabling the network to learn the exact solution. We also examine why training with the Adam optimizer accelerates convergence and further reduces the effect of this discrepancy. Our numerical experiments confirm that sufficiently wide networks trained with GDM and Adam yield desirable solutions for more complex PDEs.
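
The remedy described in the abstract amounts to training the PINN with a momentum-based optimizer rather than plain gradient descent. As a rough illustration only (not the authors' code or experimental setup), the sketch below trains a small PINN on the 1D convection equation u_t + c*u_x = 0 with a large coefficient c, a setting commonly cited as a PINN failure mode, using Adam; the network architecture, coefficient value, sampling scheme, and hyperparameters are illustrative assumptions.

```python
# Illustrative sketch (not the authors' implementation): a small PINN for the
# 1D convection equation u_t + c*u_x = 0 on [0, 2*pi] x [0, 1] with a large
# coefficient c, trained with Adam, whose momentum terms help narrow the gap
# in convergence rates between the PDE loss and the initial/boundary loss.
import math
import torch
import torch.nn as nn

torch.manual_seed(0)


class MLP(nn.Module):
    """Fully connected network u_theta(x, t)."""
    def __init__(self, width=128, depth=4):
        super().__init__()
        layers, in_dim = [], 2
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), nn.Tanh()]
            in_dim = width
        layers += [nn.Linear(in_dim, 1)]
        self.net = nn.Sequential(*layers)

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=1))


def pde_residual(model, x, t, c):
    """Residual of u_t + c*u_x = 0, computed with autograd."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = model(x, t)
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    return u_t + c * u_x


c = 30.0                      # large coefficient: an assumed failure regime
model = MLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # momentum-based optimizer

for step in range(10000):
    # Collocation points for the PDE loss.
    x_r = 2 * math.pi * torch.rand(256, 1)
    t_r = torch.rand(256, 1)
    loss_pde = pde_residual(model, x_r, t_r, c).pow(2).mean()

    # Initial condition u(x, 0) = sin(x) and periodic boundary in x.
    x0 = 2 * math.pi * torch.rand(256, 1)
    loss_ic = (model(x0, torch.zeros_like(x0)) - torch.sin(x0)).pow(2).mean()
    t_b = torch.rand(128, 1)
    loss_bc = (model(torch.zeros(128, 1), t_b)
               - model(2 * math.pi * torch.ones(128, 1), t_b)).pow(2).mean()

    loss = loss_pde + loss_ic + loss_bc
    opt.zero_grad()
    loss.backward()
    opt.step()

    if step % 1000 == 0:
        # Per-term losses expose the convergence-rate gap discussed above.
        print(f"step {step}: pde {loss_pde.item():.3e}, "
              f"ic {loss_ic.item():.3e}, bc {loss_bc.item():.3e}")
```

Swapping torch.optim.Adam for torch.optim.SGD(model.parameters(), lr=..., momentum=0.9) gives a plain gradient-descent-with-momentum variant of the same sketch, and logging the per-term losses makes the gap between the PDE residual and the initial/boundary terms directly observable.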


Source
http://dx.doi.org/10.1016/j.neunet.2024.106963

