Elliptic PDE learning is provably data-efficient.

Proc Natl Acad Sci U S A

Mathematics Department, Cornell University, Ithaca, NY 14853-4201.

Published: September 2023

Partial differential equation (PDE) learning is an emerging field that combines physics and machine learning to recover unknown physical systems from experimental data. While deep learning models traditionally require copious amounts of training data, recent PDE learning techniques achieve spectacular results with limited data. Still, these results are empirical. Our work provides theoretical guarantees on the number of input-output training pairs required in PDE learning. Specifically, we exploit randomized numerical linear algebra and PDE theory to derive a provably data-efficient algorithm that recovers solution operators of three-dimensional uniformly elliptic PDEs from input-output data. The error converges exponentially with respect to the size of the training dataset, and the algorithm succeeds with exceptionally high probability.
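The randomized numerical linear algebra underlying such guarantees can be sketched in a few lines. The following is a hypothetical NumPy illustration, not the paper's algorithm: a symmetric matrix with exponentially decaying singular values stands in for a discretised solution operator (as suggested by the smoothness of Green's functions of elliptic PDEs), and a randomised range-finder rebuilds it from input-output pairs alone, with error decaying rapidly in the number of pairs.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical stand-in for a discretised solution operator: symmetric,
# with exponentially decaying singular values, as one expects from the
# smoothness of Green's functions of uniformly elliptic PDEs.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 2.0 ** -np.arange(n)
A = (U * s) @ U.T                             # we may only query f -> A @ f

def recover(num_pairs):
    """Randomised range-finder: rebuild A from input-output pairs alone."""
    X = rng.standard_normal((n, num_pairs))   # random input functions
    Y = A @ X                                 # observed outputs (training data)
    Q, _ = np.linalg.qr(Y)                    # orthonormal basis for the range
    Z = A @ Q                                 # a second round of queries
    return Q @ Z.T                            # A ~ Q (A Q)^T since A = A^T

for k in (5, 10, 20):
    rel_err = np.linalg.norm(A - recover(k)) / np.linalg.norm(A)
    print(k, rel_err)        # error decays rapidly with the number of pairs
```

The second round of queries (A applied to Q) uses the symmetry of the operator; for a non-self-adjoint operator one would need adjoint queries instead.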

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10523644
DOI: http://dx.doi.org/10.1073/pnas.2303904120


Similar Publications

Problem: In Australia, program accreditation requirements include that education providers monitor and evaluate teaching and learning environments and provide evidence of outcomes being used to inform program quality improvement. Yet, closing this loop has proven challenging.

Background: The Australian National Placement Evaluation Centre (NPEC) functions to measure the quality of placements through student evaluations.

Phosphodiesterase (PDE) enzymes regulate intracellular signaling pathways crucial for brain development and the pathophysiology of neurological disorders. Among the 11 PDE subtypes, PDE4 and PDE5 are particularly significant due to their regulation of cyclic adenosine monophosphate (cAMP) and cyclic guanosine monophosphate (cGMP) signaling, respectively, which are vital for learning, memory, and neuroprotection. This review synthesizes current evidence on the roles of PDE4 and PDE5 in neurological health and disease, focusing on their regulation of second messenger pathways and their implications for brain function.

Implementing the discontinuous-Galerkin finite element method using graph neural networks with application to diffusion equations.

Neural Netw

December 2024

Department of Earth Science and Engineering, Imperial College London, Prince Consort Road, London SW7 2BP, UK; Centre for AI-Physics Modelling, Imperial-X, White City Campus, Imperial College London, W12 7SL, UK.

Machine learning (ML) has benefited from both software and hardware advancements, leading to increasing interest in capitalising on ML throughout academia and industry. There have been efforts in the scientific computing community to leverage this development via implementing conventional partial differential equation (PDE) solvers with machine learning packages, most of which rely on structured spatial discretisation and fast convolution algorithms. However, unstructured meshes are favoured in problems with complex geometries.
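The connection drawn above between structured spatial discretisation and fast convolution algorithms can be shown directly. A minimal sketch, assuming a 1-D heat equation on a periodic grid (not the article's discontinuous-Galerkin method): the finite-difference Laplacian is exactly a convolution with the stencil [1, -2, 1], which is why structured-grid PDE solvers map so naturally onto ML convolution kernels.

```python
import numpy as np

# 1-D heat equation u_t = u_xx on a periodic grid.  The second-difference
# operator is a convolution with the stencil [1, -2, 1] / dx^2.
n, dx, dt = 64, 1.0 / 64, 1e-5
x = np.arange(n) * dx
u = np.sin(2 * np.pi * x)                  # initial condition
stencil = np.array([1.0, -2.0, 1.0]) / dx**2

for _ in range(1000):
    # Periodic padding, then a "valid" convolution: one Laplacian evaluation.
    lap = np.convolve(np.pad(u, 1, mode="wrap"), stencil, mode="valid")
    u = u + dt * lap                       # explicit Euler time step

# The exact solution decays like exp(-(2*pi)**2 * t); the discrete
# amplitude after t = 0.01 should be close to that factor.
print(np.max(np.abs(u)))
```

The explicit step is stable here because dt/dx² ≈ 0.04 < 0.5; on unstructured meshes, as the article notes, no such fixed convolution kernel exists.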

Synergistic learning with multi-task DeepONet for efficient PDE problem solving.

Neural Netw

January 2025

School of Engineering, Brown University, United States of America; Division of Applied Mathematics, Brown University, United States of America.

Multi-task learning (MTL) is an inductive transfer mechanism designed to leverage useful information from multiple tasks to improve generalization performance compared to single-task learning. It has been extensively explored in traditional machine learning to address issues such as data sparsity and overfitting in neural networks. In this work, we apply MTL to problems in science and engineering governed by partial differential equations (PDEs).
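Hard parameter sharing, the usual mechanism behind MTL in neural networks, can be sketched briefly. This is a generic illustration, not the article's multi-task DeepONet: two tasks share a trunk and keep only small task-specific heads, so most parameters are effectively trained on the pooled data of both tasks, which is how MTL combats data sparsity.

```python
import numpy as np

# Generic hard parameter sharing: a shared trunk plus per-task heads.
rng = np.random.default_rng(0)
d_in, d_hidden, d_out = 32, 64, 1

W_trunk = rng.standard_normal((d_in, d_hidden))          # shared by both tasks
heads = {t: rng.standard_normal((d_hidden, d_out))       # task-specific heads
         for t in ("task_a", "task_b")}

def forward(x, task):
    h = np.tanh(x @ W_trunk)          # shared representation
    return h @ heads[task]            # task-specific output

x = rng.standard_normal((8, d_in))
y_a, y_b = forward(x, "task_a"), forward(x, "task_b")

# Parameter count versus two fully separate networks of the same shape.
shared = W_trunk.size + sum(h.size for h in heads.values())
separate = 2 * (W_trunk.size + d_hidden * d_out)
print(shared, separate)
```

The shared trunk carries most of the parameters, so each gradient step on either task updates a representation both tasks use, the inductive transfer described above.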

A simple remedy for failure modes in physics informed neural networks.

Neural Netw

March 2025

Western University, Department of Computer Science, 1151 Richmond St, Middlesex College, London, N6A 5B7, Canada; Vector Institute, Toronto, 661 University Ave Suite 710, M5G 1M1, Ontario, Canada.

Physics-informed neural networks (PINNs) have shown promising results in solving a wide range of problems involving partial differential equations (PDEs). Nevertheless, there are several instances of the failure of PINNs when PDEs become more complex. Particularly, when PDE coefficients grow larger or PDEs become increasingly nonlinear, PINNs struggle to converge to the true solution.
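One way to see why growing coefficients trouble PINNs is to look at the scaling of the residual loss itself. The following is an illustrative sketch, not the article's remedy, assuming a Helmholtz-type model problem u'' + c²u = 0 with exact solution sin(cx): the same fixed, tiny error in the approximate solution is penalised roughly like c⁴, so the loss landscape becomes badly scaled as the coefficient grows.

```python
import numpy as np

# PINN-style mean-squared residual for u'' + c^2 u = 0, evaluated by
# finite differences at collocation points.  A fixed small deviation from
# the exact solution sin(c x) incurs a residual that grows with c.
n = 4001
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]

def residual_loss(u, c):
    u_xx = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2      # interior u''
    return np.mean((u_xx + c**2 * u[1:-1]) ** 2)         # mean squared residual

perturbation = 1e-3 * x          # fixed, tiny deviation from the true solution
losses = {c: residual_loss(np.sin(c * x) + perturbation, c)
          for c in (1.0, 10.0, 50.0)}
print(losses)                    # grows roughly like c**4
```

A real PINN computes u'' by automatic differentiation rather than finite differences, but the scaling argument is the same: larger coefficients amplify the residual penalty on identical solution errors, stiffening the optimisation.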
