Approximating nonlinear differential equations using a neural network provides a robust and efficient tool for various scientific computing tasks, including real-time prediction, inverse problems, optimal control, and surrogate modeling. Previous works have focused on embedding dynamical systems into networks through two approaches: learning a single operator (i.e., the mapping from parameterised input functions to solutions) or learning the governing system of equations (i.e., the constitutive model in terms of the state variables). These two approaches yield different representations of the same underlying data or function. Observing that families of differential equations often share key characteristics, we seek one network representation across a wide range of equations. Our multimodal approach, called Predicting Multiple Operators and Symbolic Expressions (PROSE), is capable of constructing multi-operators and governing equations simultaneously through a novel fusion structure. In particular, PROSE solves differential equations, predicts future states, and generates the underlying equations of motion by incorporating symbolic "words" through a language model. Experiments with 25,600 distinct equations show that PROSE benefits from its multimodal nature, resulting in robust generalization (e.g., noisy observations, equation misspecification, and data imbalance) supported by comparison and ablation studies. PROSE provides a new operator learning framework that incorporates multimodal input/output and language models for solving forward and inverse problems related to differential equations.
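To make the described architecture concrete, the sketch below shows one plausible way to combine a data (trajectory) encoder and a symbolic-token encoder through a fusion stage, with separate heads for numeric operator output and equation tokens. This is a minimal illustrative sketch assuming PyTorch-style modules; all layer sizes, the vocabulary, and names such as `ProseSketch`, `operator_head`, and `symbol_head` are hypothetical choices, not the authors' implementation.

```python
# Illustrative sketch of a multimodal fusion model in the spirit of PROSE:
# encode data and symbolic modalities, fuse them with cross-attention, and
# decode into (i) predicted future states and (ii) symbolic equation tokens.
# Hypothetical sizes and module names; not the paper's actual architecture.
import torch
import torch.nn as nn


class ProseSketch(nn.Module):
    def __init__(self, state_dim=2, d_model=128, vocab_size=64, n_heads=4):
        super().__init__()
        # Encode sampled trajectory snapshots (data modality).
        self.data_embed = nn.Linear(state_dim, d_model)
        self.data_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True),
            num_layers=2,
        )
        # Encode symbolic "words" describing a (possibly misspecified) equation.
        self.symbol_embed = nn.Embedding(vocab_size, d_model)
        self.symbol_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True),
            num_layers=2,
        )
        # Fusion via cross-attention between the two modalities.
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Two output heads: numeric operator prediction and symbolic tokens.
        self.operator_head = nn.Linear(d_model, state_dim)
        self.symbol_head = nn.Linear(d_model, vocab_size)

    def forward(self, data_seq, symbol_tokens):
        # data_seq: (batch, time, state_dim); symbol_tokens: (batch, seq_len)
        h_data = self.data_encoder(self.data_embed(data_seq))
        h_sym = self.symbol_encoder(self.symbol_embed(symbol_tokens))
        # Each modality attends to the other before decoding.
        fused_data, _ = self.cross_attn(h_data, h_sym, h_sym)
        fused_sym, _ = self.cross_attn(h_sym, h_data, h_data)
        state_pred = self.operator_head(fused_data)  # predicted future states
        token_logits = self.symbol_head(fused_sym)   # equation-token logits
        return state_pred, token_logits


# Example usage with random inputs (shapes only, no real data).
model = ProseSketch()
data = torch.randn(8, 50, 2)            # 8 trajectories, 50 time steps, 2 states
tokens = torch.randint(0, 64, (8, 20))  # 20 symbolic tokens per trajectory
states, logits = model(data, tokens)    # (8, 50, 2) and (8, 20, 64)
```

In such a setup, the numeric head would be trained with a regression loss on future states and the symbolic head with a token-level classification loss, reflecting the abstract's description of simultaneously predicting states and generating equations of motion.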
DOI: 10.1016/j.neunet.2024.106707