Using machine learning to improve the hard modeling of NMR time series.

J Magn Reson

Universität Rostock, Institut für Mathematik, 18057 Rostock, Germany; Leibniz-Institut für Katalyse e.V., 18059 Rostock, Germany.

Published: December 2024

Modeling time series of NMR spectra is a useful method to accurately extract information such as temporal concentration profiles from complex processes, e.g., reactions. Modeling these time series by nonlinear optimization often suffers from high runtimes. Deep learning, on the other hand, solves the modeling problem quickly, especially for single spectra with well-separated peaks; however, its accuracy decreases significantly when peaks overlap or cross. We propose a hybrid approach that combines the strengths of both methods while mitigating their drawbacks. This hybrid method improves on previous work (Meinhardt et al., 2022) and employs neural networks to predict initial parameters for the optimization algorithm, which then only needs to fine-tune the parameters. We present results for both constructed and experimental data sets and achieve improvements in both runtime and accuracy.
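The abstract's hybrid idea can be illustrated with a minimal sketch: a predictor supplies initial peak parameters (amplitude, position, width) for a hard model built from Lorentzian line shapes, and a local nonlinear least-squares solver fine-tunes them. All names here are illustrative assumptions, not the paper's actual implementation; in particular, `predict_initial_params` stands in for the trained neural network and uses a simple peak picker instead.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.signal import find_peaks

def lorentzian(x, amp, pos, width):
    # Lorentzian line shape, a common hard model for a single NMR peak.
    return amp * width**2 / ((x - pos)**2 + width**2)

def spectrum_model(params, x, n_peaks):
    # Sum of n_peaks Lorentzians; params = [amp, pos, width] per peak.
    p = np.asarray(params).reshape(n_peaks, 3)
    return sum(lorentzian(x, *row) for row in p)

def predict_initial_params(spectrum, x, n_peaks):
    # Stand-in for the neural-network predictor (assumption): pick the
    # n_peaks tallest local maxima as rough initial guesses.
    peaks, _ = find_peaks(spectrum)
    top = peaks[np.argsort(spectrum[peaks])[-n_peaks:]]
    return np.concatenate([[spectrum[i], x[i], 0.05] for i in sorted(top)])

def fit_spectrum(spectrum, x, n_peaks):
    # Hybrid step: predicted initial guess, then least-squares refinement.
    p0 = predict_initial_params(spectrum, x, n_peaks)
    res = least_squares(lambda p: spectrum_model(p, x, n_peaks) - spectrum, p0)
    return res.x.reshape(n_peaks, 3)

# Synthetic two-peak spectrum to exercise the fit.
x = np.linspace(0, 1, 400)
true_peaks = [(1.0, 0.3, 0.03), (0.7, 0.62, 0.04)]
y = sum(lorentzian(x, *pk) for pk in true_peaks)
fitted = fit_spectrum(y, x, n_peaks=2)
```

Because the optimizer starts near the true parameters, it converges in a handful of iterations; with a poor (e.g., random) initialization, the same solver can stall in a local minimum, which is the runtime/accuracy trade-off the hybrid approach targets.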


Source: http://dx.doi.org/10.1016/j.jmr.2024.107813


