Modeling time series of NMR spectra is a useful method for accurately extracting information such as temporal concentration profiles from complex processes, e.g., chemical reactions. Modeling these time series with nonlinear optimization often suffers from long runtimes. Deep learning, by contrast, solves the modeling problem quickly, especially for single spectra with well-separated peaks, but its accuracy decreases significantly when peaks overlap or cross. We propose a hybrid approach that combines the strengths of both methods while mitigating their drawbacks. This hybrid method improves on previous work (Meinhardt et al., 2022) and employs neural networks to predict initial parameters for the optimization algorithm, which then only needs to fine-tune them. We present results for both constructed and experimental data sets and achieve improvements in both runtime and accuracy.
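A minimal sketch of the hybrid idea described above: a neural network proposes initial peak parameters from a measured spectrum, and a local nonlinear least-squares solver fine-tunes them. The line-shape model (Lorentzian peaks) and the name `init_net` are illustrative assumptions; the paper's actual architecture and parameterization may differ.

```python
import numpy as np
from scipy.optimize import least_squares


def lorentzians(params, x):
    """Sum of Lorentzian peaks; params is a flat array of (amp, pos, width) triples."""
    y = np.zeros_like(x)
    for amp, pos, width in params.reshape(-1, 3):
        y += amp * width**2 / ((x - pos) ** 2 + width**2)
    return y


def fit_spectrum(x, spectrum, init_net):
    # Step 1: a cheap neural-network forward pass yields a starting point,
    # even when peaks overlap. `init_net` is a hypothetical pretrained model.
    p0 = init_net(spectrum)  # -> flat array of (amp, pos, width) triples

    # Step 2: nonlinear optimization fine-tunes the parameters. Starting
    # near the optimum keeps runtime low and avoids poor local minima.
    result = least_squares(
        lambda p: lorentzians(p, x) - spectrum,
        x0=p0,
        method="lm",  # Levenberg-Marquardt, suitable for small dense problems
    )
    return result.x
```

For a time series, one could apply `fit_spectrum` to each spectrum in turn, optionally warm-starting each fit from the previous time step's refined parameters instead of (or in addition to) the network's prediction.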
DOI: http://dx.doi.org/10.1016/j.jmr.2024.107813