HANNA: hard-constraint neural network for consistent activity coefficient prediction.

Chem Sci

Laboratory of Engineering Thermodynamics (LTD), RPTU Kaiserslautern, Germany

Published: December 2024

We present the first hard-constraint neural network model for predicting activity coefficients (HANNA), a thermodynamic mixture property that is the basis for many applications in science and engineering. Unlike traditional neural networks, which ignore physical laws and result in inconsistent predictions, our model is designed to strictly adhere to all thermodynamic consistency criteria. By leveraging deep-set neural networks, HANNA maintains symmetry under the permutation of the components. Furthermore, by hard-coding physical constraints in the model architecture, we ensure consistency with the Gibbs-Duhem equation and in modeling the pure components. The model was trained and evaluated on 317 421 data points for activity coefficients in binary mixtures from the Dortmund Data Bank, achieving significantly higher prediction accuracies than the current state-of-the-art model UNIFAC. Moreover, HANNA only requires the SMILES of the components as input, making it applicable to any binary mixture of interest. HANNA is fully open-source and available for free use.
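The hard-constraint idea described in the abstract can be illustrated with a minimal numpy sketch. This is not HANNA's actual architecture, just a toy model under two assumptions it shares with the paper's design: (i) a permutation-invariant mixture descriptor built by summing per-component contributions (the deep-set trick), and (ii) a dimensionless excess Gibbs energy of the form gE(x1) = x1·(1 − x1)·NN(x1), which vanishes for the pure components by construction; activity coefficients derived from gE via the exact thermodynamic relations ln γ1 = gE + x2·dgE/dx1 and ln γ2 = gE − x1·dgE/dx1 then satisfy the Gibbs-Duhem equation automatically. All weights and function names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy deep-set descriptor: summing over components makes the output
# invariant to swapping (emb1, x1) <-> (emb2, 1 - x1).
W = rng.normal(size=(4, 4))

def mixture_descriptor(emb1, emb2, x1):
    return np.tanh(W @ (x1 * emb1)) + np.tanh(W @ ((1.0 - x1) * emb2))

# Hard constraint: gE(x1) = x1 * (1 - x1) * NN(x1) is zero at x1 = 0 and
# x1 = 1, so the pure-component limits ln(gamma_i) -> 0 hold exactly.
w1, b1 = rng.normal(size=8), rng.normal(size=8)
w2 = rng.normal(size=8)

def gE(x1):
    return x1 * (1.0 - x1) * (np.tanh(w1 * x1 + b1) @ w2)

def ln_gammas(x1, h=1e-6):
    # Exact relations: ln g1 = gE + x2 * gE', ln g2 = gE - x1 * gE'
    dgE = (gE(x1 + h) - gE(x1 - h)) / (2.0 * h)  # central difference
    return gE(x1) + (1.0 - x1) * dgE, gE(x1) - x1 * dgE

# Gibbs-Duhem check at constant T, p:
#   x1 * d(ln g1)/dx1 + x2 * d(ln g2)/dx1 = 0, here by construction.
x1, h = 0.3, 1e-5
g1p, g2p = ln_gammas(x1 + h)
g1m, g2m = ln_gammas(x1 - h)
residual = x1 * (g1p - g1m) / (2 * h) + (1 - x1) * (g2p - g2m) / (2 * h)
# |residual| stays at floating-point noise level for any weights.
```

The point of the construction is that consistency is not learned from data but guaranteed by the architecture: any choice of the network weights yields a gE surface whose derived activity coefficients obey the Gibbs-Duhem equation and the pure-component limits.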


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11575590
DOI: http://dx.doi.org/10.1039/d4sc05115g



