Artificial intelligence (AI) and machine learning (ML) are increasingly used in materials science to build predictive models and accelerate discovery. For selected properties, the availability of large databases has also enabled the application of deep learning (DL) and transfer learning (TL). However, the lack of large datasets for the majority of properties prevents the widespread application of DL/TL. We present a cross-property deep-transfer-learning framework that leverages models trained on large datasets to build models on small datasets of different properties. We test the proposed framework on 39 computational and two experimental datasets and find that the TL models, using only elemental fractions as input, outperform ML/DL models trained from scratch, even when the latter are allowed to use physical attributes as input, for 27 of the 39 (≈69%) computational datasets and for both experimental datasets. We believe the proposed framework can be broadly useful for tackling the small-data challenge in applying AI/ML to materials science.
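To make the described workflow concrete, below is a minimal, illustrative sketch of cross-property transfer learning on elemental-fraction inputs, written in PyTorch. The layer sizes, the 86-element composition vector, the training loops, and the random placeholder data are assumptions made purely for illustration; they are not the architecture or training procedure reported in the paper. The essential pattern is the one described in the abstract: pretrain a network on a large dataset of one property, then reuse its weights, with a fresh output head, to fine-tune on a small dataset of a different property.

# Minimal sketch of cross-property transfer learning (PyTorch), assuming:
# - an MLP "source" model pretrained on a large dataset of one property,
# - elemental fractions (a fixed-length composition vector) as the only input,
# - fine-tuning on a small dataset of a different target property.
# All data below are random placeholders; real use would load composition/property pairs.

import torch
import torch.nn as nn

N_ELEMENTS = 86  # length of the elemental-fraction vector (assumed for illustration)

def make_mlp(in_dim: int) -> nn.Sequential:
    """Simple fully connected regressor over elemental fractions."""
    return nn.Sequential(
        nn.Linear(in_dim, 256), nn.ReLU(),
        nn.Linear(256, 64), nn.ReLU(),
        nn.Linear(64, 1),
    )

loss_fn = nn.MSELoss()

# 1) Pretrain the source model on a large dataset of property A (placeholder data).
source = make_mlp(N_ELEMENTS)
x_large, y_large = torch.rand(5000, N_ELEMENTS), torch.rand(5000, 1)
opt = torch.optim.Adam(source.parameters(), lr=1e-3)
for _ in range(50):
    opt.zero_grad()
    loss_fn(source(x_large), y_large).backward()
    opt.step()

# 2) Transfer: copy the pretrained weights, replace the output head,
#    and fine-tune on a small dataset of a different property B.
target = make_mlp(N_ELEMENTS)
target.load_state_dict(source.state_dict())
target[-1] = nn.Linear(64, 1)  # fresh head for the new property
x_small, y_small = torch.rand(200, N_ELEMENTS), torch.rand(200, 1)  # small target dataset
opt = torch.optim.Adam(target.parameters(), lr=1e-4)  # smaller learning rate for fine-tuning
for _ in range(100):
    opt.zero_grad()
    loss_fn(target(x_small), y_small).backward()
    opt.step()

In this sketch the whole network is fine-tuned; a common alternative is to freeze the pretrained layers and train only the new head, which is a design choice rather than something specified by the abstract.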

Source (PMC): http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8594437
DOI: http://dx.doi.org/10.1038/s41467-021-26921-5
