Generative Pretrained Transformer for Heterogeneous Catalysts.

J Am Chem Soc

Department of Chemical and Biomolecular Engineering, Institute of Emergent Materials, Sogang University, Seoul 04107, Republic of Korea.

Published: December 2024

Discovery of novel and promising materials is a critical challenge in chemistry and materials science, traditionally approached through methodologies ranging from trial-and-error to machine-learning-driven inverse design. Recent studies suggest that transformer-based language models can be utilized as material generative models to expand the chemical space and explore materials with desired properties. In this work, we introduce the catalyst generative pretrained transformer (CatGPT), trained to generate string representations of inorganic catalyst structures from a vast chemical space. CatGPT not only demonstrates high performance in generating valid and accurate catalyst structures but also serves as a foundation model for generating desired types of catalysts through text-conditioning and fine-tuning. As an example, we fine-tuned the pretrained CatGPT using a binary alloy catalyst data set designed for screening two-electron oxygen reduction reaction (2e-ORR) catalysts and generated catalyst structures specialized for 2e-ORR. Our work demonstrates the potential of generative language models as tools for catalyst discovery.
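The abstract describes training a GPT-style model on string representations of inorganic crystal structures. As a minimal illustration of what such a serialization might look like, the sketch below flattens a structure's composition, lattice parameters, and fractional atomic coordinates into a token sequence; the format, function name, and the Pt-Hg example cell are illustrative assumptions, not the actual encoding scheme used by CatGPT.

```python
# Hypothetical sketch of serializing a crystal structure into a token
# string that a GPT-style model could be trained on. This format is an
# assumption for illustration, not CatGPT's actual representation.

def structure_to_tokens(formula, lattice, sites):
    """Flatten composition, lattice lengths (angstroms), and fractional
    coordinates into a whitespace-separated token sequence."""
    tokens = [formula]
    tokens += [f"{x:.2f}" for x in lattice]  # a, b, c lattice lengths
    for element, (fx, fy, fz) in sites:
        tokens += [element, f"{fx:.2f}", f"{fy:.2f}", f"{fz:.2f}"]
    return " ".join(tokens)

# Example: a hypothetical two-atom Pt-Hg binary alloy cell
seq = structure_to_tokens(
    "PtHg",
    (4.00, 4.00, 4.00),
    [("Pt", (0.0, 0.0, 0.0)), ("Hg", (0.5, 0.5, 0.5))],
)
print(seq)
# PtHg 4.00 4.00 4.00 Pt 0.00 0.00 0.00 Hg 0.50 0.50 0.50
```

A language model trained on such sequences can then be conditioned or fine-tuned, as the paper does for 2e-ORR binary alloys, to sample new candidate structures token by token.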


Source: http://dx.doi.org/10.1021/jacs.4c11504

Publication Analysis

Top Keywords: catalyst structures (12); generative pretrained (8); pretrained transformer (8); language models (8); chemical space (8); catalyst (7); generative (5); transformer heterogeneous (4); heterogeneous catalysts (4); catalysts discovery (4)

