
atomgpt


AtomGPT & DiffractGPT: atomistic generative pre-trained transformer for forward and inverse materials design

Large language models (LLMs) such as ChatGPT have shown immense potential for various commercial applications, but their applicability for materials design remains underexplored. In this work, AtomGPT is introduced as a model specifically developed for materials design based on transformer architectures, demonstrating capabilities for both atomistic property prediction and structure generation tasks. This study shows that a combination of chemical and structural text descriptions can efficiently predict material properties with accuracy comparable to graph neural network models, including formation energies, electronic bandgaps from two different methods, and superconducting transition temperatures. Furthermore, AtomGPT can generate atomic structures for tasks such as designing new superconductors, with the predictions validated through density functional theory calculations. This work paves the way for leveraging LLMs in forward and inverse materials design, offering an efficient approach to the discovery and optimization of materials.

AtomGPT layer schematic

Both the forward and inverse models take a config.json file as input. The config file provides basic training parameters and the path to an id_prop.csv file, similar to the ALIGNN (https://github.com/usnistgov/alignn) model. See an example here: id_prop.csv.
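A minimal illustration of the expected id_prop.csv layout (one structure file name and its target value per row, ALIGNN-style; the file names and values below are hypothetical, see the linked example for the exact format):

POSCAR-0001.vasp,-0.42
POSCAR-0002.vasp,1.37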

Installation

First, create a conda environment: install Miniforge (https://github.com/conda-forge/miniforge).

For example:

wget "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh"

Depending on your system, this downloads an installer named something like 'Miniforge3-XYZ'. Run it:

bash Miniforge3-$(uname)-$(uname -m).sh

Now, make a conda environment:

conda create --name my_atomgpt python=3.10 -y
conda activate my_atomgpt
git clone https://github.com/usnistgov/atomgpt.git
cd atomgpt
pip install -e .

Forward model example (structure to property)

Forward models are used to develop surrogate models that map atomic structures to properties. They require text input, which can be either raw POSCAR-type files or a text description of the material. Models such as Google T5 or OpenAI GPT-2 can then be used, with a customized language head, to accomplish this task. The text description of a material is generated with the ChemNLP describer function; if you set convert to False, you can also train directly on bare POSCAR files.
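As a hedged sketch of preparing such a dataset folder (POSCAR files plus id_prop.csv) from the JARVIS dft_3d database using jarvis-tools; the function and key names used here (data, Atoms.from_dict, Poscar, 'formation_energy_peratom') should be checked against your installed jarvis-tools version, and the folder name is arbitrary:

import os
from jarvis.db.figshare import data          # downloads and caches the dft_3d dataset
from jarvis.core.atoms import Atoms
from jarvis.io.vasp.inputs import Poscar

os.makedirs("fm_data", exist_ok=True)
rows = []
for entry in data("dft_3d")[:50]:            # small subset for a quick test
    target = entry.get("formation_energy_peratom")
    if target is None or target == "na":     # skip missing values
        continue
    fname = "POSCAR-" + entry["jid"] + ".vasp"
    Poscar(Atoms.from_dict(entry["atoms"])).write_file(os.path.join("fm_data", fname))
    rows.append(fname + "," + str(target))

with open(os.path.join("fm_data", "id_prop.csv"), "w") as f:
    f.write("\n".join(rows))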

For training:

python atomgpt/forward_models/forward_models.py --config_name atomgpt/examples/forward_model/config.json

or use the atomgpt_forward_train global executable.
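The global executable is assumed to accept the same --config_name flag as the script above (check atomgpt_forward_train --help to confirm):

atomgpt_forward_train --config_name atomgpt/examples/forward_model/config.json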

For inference:

python atomgpt/forward_models/forward_predict.py --output_dir out --pred_csv atomgpt/examples/forward_model/pred_list_forward.csv

or use the atomgpt_forward_predict global executable.
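Likewise, the predict executable is assumed to take the same flags as forward_predict.py:

atomgpt_forward_predict --output_dir out --pred_csv atomgpt/examples/forward_model/pred_list_forward.csv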

Inverse model example (property to structure)

Inverse models are used to generate materials given a target property and a description such as a chemical formula. Currently, we use the Mistral model, but other models such as Gemma, Llama, etc. can also be used easily. After structure generation, the structures can be optimized with the ALIGNN-FF model (example here) and then subjected to density functional theory calculations for a few selected candidates using JARVIS-DFT or a similar workflow (tutorial here). Note that currently, both inverse model training and inference require GPUs. A minimal ALIGNN-FF relaxation sketch is included after the inference example below.

For training:

python atomgpt/inverse_models/inverse_models.py --config_name atomgpt/examples/inverse_model/config.json

or use the atomgpt_inverse_train global executable.

For inference:

python atomgpt/inverse_models/inverse_predict.py --output_dir outputs/ --pred_csv "atomgpt/examples/inverse_model/pred_list_inverse.csv"

or use the atomgpt_inverse_predict global executable.
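As a hedged post-processing sketch: a generated structure can be relaxed with ALIGNN-FF through its ASE calculator before selecting candidates for DFT. This assumes the alignn and ase packages are installed and that AlignnAtomwiseCalculator/default_path are exposed by your alignn version; the input path is hypothetical.

from ase.io import read
from ase.optimize import FIRE
from alignn.ff.ff import AlignnAtomwiseCalculator, default_path  # ALIGNN-FF ASE calculator (assumed API)

# Hypothetical path to a POSCAR produced by the inverse model
atoms = read("outputs/POSCAR_generated", format="vasp")
atoms.calc = AlignnAtomwiseCalculator(path=default_path())  # load the default pre-trained force field
FIRE(atoms).run(fmax=0.05, steps=200)                       # relax atomic positions only
atoms.write("POSCAR_relaxed", format="vasp")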

DiffractGPT model example (spectral property to structure)

Inverse models can also be used to generate materials given a spectral or multi-valued property, such as an X-ray diffraction pattern, along with a description such as a chemical formula.

For training:

python atomgpt/inverse_models/inverse_models.py --config_name atomgpt/examples/inverse_model_multi/config.json

For inference:

python atomgpt/inverse_models/inverse_predict.py --output_dir outputs_xrd --pred_csv atomgpt/examples/inverse_model_multi/pred_list_inverse.csv

or if you want to use the original model:

python atomgpt/inverse_models/inverse_predict.py --output_dir atomgpt/examples/inverse_model_multi --pred_csv atomgpt/examples/inverse_model_multi/pred_list_inverse.csv

Example inference-only case:

Make a tmp/pred_list.csv file containing:

LaB6.dat

You can add multiple .dat files with 2theta, intensity values to this CSV file.
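Each .dat file holds the pattern as two columns, 2theta and intensity, one pair per line. A hypothetical illustration (check the example .dat files in the repository for the exact delimiter and range):

21.36 100.0
30.38 42.7
37.44 18.3
...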

Then add a tmp/config.json file:

{
    "id_prop_path": "atomgpt/examples/inverse_model_multi/id_prop.csv",
    "prefix": "atomgpt_run",
    "model_name": "knc6/diffractgpt_mistral_chemical_formula",
    "batch_size": 2,
    "num_epochs": 2,
    "logging_steps": 1,
    "dataset_num_proc": 2,
    "seed_val": 3407,
    "learning_rate": 0.0002,
    "per_device_train_batch_size": 2,
    "gradient_accumulation_steps": 4,
    "num_train": 2,
    "num_val": 0,
    "num_test": 2,
    "model_save_path": "",
    "loss_type": "default",
    "optim": "adamw_8bit",
    "lr_scheduler_type": "linear",
    "output_dir": "outputs_xrd",
    "csv_out": "AI-AtomGen-prop-dft_3d-test-rmse.csv",
    "chem_info": "formula",
    "max_seq_length": 2048,
    "prop": "XRD",
    "dtype": null,
    "load_in_4bit": true,
    "instruction": "Below is a description of a material.",
    "alpaca_prompt": "### Instruction:\n{}\n### Input:\n{}\n### Output:\n{}",
    "output_prompt": " Generate atomic structure description with lattice lengths, angles, coordinates and atom types."
}

This data was generated with the example script atomgpt/scripts/gen_data.py.

python atomgpt/inverse_models/inverse_predict.py --output_dir atomgpt/examples/inverse_model_multi/tmp  --pred_csv atomgpt/examples/inverse_model_multi/tmp/pred_list.csv
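For reference, a minimal sketch of writing such a two-column .dat file with numpy; the two-theta and intensity arrays below are synthetic placeholders (use measured or simulated data instead), and this is not a reproduction of atomgpt/scripts/gen_data.py:

import numpy as np

# Placeholder pattern: replace with real 2theta/intensity arrays
two_theta = np.linspace(5.0, 90.0, 1000)
intensity = np.exp(-((two_theta - 30.4) ** 2) / 0.05)  # one synthetic Gaussian peak for illustration

# One whitespace-separated (2theta, intensity) pair per line
np.savetxt("tmp/LaB6.dat", np.column_stack([two_theta, intensity]), fmt="%.4f")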

More detailed examples and case studies will be added here soon.

Google colab/Jupyter notebook

Notebooks (Google Colab):

  • Forward Model training: Example of forward model training for exfoliation energy.
  • Inverse Model training: Example of installing AtomGPT, inverse model training for 5 sample materials, using the trained model for inference, relaxing structures with ALIGNN-FF, and generating a database of atomic structures.
  • HuggingFace AtomGPT model inference: AtomGPT structure generation/inference example with a model hosted on HuggingFace.
  • Inverse Model DiffractGPT inference: Example of predicting crystal structure from X-ray diffraction data.

For other similar notebook examples, see the JARVIS-Tools-Notebook Collection.

HuggingFace link

https://huggingface.co/knc6

References:

  1. AtomGPT: Atomistic Generative Pretrained Transformer for Forward and Inverse Materials Design
  2. DiffractGPT: Atomic Structure Determination from X-ray Diffraction Patterns using Generative Pre-trained Transformer
  3. ChemNLP: A Natural Language Processing based Library for Materials Chemistry Text Data
  4. JARVIS-Leaderboard
  5. NIST-JARVIS Infrastructure
  6. Unsloth AI

How to contribute

For detailed instructions, please see the Contribution instructions.

Correspondence

Please report bugs as GitHub issues (https://github.com/usnistgov/atomgpt/issues) or email kamal.choudhary@nist.gov.

Funding support

NIST-MGI (https://www.nist.gov/mgi) and CHIPS (https://www.nist.gov/chips)

Code of conduct

Please see Code of conduct
