So3krates-torch
PyTorch implementation of the So3krates neural network potential for atomistic simulations.
[!IMPORTANT] The code is work in progress! There may be breaking changes!
Lightweight implementation of the So3krates model in PyTorch. This package is primarily intended for aims-PAX, but it is a functional implementation of So3krates and SO3LR in PyTorch. For now it uses (modified) source code from the MACE package and follows its style, so many functions are directly compatible.
Installation
- activate your environment
- clone this repository
- move into the cloned repository
- install the requirements and the package:

pip install -r requirements.txt
pip install .
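The steps above can be run as a single sequence. Note that the repository URL and directory name below are placeholders, not given in this document; substitute the actual values:

```shell
# activate your environment first (e.g. conda activate <env> or source <venv>/bin/activate)
git clone <repository-url>        # replace with the actual repository URL
cd so3krates-torch                # directory name is an assumption
pip install -r requirements.txt   # install dependencies
pip install .                     # install the package itself
```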
Implemented features:
1. ASE calculator for MD (including pre-trained SO3LR)
2. Inference over ASE-readable datasets: torchkrates-eval
3. Error metrics over ASE-readable datasets: torchkrates-test
4. Transforming PyTorch and JAX parameter formats: torchkrates-jax2torch or torchkrates-torch2jax
5. Training is WIP, but training_tools.train is already working, so you can easily build your own training script
[!IMPORTANT] Item 4 means that you can transform the weights from this PyTorch version into the JAX version and vice versa. Inference and training are much faster (by at least one order of magnitude) in the JAX version. This implementation is mostly for prototyping and compatibility with other packages.
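The command-line entry points listed above are installed with the package. Their flags are not documented here; assuming they are standard argparse-style CLIs (an assumption — consult the repository for actual usage), each should print its options via --help:

```shell
# inspect available options for each tool
torchkrates-eval --help        # inference over ASE-readable datasets
torchkrates-test --help        # error metrics over ASE-readable datasets
torchkrates-jax2torch --help   # convert JAX parameters to PyTorch format
torchkrates-torch2jax --help   # convert PyTorch parameters to JAX format
```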
TODO
- training
- Hirshfeld loss
- load training params from YAML
- training script
- fine-tuning
- save and load hyperparameter JSON from torchkrates
- enable TorchScript (important for OpenMM)
Cite
If you are using the models implemented here, please cite:
@article{kabylda2024molecular,
title={Molecular Simulations with a Pretrained Neural Network and Universal Pairwise Force Fields},
author={Kabylda, A. and Frank, J. T. and Dou, S. S. and Khabibrakhmanov, A. and Sandonas, L. M.
and Unke, O. T. and Chmiela, S. and M{\"u}ller, K.R. and Tkatchenko, A.},
journal={ChemRxiv},
year={2024},
doi={10.26434/chemrxiv-2024-bdfr0-v2}
}
@article{frank2024euclidean,
title={A Euclidean transformer for fast and stable machine learned force fields},
author={Frank, Thorben and Unke, Oliver and M{\"u}ller, Klaus-Robert and Chmiela, Stefan},
journal={Nature Communications},
volume={15},
number={1},
pages={6539},
year={2024}
}
Also consider citing MACE, as this software heavily leans on or uses its code:
@inproceedings{Batatia2022mace,
title={{MACE}: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields},
author={Ilyes Batatia and David Peter Kovacs and Gregor N. C. Simm and Christoph Ortner and Gabor Csanyi},
booktitle={Advances in Neural Information Processing Systems},
editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
year={2022},
url={https://openreview.net/forum?id=YPpSngE-ZU}
}
@misc{Batatia2022Design,
title = {The Design Space of E(3)-Equivariant Atom-Centered Interatomic Potentials},
author = {Batatia, Ilyes and Batzner, Simon and Kov{\'a}cs, D{\'a}vid P{\'e}ter and Musaelian, Albert and Simm, Gregor N. C. and Drautz, Ralf and Ortner, Christoph and Kozinsky, Boris and Cs{\'a}nyi, G{\'a}bor},
year = {2022},
number = {arXiv:2205.06643},
eprint = {2205.06643},
eprinttype = {arxiv},
doi = {10.48550/arXiv.2205.06643},
archiveprefix = {arXiv}
}
Contact
If you have questions you can reach me at: tobias.henkes@uni.lu
For bugs or feature requests, please use GitHub Issues.
License
The code is published and distributed under the MIT License.