MACE models

Effortlessly integrate pre-trained MACE models into your projects with the user-friendly mace-models package.

pip install mace-models

Loading models is simple with a few lines of code:

import mace_models

# Load the default MACE model
model = mace_models.load()

# Access the underlying torch module, an ASE calculator, or the raw model file
torch_model = model.get_model()
ase_calculator = model.get_calculator()
model_file = model.get_file()

Customize your model selection by loading models from various repositories or specific revisions:

model = mace_models.load(
    "<model-name>",
    rev="<branch-or-sha>",
    remote="https://github.com/<user>/<repo>"
)
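The remote accepts either a Git URL or a local path to a clone of the model repository. As a standalone illustration of that distinction (a hypothetical helper for this README, not part of the mace-models API):

```python
from urllib.parse import urlparse

def is_url_remote(remote: str) -> bool:
    # Treat anything with an http(s) scheme as a Git URL;
    # anything else is assumed to be a local clone of the model repository.
    return urlparse(remote).scheme in ("http", "https")

print(is_url_remote("https://github.com/RokasEl/MACE-Models"))  # True
print(is_url_remote("/path/to/MACE-Models"))                    # False
```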

Tip: To keep the models available locally, clone the model repository and pull the files with DVC:

git clone https://github.com/RokasEl/MACE-Models
cd MACE-Models
dvc pull

Then load from the local clone with mace_models.load(remote="/path/to/MACE-Models").

Models

All models in the MACE-Models repository are available under the MIT license.

Example usage with ASE

import mace_models
from ase.build import molecule

# Load the small MACE-MP-0 foundation model
model = mace_models.load("MACE-MP-0_small")

water = molecule("H2O")
water.calc = model.get_calculator(dtype="float64")

print(water.get_potential_energy())
# -14.047933366728305
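The dtype argument selects the floating-point precision of the underlying torch model: float64 is slower but avoids single-precision rounding, which can matter for tight geometry optimizations. A standalone sketch of that rounding gap, reusing the energy value above (no MACE installation required):

```python
import struct

e = -14.047933366728305  # double-precision energy from the example above

# Round-trip the value through a 32-bit float to see what float32 would keep.
e32 = struct.unpack("f", struct.pack("f", e))[0]

print(abs(e32 - e))  # small but nonzero rounding error (roughly 1e-7 scale)
```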

References

@article{kovacsEvaluationMACEForce2023,
  title = {Evaluation of the {{MACE}} Force Field Architecture: {{From}} Medicinal Chemistry to Materials Science},
  shorttitle = {Evaluation of the {{MACE}} Force Field Architecture},
  author = {Kov{\'a}cs, D{\'a}vid P{\'e}ter and Batatia, Ilyes and Arany, Eszter S{\'a}ra and Cs{\'a}nyi, G{\'a}bor},
  year = {2023},
  journal = {The Journal of Chemical Physics},
  volume = {159},
  number = {4},
  pages = {044118},
  issn = {0021-9606},
  doi = {10.1063/5.0155322},
  urldate = {2024-01-17}
}
@misc{batatiaFoundationModelAtomistic2023,
  title = {A Foundation Model for Atomistic Materials Chemistry},
  author = {Batatia, Ilyes and Benner, Philipp and Chiang, Yuan and Elena, Alin M. and Kov{\'a}cs, D{\'a}vid P. and Riebesell, Janosh and Advincula, Xavier R. and Asta, Mark and Baldwin, William J. and Bernstein, Noam and Bhowmik, Arghya and Blau, Samuel M. and C{\u a}rare, Vlad and Darby, James P. and De, Sandip and Della Pia, Flaviano and Deringer, Volker L. and Elijo{\v s}ius, Rokas and {El-Machachi}, Zakariya and Fako, Edvin and Ferrari, Andrea C. and {Genreith-Schriever}, Annalena and George, Janine and Goodall, Rhys E. A. and Grey, Clare P. and Han, Shuang and Handley, Will and Heenen, Hendrik H. and Hermansson, Kersti and Holm, Christian and Jaafar, Jad and Hofmann, Stephan and Jakob, Konstantin S. and Jung, Hyunwook and Kapil, Venkat and Kaplan, Aaron D. and Karimitari, Nima and Kroupa, Namu and Kullgren, Jolla and Kuner, Matthew C. and Kuryla, Domantas and Liepuoniute, Guoda and Margraf, Johannes T. and Magd{\u a}u, Ioan-Bogdan and Michaelides, Angelos and Moore, J. Harry and Naik, Aakash A. and Niblett, Samuel P. and Norwood, Sam Walton and O'Neill, Niamh and Ortner, Christoph and Persson, Kristin A. and Reuter, Karsten and Rosen, Andrew S. and Schaaf, Lars L. and Schran, Christoph and Sivonxay, Eric and Stenczel, Tam{\'a}s K. and Svahn, Viktor and Sutton, Christopher and {van der Oord}, Cas and {Varga-Umbrich}, Eszter and Vegge, Tejs and Vondr{\'a}k, Martin and Wang, Yangshuai and Witt, William C. and Zills, Fabian and Cs{\'a}nyi, G{\'a}bor},
  year = {2023},
  number = {arXiv:2401.00096},
  eprint = {2401.00096},
  primaryclass = {cond-mat, physics:physics},
  publisher = {{arXiv}},
  doi = {10.48550/arXiv.2401.00096},
  urldate = {2024-01-17},
  archiveprefix = {arxiv}
}

MACE itself is described in:

@misc{batatiaMACEHigherOrder2022,
  title = {{{MACE}}: {{Higher Order Equivariant Message Passing Neural Networks}} for {{Fast}} and {{Accurate Force Fields}}},
  shorttitle = {{{MACE}}},
  author = {Batatia, Ilyes and Kov{\'a}cs, D{\'a}vid P{\'e}ter and Simm, Gregor N. C. and Ortner, Christoph and Cs{\'a}nyi, G{\'a}bor},
  year = {2022},
  number = {arXiv:2206.07697},
  eprint = {2206.07697},
  primaryclass = {cond-mat, physics:physics, stat},
  publisher = {{arXiv}},
  urldate = {2022-06-19},
  archiveprefix = {arxiv},
  langid = {english}
}
