MLIP plugins for Gaussian 16 External (UMA, ORB, MACE, AIMNet2)
g16-mlips
MLIP (Machine Learning Interatomic Potential) plugins for Gaussian 16 External interface.
Four model families are currently supported:

- UMA (fairchem) — default model: `uma-s-1p1`
- ORB (orb-models) — default model: `orb_v3_conservative_omol`
- MACE (mace) — default model: `MACE-OMOL-0`
- AIMNet2 (aimnetcentral) — default model: `aimnet2`
All backends provide energy, gradient, and analytical Hessian for Gaussian 16.
An optional implicit-solvent correction (xTB) is also available via --solvent.
The model server starts automatically and stays resident, so repeated calls during optimization are fast.
Requires Python 3.9 or later.
If you use ORCA, see also: https://github.com/t-0hmura/orca-mlips
Quick Start (Default = UMA)
- Install PyTorch suitable for your CUDA environment.
pip install torch==2.8.0 --index-url https://download.pytorch.org/whl/cu129
- Install the package with the UMA extra. If you need ORB/MACE/AIMNet2, use `g16-mlips[orb]`, `g16-mlips[mace]`, or `g16-mlips[aimnet2]`.
pip install "g16-mlips[uma]"
- Log in to Hugging Face for UMA model access (not required for ORB/MACE/AIMNet2). The UMA model is hosted on the Hugging Face Hub, so you need to log in once (see https://github.com/facebookresearch/fairchem):
huggingface-cli login
- Use in a Gaussian input file (`nomicro` is required). If you use ORB/MACE/AIMNet2, use `external="orb"` / `external="mace"` / `external="aimnet2"`. For detailed Gaussian `External` usage, see https://gaussian.com/external/
%nprocshared=8
%mem=32GB
%chk=water_ext.chk
#p external="uma" opt(nomicro)
Water external UMA example
0 1
O 0.000000 0.000000 0.000000
H 0.758602 0.000000 0.504284
H -0.758602 0.000000 0.504284
Other backends:
#p external="orb" opt(nomicro)
#p external="mace" opt(nomicro)
#p external="aimnet2" opt(nomicro)
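The workflow above can be scripted end to end. The following is a minimal sketch, not part of the package: it writes the water input shown above and launches Gaussian only if `g16` is found on `PATH` (the executable name and job setup are assumptions about your local installation).

```python
# Sketch: write the UMA water input from this README and run it with g16.
# Assumes the g16-mlips "uma" alias is installed in the active environment.
import shutil
import subprocess

gjf = """%nprocshared=8
%mem=32GB
%chk=water_ext.chk
#p external="uma" opt(nomicro)

Water external UMA example

0 1
O   0.000000   0.000000   0.000000
H   0.758602   0.000000   0.504284
H  -0.758602   0.000000   0.504284

"""

with open("water_ext.gjf", "w") as f:
    f.write(gjf)

# Only launch the job if Gaussian 16 is actually installed.
if shutil.which("g16"):
    subprocess.run(["g16", "water_ext.gjf"], check=True)
else:
    print("g16 not found on PATH; input written to water_ext.gjf")
```

Swap `"uma"` in the route line for `"orb"`, `"mace"`, or `"aimnet2"` to drive the other backends.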
Important: For Gaussian `External` geometry optimization, always include `nomicro` in `opt(...)`. Without it, Gaussian uses micro-iterations that rely on an internal gradient routine, which is incompatible with the external interface.

ONIOM is supported with both `ActiveAtoms` and `AllAtoms`. With `AllAtoms`, Gaussian passes `IAn=0` MM point-charge rows to the plugin. By default these `IAn=0` rows are excluded from MLIP evaluation and returned as zero force/Hessian blocks.
For ONIOM point-charge embedding correction via xTB:
#p oniom(external=("uma --embedcharge",AllAtoms):amber=softfirst) opt(nomicro)
`--embedcharge` adds an xTB point-charge embedding correction using the ONIOM MM charges.
When `igrd=2`, it also returns MM point-charge force/Hessian terms from the embedding correction.
Analytical Hessian (optional)
Optimization and IRC can run without providing an initial Hessian — Gaussian builds one internally using estimated force constants. Providing an MLIP analytical Hessian via `freq` + `readfc` improves convergence, especially for TS searches.
Gaussian `freq` (with `external=...`) is the only job type that requests the plugin's analytical Hessian directly.
Frequency calculation
%nprocshared=8
%mem=32GB
%chk=cla_ext.chk
#p external="uma" freq
CLA freq UMA
0 1
...
Gaussian sends `igrd=2` and stores the result in the `.chk` file.
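As background, the External interface documented in `extern.txt` exchanges the force constants as the lower triangle of the 3N×3N matrix. The sketch below only illustrates that packing order (NumPy is my choice here; the fixed-width Fortran formatting of the actual output file is the plugin's concern and is left out):

```python
# Sketch: pack a symmetric 3N x 3N Hessian into lower-triangular order
# (H[0][0], H[1][0], H[1][1], H[2][0], ...) and rebuild it, as a way to
# see what "analytical Hessian" data the interface carries.
import numpy as np

def pack_lower_triangle(hess: np.ndarray) -> np.ndarray:
    """Return the 3N*(3N+1)/2 lower-triangular elements, row by row."""
    n = hess.shape[0]
    return np.concatenate([hess[i, : i + 1] for i in range(n)])

def unpack_lower_triangle(packed: np.ndarray, n: int) -> np.ndarray:
    """Rebuild the full symmetric matrix from its packed lower triangle."""
    full = np.zeros((n, n))
    k = 0
    for i in range(n):
        full[i, : i + 1] = packed[k : k + i + 1]
        k += i + 1
    return full + np.tril(full, -1).T  # mirror the strict lower triangle

# Round trip on a random symmetric "Hessian" for a 3-atom molecule (3N = 9).
rng = np.random.default_rng(0)
a = rng.normal(size=(9, 9))
hess = (a + a.T) / 2
packed = pack_lower_triangle(hess)
assert packed.size == 9 * 10 // 2  # 45 packed elements
assert np.allclose(unpack_lower_triangle(packed, 9), hess)
```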
Implicit Solvent Correction (xTB)
You can apply an implicit-solvent correction via xTB: install xTB and pass the `--solvent` option to `external`.
Install xTB in your conda environment (or build it from source):
conda install xtb
Use `--solvent <name>` in `external="..."` (examples: `water`, `thf`):
#p external="uma --solvent water" opt(nomicro)
#p external="uma --solvent thf" freq
For details, see SOLVENT_EFFECTS.md.
This implementation follows the solvent-correction approach described in: Zhang, C., Leforestier, B., Besnard, C., & Mazet, C. (2025). Pd-catalyzed regiodivergent arylation of cyclic allylboronates. Chemical Science, 16, 22656-22665. https://doi.org/10.1039/d5sc07577g
If citing this correction in a paper, you can use the following:
Implicit solvent effects were accounted for by integrating the ALPB [or CPCM-X] solvation model from the xtb package as an additional correction to UMA-generated energies, gradients, and Hessians.
Note: `--solvent-model cpcmx` (CPCM-X) requires xTB built from source with `-DWITH_CPCMX=ON`. The conda-forge `xtb` package does not include CPCM-X support. See SOLVENT_EFFECTS.md for build instructions.
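Conceptually, an "additional correction" of this kind adds the difference between an xTB calculation with implicit solvent and one in the gas phase on top of the MLIP result. The sketch below shows only that arithmetic; the function and the numbers are illustrative and are not this package's API:

```python
# Illustrative only: the solvent correction term is Delta = E_xTB(solvent)
# - E_xTB(gas), added to the MLIP energy. The same pattern applies to
# gradients and Hessians. All values here are made up (Hartree).

def solvent_corrected_energy(e_mlip: float,
                             e_xtb_solv: float,
                             e_xtb_gas: float) -> float:
    """E_total = E_MLIP + (E_xTB(solvent) - E_xTB(gas))."""
    return e_mlip + (e_xtb_solv - e_xtb_gas)

e = solvent_corrected_energy(-76.400000, -5.085000, -5.070000)
print(f"{e:.6f}")  # -76.415000
```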
Using the analytical Hessian in optimization jobs
To use the MLIP analytical Hessian in `opt`/`irc`, read the Hessian from an existing checkpoint using Gaussian's `%oldchk` + `readfc`.
%nprocshared=8
%mem=32GB
%chk=cla_ext.chk
%oldchk=cla_ext.chk
#p external="uma" opt(readfc,nomicro)
CLA opt UMA
0 1
...
`readfc` reads the force constants from `%oldchk`. This applies to `opt` and `irc` runs.
Note that `freq` is the only job type that requests the analytical Hessian (`igrd=2`) from the plugin; `opt` and `irc` themselves never request it directly.
Installing Model Families
pip install "g16-mlips[uma]" # UMA (default)
pip install "g16-mlips[orb]" # ORB
pip install "g16-mlips[mace]" # MACE
pip install "g16-mlips[orb,mace]" # ORB + MACE
pip install "g16-mlips[aimnet2]" # AIMNet2
pip install "g16-mlips[orb,mace,aimnet2]" # ORB + MACE + AIMNet2
pip install g16-mlips # core only
Note: UMA and MACE have a dependency conflict (`e3nn`). Use separate environments.
Local install:
git clone https://github.com/t-0hmura/g16-mlips.git
cd g16-mlips
pip install ".[uma]"
Model download notes:
- UMA: Hosted on the Hugging Face Hub. Run `huggingface-cli login` once.
- ORB / MACE / AIMNet2: Downloaded automatically on first use.
Upstream Model Sources
- UMA / FAIR-Chem: https://github.com/facebookresearch/fairchem
- ORB / orb-models: https://github.com/orbital-materials/orb-models
- MACE: https://github.com/ACEsuit/mace
- AIMNet2: https://github.com/isayevlab/aimnetcentral
Advanced Options
See OPTIONS.md for backend-specific tuning parameters.
For solvent correction options, see SOLVENT_EFFECTS.md.
Command aliases:
- Short: `uma`, `orb`, `mace`, `aimnet2`
- Prefixed: `g16-mlips-uma`, `g16-mlips-orb`, `g16-mlips-mace`, `g16-mlips-aimnet2`
Troubleshooting
- `external="uma"` runs the wrong plugin — use `external="g16-mlips-uma"` to avoid alias conflicts.
- `external="aimnet2"` runs the wrong plugin — use `external="g16-mlips-aimnet2"` to avoid alias conflicts.
- `uma` command not found — activate the conda environment where the package is installed.
- UMA model download fails (401/403) — run `huggingface-cli login`. Some models require access approval on Hugging Face.
- Works interactively but fails in PBS jobs — use the absolute path from `which uma` in the Gaussian input.
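For the PBS case, one way to compose the absolute-path form of the route line programmatically (the plugin name is whichever alias you installed; this helper is illustrative, not part of the package):

```python
# Resolve the plugin's absolute path so batch jobs don't depend on PATH.
# "uma" is this package's short alias; swap in "g16-mlips-uma" etc. as needed.
import shutil

def external_route(plugin: str = "uma") -> str:
    """Build a route line with the plugin's absolute path baked in."""
    path = shutil.which(plugin)
    if path is None:
        raise FileNotFoundError(
            f"{plugin} not found; activate the environment it is installed in")
    return f'#p external="{path}" opt(nomicro)'

# print(external_route())  # route line with an absolute plugin path
```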
Citation
If you use this package, please cite:
@software{ohmura2026g16mlips,
author = {Ohmura, Takuto},
title = {g16-mlips},
year = {2026},
month = {2},
version = {1.1.0},
url = {https://github.com/t-0hmura/g16-mlips},
license = {MIT},
doi = {10.5281/zenodo.18717988}
}
References
- Gaussian External interface (official): https://gaussian.com/external/
- Gaussian External docs: `$g16root/g16/doc/extern.txt`, `$g16root/g16/doc/extgau`