
Fully Differentiable Approach to Extended Tight Binding


Fully Differentiable Extended Tight-Binding

- Combining semi-empirical quantum chemistry with machine learning in PyTorch -



The xTB methods (GFNn-xTB) are a series of semi-empirical quantum chemical methods that provide a good balance between accuracy and computational cost.

With dxtb, we provide a re-implementation of the xTB methods in PyTorch, which allows for automatic differentiation and seamless integration into machine learning frameworks.

NOTE: If you encounter any bugs or have questions on how to use dxtb, feel free to open an issue.

Installation

pip

dxtb can easily be installed with pip.

pip install dxtb[libcint]

Installing the libcint interface is highly recommended, as it is significantly faster than the pure PyTorch implementation and provides access to higher-order multipole integrals and their derivatives (required for GFN2-xTB). However, the interface is currently only available on Linux.
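Whether the libcint backend is actually available can be probed at runtime. A minimal sketch, assuming the interface is importable under the same name as the pip package (`tad_libcint`):

```python
# Probe for the libcint integral backend. When it is missing, dxtb falls
# back to the pure PyTorch integral implementation. The import name
# `tad_libcint` is assumed to match the pip package of the same name.
try:
    import tad_libcint  # noqa: F401
    HAS_LIBCINT = True
except ImportError:
    HAS_LIBCINT = False

print("libcint interface available:", HAS_LIBCINT)
```

This makes it easy to skip GFN2-xTB tests or choose an integral driver on platforms (macOS, Windows) where the interface cannot be installed.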

conda

dxtb is also available on conda from the conda-forge channel.

mamba install dxtb

Don't forget to install the libcint interface (not available on conda) via pip install tad-libcint; it is required for GFN2-xTB.

On Windows, dxtb is not available via conda, because PyTorch itself is not available from the conda-forge channel on Windows.

Other

For more options, see the installation guide in the documentation.

Example

The following example demonstrates how to compute the energy and forces using GFN1-xTB.

import torch
import dxtb

dd = {"dtype": torch.double, "device": torch.device("cpu")}

# LiH
numbers = torch.tensor([3, 1], device=dd["device"])
positions = torch.tensor([[0.0, 0.0, 0.0], [0.0, 0.0, 1.5]], **dd)

# instantiate a calculator
calc = dxtb.calculators.GFN1Calculator(numbers, **dd)

# compute the energy
pos = positions.clone().requires_grad_(True)
energy = calc.get_energy(pos)

# obtain gradient (dE/dR) via autograd
(g,) = torch.autograd.grad(energy, pos)

# Alternatively, forces can directly be requested from the calculator.
# (Don't forget to manually reset the calculator when the inputs are identical.)
calc.reset()
pos = positions.clone().requires_grad_(True)
forces = calc.get_forces(pos)

assert torch.equal(forces, -g)

All quantities are in atomic units.
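To convert results to more familiar units, standard CODATA conversion factors can be applied. A short sketch (the factors below are stated here for illustration, not taken from dxtb; the energy value is made up):

```python
# CODATA 2018 conversion factors from atomic units.
BOHR_TO_ANGSTROM = 0.529177210903
HARTREE_TO_EV = 27.211386245988

bond_length_bohr = 1.5  # Li-H distance from the example above
print(round(bond_length_bohr * BOHR_TO_ANGSTROM, 4))  # 0.7938

energy_hartree = -0.5  # illustrative value, not a dxtb result
print(round(energy_hartree * HARTREE_TO_EV, 3))  # -13.606
```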

For more examples and details, check out the documentation.

Compatibility

PyTorch \ Python   3.8   3.9   3.10   3.11   3.12   3.13   3.14
1.11.0              ✓     ✓     ✗      ✗      ✗      ✗      ✗
1.12.1              ✓     ✓     ✓      ✗      ✗      ✗      ✗
1.13.1              ✓     ✓     ✓      ✓      ✗      ✗      ✗
2.0.1               ✓     ✓     ✓      ✓      ✗      ✗      ✗
2.1.2               ✓     ✓     ✓      ✓      ✗      ✗      ✗
2.2.2               ✓     ✓     ✓      ✓      ✓      ✗      ✗
2.3.1               ✓     ✓     ✓      ✓      ✓      ✗      ✗
2.4.1               ✓     ✓     ✓      ✓      ✓      ✗      ✗
2.5.1               ✗     ✓     ✓      ✓      ✓      ✗      ✗
2.6.0               ✗     ✓     ✓      ✓      ✓      ✓      ✗
2.7.1               ✗     ✓     ✓      ✓      ✓      ✓      ✗
2.8.0               ✗     ✓     ✓      ✓      ✓      ✓      ✗
2.9.1               ✗     ✗     ✓      ✓      ✓      ✓      ✓
2.10.0              ✗     ✗     ✓      ✓      ✓      ✓      ✓

Note that only the latest bug-fix release of each minor version is listed, but all preceding bug-fix releases are supported as well. For example, although only version 2.2.2 is listed, versions 2.2.0 and 2.2.1 are also supported.
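For a CI guard or an install-time sanity check, the support matrix above can be encoded in code. A sketch that hard-codes the table as (min Python, max Python) per PyTorch minor release; the values are transcribed from the table and are purely illustrative:

```python
# Supported Python range per PyTorch minor release, transcribed from the
# compatibility table above (minor versions only; bug-fix releases of a
# listed minor version are covered by the same entry).
SUPPORT = {
    (1, 11): ((3, 8), (3, 9)),
    (1, 12): ((3, 8), (3, 10)),
    (1, 13): ((3, 8), (3, 11)),
    (2, 0): ((3, 8), (3, 11)),
    (2, 1): ((3, 8), (3, 11)),
    (2, 2): ((3, 8), (3, 12)),
    (2, 3): ((3, 8), (3, 12)),
    (2, 4): ((3, 8), (3, 12)),
    (2, 5): ((3, 9), (3, 12)),
    (2, 6): ((3, 9), (3, 13)),
    (2, 7): ((3, 9), (3, 13)),
    (2, 8): ((3, 9), (3, 13)),
    (2, 9): ((3, 10), (3, 14)),
    (2, 10): ((3, 10), (3, 14)),
}

def is_supported(pytorch: tuple, python: tuple) -> bool:
    """Check a (PyTorch minor, Python minor) pair against the table."""
    lo, hi = SUPPORT[pytorch]
    return lo <= python <= hi

print(is_supported((2, 6), (3, 13)))  # True
print(is_supported((2, 5), (3, 8)))   # False
```

Tuple comparison gives the right ordering for (major, minor) version pairs without pulling in a version-parsing dependency.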

Restriction for macOS and Windows:

On macOS and Windows, PyTorch < 2.0.0 only supports Python < 3.11.

The libcint interface is not available for macOS and Windows. Correspondingly, the integral evaluation can be considerably slower. Moreover, higher-order multipole integrals (dipole, quadrupole, ...) are not implemented. While macOS support may be considered in the future, native Windows support is not possible, because the underlying libcint library does not work under Windows.

Citation

If you use dxtb in your research, please cite the following paper:

  • M. Friede, C. Hölzer, S. Ehlert, S. Grimme, dxtb -- An Efficient and Fully Differentiable Framework for Extended Tight-Binding, J. Chem. Phys., 2024, 161, 062501. (DOI)

The Supporting Information can be found here.

For details on the xTB methods, see

  • C. Bannwarth, E. Caldeweyher, S. Ehlert, A. Hansen, P. Pracht, J. Seibert, S. Spicher, S. Grimme, WIREs Comput. Mol. Sci., 2020, 11, e01493. (DOI)
  • C. Bannwarth, S. Ehlert, S. Grimme, J. Chem. Theory Comput., 2019, 15, 1652-1671. (DOI)
  • S. Grimme, C. Bannwarth, P. Shushkov, J. Chem. Theory Comput., 2017, 13, 1989-2009. (DOI)

Contributing

This is a volunteer open-source project and contributions are always welcome. Please take a moment to read the contributing guidelines.

License

This project is licensed under the Apache License, Version 2.0 (the "License"); you may not use this project's files except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
