Fully Differentiable Extended Tight-Binding

- Combining semi-empirical quantum chemistry with machine learning in PyTorch -

The xTB methods (GFNn-xTB) are a series of semi-empirical quantum chemical methods that provide a good balance between accuracy and computational cost.

With dxtb, we provide a re-implementation of the xTB methods in PyTorch, which allows for automatic differentiation and seamless integration into machine learning frameworks.

NOTE: If you encounter any bugs or have questions on how to use dxtb, feel free to open an issue.

Installation

pip

dxtb can easily be installed with pip.

pip install dxtb[libcint]

Installing the libcint interface is highly recommended, as it is significantly faster than the pure PyTorch implementation and provides access to higher-order multipole integrals and their derivatives. However, the interface is currently only available on Linux.
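
A quick way to verify the interface after installation is to try importing it. The snippet below is only a minimal sketch; the module name tad_libcint is an assumption inferred from the package name tad-libcint.

# Sketch: check whether the libcint integral backend is importable.
# NOTE: the module name "tad_libcint" is assumed from the package name
# "tad-libcint"; without it, only the pure PyTorch integrals are available.
try:
    import tad_libcint  # noqa: F401
    print("libcint interface available")
except ImportError:
    print("libcint interface not found; only pure PyTorch integrals available")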

conda

dxtb is also available on conda from the conda-forge channel.

mamba install dxtb

Don't forget to install the libcint interface (not available on conda) via pip install tad-libcint.

For Windows, dxtb is not available via conda, because PyTorch itself is not available from the conda-forge channel.

Other

For more options, see the installation guide in the documentation.

Example

The following example demonstrates how to compute the energy and forces using GFN1-xTB.

import torch
import dxtb

dd = {"dtype": torch.double, "device": torch.device("cpu")}

# LiH: atomic numbers and Cartesian coordinates (in Bohr)
numbers = torch.tensor([3, 1], device=dd["device"])
positions = torch.tensor([[0.0, 0.0, 0.0], [0.0, 0.0, 1.5]], **dd)

# instantiate a calculator
calc = dxtb.calculators.GFN1Calculator(numbers, **dd)

# compute the energy
pos = positions.clone().requires_grad_(True)
energy = calc.get_energy(pos)

# obtain gradient (dE/dR) via autograd
(g,) = torch.autograd.grad(energy, pos)

# Alternatively, the forces can be requested directly from the calculator.
# (Reset the calculator first when reusing identical inputs, as previous
# results are cached.)
calc.reset()
pos = positions.clone().requires_grad_(True)
forces = calc.get_forces(pos)

assert torch.equal(forces, -g)
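
Since the energy is an ordinary PyTorch tensor, higher derivatives follow from autograd as well. The continuation below sketches a nuclear Hessian via double backward; it relies only on the get_energy call shown above and assumes that second derivatives propagate through the whole calculation.

# Continuation (sketch): nuclear Hessian via double backward.
calc.reset()
pos = positions.clone().requires_grad_(True)
energy = calc.get_energy(pos)

# keep the graph so the gradient itself can be differentiated again
(g,) = torch.autograd.grad(energy, pos, create_graph=True)

# assemble the Hessian (d^2 E / dR^2) row by row from the gradient
rows = []
for gi in g.flatten():
    (row,) = torch.autograd.grad(gi, pos, retain_graph=True)
    rows.append(row.flatten())
hessian = torch.stack(rows)  # shape (3N, 3N), here (6, 6)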

All quantities are in atomic units.
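
To convert to other units, multiply by the usual factors. This is plain arithmetic with standard CODATA constants, not a dxtb feature, and continues the example above.

# Convert results from atomic units (sketch with standard conversion factors).
HARTREE_TO_EV = 27.211386245988    # eV per Hartree
BOHR_TO_ANGSTROM = 0.529177210903  # Angstrom per Bohr

energy_ev = energy.detach() * HARTREE_TO_EV
forces_ev_per_ang = forces.detach() * HARTREE_TO_EV / BOHR_TO_ANGSTROM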

For more examples and details, check out the documentation.

Compatibility

PyTorch \ Python    3.8    3.9    3.10   3.11   3.12
1.11.0              ✓      ✓      ✗      ✗      ✗
1.12.1              ✓      ✓      ✓      ✗      ✗
1.13.1              ✓      ✓      ✓      ✓      ✗
2.0.1               ✓      ✓      ✓      ✓      ✗
2.1.2               ✓      ✓      ✓      ✓      ✗
2.2.2               ✓      ✓      ✓      ✓      ✓
2.3.1               ✓      ✓      ✓      ✓      ✓
2.4.1               ✓      ✓      ✓      ✓      ✓

Note that only the latest bug-fix release of each minor version is listed; all earlier bug-fix releases are supported as well. For example, although only version 2.2.2 is listed, versions 2.2.0 and 2.2.1 are also supported.

Restrictions for macOS and Windows:

On macOS and Windows, PyTorch < 2.0.0 only supports Python < 3.11.

The libcint interface is not available on macOS and Windows. Consequently, integral evaluation can be considerably slower. Moreover, higher-order multipole integrals (dipole, quadrupole, ...) are not implemented. While macOS support may be considered in the future, native Windows support is not possible, because the underlying libcint library does not work on Windows.

Citation

If you use dxtb in your research, please cite the following paper:

  • M. Friede, C. Hölzer, S. Ehlert, S. Grimme, dxtb -- An Efficient and Fully Differentiable Framework for Extended Tight-Binding, J. Chem. Phys., 2024, 161, 062501. (DOI)

The Supporting Information can be found here.

For details on the xTB methods, see

  • C. Bannwarth, E. Caldeweyher, S. Ehlert, A. Hansen, P. Pracht, J. Seibert, S. Spicher, S. Grimme, WIREs Comput. Mol. Sci., 2020, 11, e01493. (DOI)
  • C. Bannwarth, S. Ehlert, S. Grimme, J. Chem. Theory Comput., 2019, 15, 1652-1671. (DOI)
  • S. Grimme, C. Bannwarth, P. Shushkov, J. Chem. Theory Comput., 2017, 13, 1989-2009. (DOI)

Contributing

This is a volunteer open-source project and contributions are always welcome. Please take a moment to read the contributing guidelines.

License

This project is licensed under the Apache License, Version 2.0 (the "License"); you may not use this project's files except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

