
TorchMD-NET

TorchMD-NET provides state-of-the-art neural network potentials (NNPs) and a mechanism to train them. It offers efficient and fast implementations of several NNPs, and it is integrated into GPU-accelerated molecular dynamics codes such as ACEMD, OpenMM and TorchMD. TorchMD-NET exposes its NNPs as PyTorch modules.

Documentation

Documentation is available at https://torchmd-net.readthedocs.io

Available architectures

TorchMD-NET ships several NNP architectures, including TensorNet, the Equivariant Transformer (ET) and a graph network (GN) (see the citations below); the documentation lists all available models.

Installation

TorchMD-Net is available as a pip-installable wheel as well as on conda-forge.

TorchMD-Net provides builds for CPU-only, CUDA 11.8 and CUDA 12.4. CPU versions are provided for reference only, as their performance is extremely limited. Depending on which variant you wish to install, use one of the following commands:

# The following will install the CUDA 12.4 version by default
pip install torchmd-net 
# The following will install the CUDA 11.8 version
pip install torchmd-net --extra-index-url https://download.pytorch.org/whl/cu118 --extra-index-url https://us-central1-python.pkg.dev/pypi-packages-455608/cu118/simple
# The following will install the CUDA 12.4 version
pip install torchmd-net --extra-index-url https://download.pytorch.org/whl/cu124 --extra-index-url https://us-central1-python.pkg.dev/pypi-packages-455608/cu124/simple
# The following will install the CPU only version (not recommended)
pip install torchmd-net --extra-index-url https://download.pytorch.org/whl/cpu --extra-index-url https://us-central1-python.pkg.dev/pypi-packages-455608/cpu/simple   

Alternatively, it can be installed with conda or mamba using one of the following commands. We recommend Miniforge instead of Anaconda.

mamba install torchmd-net cuda-version=11.8
mamba install torchmd-net cuda-version=12.4

Install from source

TorchMD-Net is installed using pip, but you will need to install some dependencies beforehand. Check this documentation page.

Usage

Training arguments can be specified either via a configuration YAML file or directly through command-line arguments. Several examples of architectural and training specifications for some models and datasets can be found in examples/. Note that if a parameter is present both in the YAML file and on the command line, the command-line version takes precedence. GPUs can be selected by setting the CUDA_VISIBLE_DEVICES environment variable; alternatively, the argument --ngpus selects the number of GPUs to train on (-1, the default, uses all available GPUs or the ones specified in CUDA_VISIBLE_DEVICES). Keep in mind that the GPU ID reported by nvidia-smi might not match the one CUDA_VISIBLE_DEVICES uses.
For example, to train the Equivariant Transformer on the QM9 dataset with the architectural and training hyperparameters described in the paper, one can run:

mkdir output
CUDA_VISIBLE_DEVICES=0 torchmd-train --conf torchmd-net/examples/ET-QM9.yaml --log-dir output/

Run torchmd-train --help to see all available options and their descriptions.

Pretrained models

See here for instructions on how to load pretrained models.

Creating a new dataset

If you want to train on custom data, first have a look at torchmdnet.datasets.Custom, which provides functionality for loading a NumPy dataset consisting of atom types and coordinates, with energies, forces or both as the labels. Alternatively, you can implement a custom class the torch-geometric way of implementing a dataset: derive from the Dataset or InMemoryDataset class and implement the necessary methods (more info here). The dataset must return torch-geometric Data objects containing at least the keys z (atom types) and pos (atomic coordinates), as well as y (label), neg_dy (negative derivative of the label w.r.t. atom coordinates) or both.
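As a hedged sketch of the NumPy layout described above, the following builds per-file arrays for atom types, coordinates, energies and forces. The file names and the commented-out glob parameter names (coordglob, embedglob, energyglob, forceglob) are assumptions for illustration; check the torchmdnet.datasets.Custom docstring for the actual interface.

```python
import os
import tempfile

import numpy as np

n_frames, n_atoms = 100, 5
tmp = tempfile.mkdtemp()

# Atom types: one integer per atom, constant across frames
types = np.random.randint(1, 10, size=n_atoms)
np.save(os.path.join(tmp, "sample_embed.npy"), types)

# Coordinates: (n_frames, n_atoms, 3)
coords = np.random.randn(n_frames, n_atoms, 3).astype(np.float32)
np.save(os.path.join(tmp, "sample_coord.npy"), coords)

# Labels: energies per frame and/or forces per atom per frame
energies = np.random.randn(n_frames, 1).astype(np.float32)
np.save(os.path.join(tmp, "sample_energy.npy"), energies)
forces = np.random.randn(n_frames, n_atoms, 3).astype(np.float32)
np.save(os.path.join(tmp, "sample_force.npy"), forces)

# The dataset would then be constructed with glob patterns, e.g. (names assumed):
# dataset = Custom(coordglob=f"{tmp}/*_coord.npy", embedglob=f"{tmp}/*_embed.npy",
#                  energyglob=f"{tmp}/*_energy.npy", forceglob=f"{tmp}/*_force.npy")
```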

Custom prior models

In addition to implementing a custom dataset class, it is also possible to add a custom prior model. This can be done by implementing a new prior model class in torchmdnet.priors and passing the argument --prior-model <PriorModelName>. As an example, have a look at torchmdnet.priors.Atomref.
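Conceptually, an Atomref-style prior shifts the network's prediction by a per-element reference energy summed over the atoms in the molecule. The NumPy illustration below shows only this idea, not the torchmdnet API; the reference values and the model output are placeholders.

```python
import numpy as np

# Hypothetical per-element reference energies, indexed by atomic number
atomref = np.zeros(100)
atomref[1] = -0.5   # H (illustrative value)
atomref[6] = -37.8  # C
atomref[8] = -75.0  # O

z = np.array([8, 1, 1])  # a water molecule: O, H, H
model_output = 0.123     # whatever the NNP predicts (placeholder)

# The prior shifts the prediction by the sum of the reference energies,
# so the network only has to learn the (much smaller) residual.
total_energy = model_output + atomref[z].sum()
```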

Multi-Node Training

In order to train models on multiple nodes, some environment variables have to be set to provide all necessary information to PyTorch Lightning. Below is an example bash script to start training on two machines with two GPUs each. The script has to be started once on each node. Once torchmd-train is running on all nodes, a network connection between the nodes is established using NCCL.

In addition to the environment variables the argument --num-nodes has to be specified with the number of nodes involved during training.

export NODE_RANK=0
export MASTER_ADDR=hostname1
export MASTER_PORT=12910

mkdir -p output
CUDA_VISIBLE_DEVICES=0,1 torchmd-train --conf torchmd-net/examples/ET-QM9.yaml --num-nodes 2 --log-dir output/
  • NODE_RANK : Integer indicating the node index. Must be 0 for the main node and incremented by one for each additional node.
  • MASTER_ADDR : Hostname or IP address of the main node. The same for all involved nodes.
  • MASTER_PORT : A free network port for communication between nodes. PyTorch Lightning suggests port 12910 as a default.
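The script above is for the main node (NODE_RANK=0). On the second node, the same script is launched with only NODE_RANK changed; a sketch (hostname1 and the port are the placeholders from the example above):

```shell
# Node 1 of 2: identical to the node-0 script except for NODE_RANK
export NODE_RANK=1              # 0 on the main node, 1 here
export MASTER_ADDR=hostname1    # always the main node's address, on every node
export MASTER_PORT=12910        # the same free port on every node

# The torchmd-train invocation (including --num-nodes 2) is identical
# to the one in the node-0 script.
```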

Known Limitations

  • Due to the way PyTorch Lightning calculates the number of required DDP processes, all nodes must use the same number of GPUs. Otherwise, training will not start or will crash.
  • We observe an approximately 50x decrease in performance when mixing nodes with different GPU architectures (tested with RTX 2080 Ti and RTX 3090).
  • Some CUDA systems might hang during multi-GPU parallel training. Try export NCCL_P2P_DISABLE=1, which disables direct peer-to-peer GPU communication.

Cite

If you use TorchMD-NET in your research, please cite the following papers:

Main reference

@misc{pelaez2024torchmdnet,
title={TorchMD-Net 2.0: Fast Neural Network Potentials for Molecular Simulations}, 
author={Raul P. Pelaez and Guillem Simeon and Raimondas Galvelis and Antonio Mirarchi and Peter Eastman and Stefan Doerr and Philipp Thölke and Thomas E. Markland and Gianni De Fabritiis},
year={2024},
eprint={2402.17660},
archivePrefix={arXiv},
primaryClass={cs.LG}
}

TensorNet

@inproceedings{simeon2023tensornet,
title={TensorNet: Cartesian Tensor Representations for Efficient Learning of Molecular Potentials},
author={Guillem Simeon and Gianni De Fabritiis},
booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
year={2023},
url={https://openreview.net/forum?id=BEHlPdBZ2e}
}

Equivariant Transformer

@inproceedings{tholke2021equivariant,
title={Equivariant Transformers for Neural Network based Molecular Potentials},
author={Philipp Th{\"o}lke and Gianni De Fabritiis},
booktitle={International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=zNHzqZ9wrRB}
}

Graph Network

@article{Majewski2023,
  title = {Machine learning coarse-grained potentials of protein thermodynamics},
  volume = {14},
  ISSN = {2041-1723},
  url = {http://dx.doi.org/10.1038/s41467-023-41343-1},
  DOI = {10.1038/s41467-023-41343-1},
  number = {1},
  journal = {Nature Communications},
  publisher = {Springer Science and Business Media LLC},
  author = {Majewski, Maciej and Pérez, Adrià and Th\"{o}lke, Philipp and Doerr, Stefan and Charron, Nicholas E. and Giorgino, Toni and Husic, Brooke E. and Clementi, Cecilia and Noé, Frank and De Fabritiis, Gianni},
  year = {2023},
  month = sep 
}

Developer guide

Implementing a new architecture

To implement a new architecture, follow these steps:
1. Create a new class in torchmdnet.models that inherits from torch.nn.Module. Follow TorchMD_ET as a template. This is a minimal implementation of a model:

from typing import Optional, Tuple

import torch.nn as nn
from torch import Tensor

class MyModule(nn.Module):
    def __init__(self, parameter1, parameter2):
        super(MyModule, self).__init__()
        # Define your model here
        self.layer1 = nn.Linear(10, 10)
        ...
        # Initialize your model parameters here
        self.reset_parameters()

    def reset_parameters(self):
        # Initialize your model parameters here
        nn.init.xavier_uniform_(self.layer1.weight)
        ...

    def forward(
        self,
        z: Tensor,  # Atomic numbers, shape (n_atoms, 1)
        pos: Tensor,  # Atomic positions, shape (n_atoms, 3)
        batch: Tensor,  # Batch vector, shape (n_atoms, 1). All atoms in the same molecule have the same value and are contiguous.
        q: Optional[Tensor] = None,  # Atomic charges, shape (n_atoms, 1)
        s: Optional[Tensor] = None,  # Atomic spins, shape (n_atoms, 1)
    ) -> Tuple[Tensor, Tensor, Tensor, Tensor, Tensor]:
        # Define your forward pass here
        scalar_features = ...
        vector_features = ...
        # Return the scalar and vector features, as well as the atomic numbers, positions and batch vector
        return scalar_features, vector_features, z, pos, batch

2. Add the model to the __all__ list in torchmdnet.models.__init__.py. This will make the tests pick your model up.
3. Tell models.model.create_model how to initialize your module by adding a new entry, for instance:

    elif args["model"] == "mymodule":
       from torchmdnet.models.torchmd_mymodule import MyModule
       is_equivariant = False # Set to True if your model is equivariant
       representation_model = MyModule(
           parameter1=args["parameter1"],
           parameter2=args["parameter2"],
           **shared_args, # Arguments typically shared by all models
       )

4. Add any new parameters required to initialize your module to scripts.train.get_args. For instance:

  parser.add_argument('--parameter1', type=int, default=32, help='Parameter1 required by MyModule')
  ...

5. Add an example configuration file to torchmd-net/examples that uses your model.
6. Make tests use your configuration file by adding a case to tests.utils.load_example_args. For instance:

if model_name == "mymodule":
    config_file = join(dirname(dirname(__file__)), "examples", "MyModule-QM9.yaml")

At this point, if your module is missing some feature, the tests will let you know and you can add it. If you add a new feature to the package, please add a test for it.

Code style

We use black. Please run black on your modified files before committing.

Testing

To run the tests, install the package and run pytest in the root directory of the repository. Tests are a good source of knowledge on how to use the different components of the package.
