
Install PyTorch distributions with computation backend auto-detection


light-the-torch


With each release of a PyTorch distribution (torch, torchvision, torchaudio, torchtext), wheels are published for combinations of different Python versions, platforms, and computation backends (CPU, CUDA). Unfortunately, differentiation based on the computation backend is not supported by PEP 440. As a workaround, the computation backend is added as a local version identifier. For example

torch==1.5.1+<computation backend>
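Such local version identifiers can be inspected with the packaging library (a dependency of pip); a minimal sketch:

```python
# Sketch: how a computation backend shows up as a PEP 440 local
# version identifier, inspected with the `packaging` library.
from packaging.version import Version

v = Version("1.5.1+cu101")
print(v.public)  # 1.5.1
print(v.local)   # cu101
```

Note that PEP 440 treats everything after the `+` as an opaque label, which is why pip cannot resolve the computation backend on its own.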

Due to this restriction, only the wheels of the latest CUDA release are uploaded to PyPI and are thus easily installable with pip. For other CUDA versions, or for an installation without CUDA support, one has to resort to manual version specification:

pip install -f https://download.pytorch.org/whl/torch_stable.html torch==1.5.1+<computation backend>

This is especially frustrating if one wants to install packages that depend on one or several PyTorch distributions: for each package the required PyTorch distributions have to be manually tracked down, resolved, and installed before the other requirements can be installed.

light-the-torch offers a small pip-based CLI to install PyTorch distributions from the stable releases. Similar to the Python version and platform, the computation backend is auto-detected from the available hardware, preferring CUDA over CPU.
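The detection logic can be pictured roughly as follows. This is a hypothetical sketch, not light-the-torch's actual implementation: it merely treats the presence of nvidia-smi on the PATH as a sign that CUDA is available and falls back to CPU otherwise.

```python
# Hypothetical sketch of computation backend auto-detection; the real
# detection in light-the-torch is more involved.
import shutil


def detect_computation_backend() -> str:
    # Prefer CUDA over CPU, mirroring the behavior described above.
    if shutil.which("nvidia-smi") is not None:
        # A real detector would query the driver for the CUDA version.
        return "cu102"
    return "cpu"


print(detect_computation_backend())
```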

Installation

The latest published version can be installed with

pip install light-the-torch

The latest, potentially unstable development version can be installed with

pip install git+https://github.com/pmeier/light-the-torch

Usage

light-the-torch is invoked with its shorthand ltt:

$ ltt --help

usage: ltt [-h] [-V] [--computation-backend COMPUTATION_BACKEND]
           [--full-install] [--install-cmd INSTALL_CMD] [--no-install]
           [args [args ...]]

Install PyTorch distributions from the stable releases. The computation
backend is autodetected from the available hardware preferring CUDA over CPU.

positional arguments:
  args                  arguments passed to pip install. Required PyTorch
                        distributions are extracted and installed. Optional
                        arguments for pip install have to be separated by '--'
                        (default: None)

optional arguments:
  -h, --help            show this help message and exit
  -V, --version         show version and exit (default: False)
  --computation-backend COMPUTATION_BACKEND
                        pin computation backend, e.g. 'cpu' or 'cu102'
                        (default: None)
  --full-install        install remaining requirements after PyTorch
                        distributions are installed (default: False)
  --install-cmd INSTALL_CMD
                        installation command. '{links}' is substituted for the
                        links. If present, '{opts}' is substituted for most
                        additional pip install options. Exceptions are -e /
                        --editable <path/url> and -r / --requirement <file>
                        (default: pip install {opts} {links})
  --no-install          print wheel links instead of installing (default:
                        False)
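The '{opts}'/'{links}' substitution performed for --install-cmd can be pictured as plain string formatting; a rough sketch (the actual option handling in ltt is more involved):

```python
# Rough sketch of the '{opts}'/'{links}' substitution behind
# --install-cmd; real option splitting in ltt is more involved.
install_cmd = "pip install {opts} {links}"

links = [
    "https://download.pytorch.org/whl/cu101/torch-1.5.1%2Bcu101-cp36-cp36m-linux_x86_64.whl",
]
opts = ["--no-cache-dir"]  # hypothetical extra pip install option

cmd = install_cmd.format(opts=" ".join(opts), links=" ".join(links))
print(cmd)
```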

Example 1

ltt can be used to install PyTorch distributions without worrying about the computation backend:

$ ltt torch torchvision
[...]
Successfully installed future-0.18.2 numpy-1.19.0 pillow-7.2.0 torch-1.5.1+cu101 torchvision-0.6.1+cu101

Example 2

ltt extracts the required PyTorch distributions from the positional arguments:

$ ltt kornia
[...]
Successfully installed torch-1.5.0+cu101

Example 3

The --full-install option can be used as a replacement for pip install:

$ ltt --full-install kornia
[...]
Successfully installed future-0.18.2 numpy-1.19.0 torch-1.5.0+cu101
[...]
Successfully installed kornia-0.3.1

Example 4

The --no-install option can be used to pipe or redirect the PyTorch wheel links, for example to generate a requirements.txt file:

$ ltt --no-install torchaudio > requirements.txt
$ cat requirements.txt
https://download.pytorch.org/whl/cu101/torch-1.5.1%2Bcu101-cp36-cp36m-linux_x86_64.whl
https://download.pytorch.org/whl/torchaudio-0.5.1-cp36-cp36m-linux_x86_64.whl

Example 5

The --computation-backend option, as well as the --platform and --python-version options from pip install, can be used to override the auto-detection:

$ ltt \
  --no-install \
  --computation-backend cu92 \
  -- \
  --python-version 37 \
  --platform win_amd64 \
  torchtext
https://download.pytorch.org/whl/cu92/torch-1.5.1%2Bcu92-cp37-cp37m-win_amd64.whl
https://download.pytorch.org/whl/torchtext-0.6.0-py3-none-any.whl
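The tags embedded in the printed wheel filenames follow the wheel naming convention from PEP 427; a small sketch that pulls them apart, assuming no build tag is present in the filename:

```python
# Sketch: split a wheel filename into its PEP 427 components,
# assuming the optional build tag is absent.
def parse_wheel_filename(filename: str) -> dict:
    # PEP 427: {distribution}-{version}(-{build})?-{python}-{abi}-{platform}.whl
    stem = filename[: -len(".whl")]
    distribution, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "distribution": distribution,
        "version": version,
        "python": python_tag,
        "abi": abi_tag,
        "platform": platform_tag,
    }


print(parse_wheel_filename("torch-1.5.1+cu92-cp37-cp37m-win_amd64.whl"))
```

The local version identifier stays attached to the version component, while the trailing tags encode the Python version, ABI, and platform that the options above pin.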

