
Install PyTorch distributions with computation backend auto-detection

Project description

light-the-torch

[Badges: license, project status (WIP), black, mypy, lint status via GitHub Actions, test status via GitHub Actions, test coverage via codecov.io]

With each release of a PyTorch distribution (torch, torchvision, torchaudio, torchtext), wheels are published for combinations of different Python versions, platforms, and computation backends (CPU, CUDA). Unfortunately, differentiation based on the computation backend is not supported by PEP 440. As a workaround, the computation backend is added as a local version specifier. For example:

torch==1.5.1+<computation backend>
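
In an installed environment this local specifier shows up in the reported version. As a quick, illustrative check (the version printed below is only an example and depends on the installed wheel):

$ python -c "import torch; print(torch.__version__)"
1.5.1+cu101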

Due to this restriction, only the wheels of the latest CUDA release are uploaded to PyPI and are thus easily installable with pip. For other CUDA versions, or for an installation without CUDA support, one has to resort to manual version specification:

pip install -f https://download.pytorch.org/whl/torch_stable.html torch==1.5.1+<computation backend>

This is especially frustrating if one wants to install packages that depend on one or several PyTorch distributions: for each package the required PyTorch distributions have to be manually tracked down, resolved, and installed before the other requirements can be installed.

light-the-torch offers a small CLI based on pip to install the PyTorch distributions from the stable releases. Similar to the Python version and platform, the computation backend is auto-detected from the available hardware, preferring CUDA over CPU.
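
The detection logic itself lives inside light-the-torch. Purely as a hypothetical sketch of the idea, and not the tool's actual implementation, one could probe for a working NVIDIA driver and fall back to CPU:

# hypothetical sketch, not light-the-torch's actual detection logic
if nvidia-smi > /dev/null 2>&1; then
    echo "CUDA backend available, e.g. cu101"
else
    echo "no working NVIDIA driver, falling back to cpu"
fi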

Installation

The latest published version can be installed with

pip install light-the-torch

The latest, potentially unstable development version can be installed with

pip install git+https://github.com/pmeier/light-the-torch
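
A successful installation can be verified with the -V / --version flag documented in the help output below:

$ ltt -V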

Usage

light-the-torch is invoked with its shorthand ltt:

$ ltt --help

usage: ltt [-h] [-V] [--computation-backend COMPUTATION_BACKEND]
           [--full-install] [--install-cmd INSTALL_CMD] [--no-install]
           [args [args ...]]

Install PyTorch distributions from the stable releases. The computation
backend is autodetected from the available hardware preferring CUDA over CPU.

positional arguments:
  args                  arguments passed to pip install. Required PyTorch
                        distributions are extracted and installed. Optional
                        arguments for pip install have to be separated by '--'
                        (default: None)

optional arguments:
  -h, --help            show this help message and exit
  -V, --version         show version and exit (default: False)
  --computation-backend COMPUTATION_BACKEND
                        pin computation backend, e.g. 'cpu' or 'cu102'
                        (default: None)
  --full-install        install remaining requirements after PyTorch
                        distributions are installed (default: False)
  --install-cmd INSTALL_CMD
                        installation command. '{links}' is substituted for the
                        links. If present, '{opts}' is substituted for most
                        additional pip install options. Exceptions are -e /
                        --editable <path/url> and -r / --requirement <file>
                        (default: pip install {opts} {links})
  --no-install          print wheel links instead of installing (default:
                        False)

Example 1

ltt can be used to install PyTorch distributions without worrying about the computation backend:

$ ltt torch torchvision
[...]
Successfully installed future-0.18.2 numpy-1.19.0 pillow-7.2.0 torch-1.5.1+cu101 torchvision-0.6.1+cu101

Example 2

ltt extracts the required PyTorch distributions from the positional arguments:

$ ltt kornia
[...]
Successfully installed torch-1.5.0+cu101

Example 3

The --full-install option can be used as a replacement for pip install:

$ ltt --full-install kornia
[...]
Successfully installed future-0.18.2 numpy-1.19.0 torch-1.5.0+cu101
[...]
Successfully installed kornia-0.3.1

Example 4

The --no-install option can be used to pipe or redirect the PyTorch wheel links. For example, to generate a requirements.txt file:

$ ltt --no-install torchaudio > requirements.txt
$ cat requirements.txt
https://download.pytorch.org/whl/cu101/torch-1.5.1%2Bcu101-cp36-cp36m-linux_x86_64.whl
https://download.pytorch.org/whl/torchaudio-0.5.1-cp36-cp36m-linux_x86_64.whl
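
The generated file can then be handed to plain pip, which accepts direct wheel links in requirements files:

$ pip install -r requirements.txt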

Example 5

The --computation-backend option, as well as the --platform and --python-version options from pip install, can be used to disable the auto-detection:

$ ltt \
  --no-install \
  --computation-backend cu92 \
  -- \
  --python-version 37 \
  --platform win_amd64 \
  torchtext
https://download.pytorch.org/whl/cu92/torch-1.5.1%2Bcu92-cp37-cp37m-win_amd64.whl
https://download.pytorch.org/whl/torchtext-0.6.0-py3-none-any.whl
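
The --install-cmd option from the help output above fits into the same workflow. As an illustrative sketch based on the documented {opts} and {links} placeholders, rather than a captured session, the wheel links could for example be routed through a user-level pip install:

$ ltt --install-cmd "pip install --user {opts} {links}" torch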



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

light_the_torch-0.1.0.tar.gz (14.5 kB, Source)

Built Distribution

light_the_torch-0.1.0-py3-none-any.whl (13.4 kB, Python 3)

File details

Details for the file light_the_torch-0.1.0.tar.gz.

File metadata

  • Download URL: light_the_torch-0.1.0.tar.gz
  • Upload date:
  • Size: 14.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.1.0 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.6.10

File hashes

Hashes for light_the_torch-0.1.0.tar.gz

  • SHA256: 1a9c333851b6fc08eb97784311c2bc7547c93c280f14b406d646ec9d42c04f84
  • MD5: 83fe950ed717d53bab9f0810c0340c9c
  • BLAKE2b-256: 878522db05e6afbbee30549632dc4664c64aa62ff422646d9b3e2ea2095d86d9

See more details on using hashes here.

File details

Details for the file light_the_torch-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: light_the_torch-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 13.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.1.0 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.6.10

File hashes

Hashes for light_the_torch-0.1.0-py3-none-any.whl

  • SHA256: 29d5eee3932261b3e8f12875682d6cfd51079597fb9e899dd2153110d2f4f69b
  • MD5: 7564276a08bbb60045c0d85497cf6881
  • BLAKE2b-256: 30581baf1300bf39e5ff04f2e894ae5d6d7394facbc82bf87a5e010009c19b19

See more details on using hashes here.
