
Install PyTorch distributions with computation backend auto-detection


light-the-torch


With each release of a PyTorch distribution (torch, torchvision, torchaudio, torchtext), wheels are published for combinations of different Python versions, platforms, and computation backends (CPU, CUDA). Unfortunately, differentiation based on the computation backend is not supported by PEP 440. As a workaround, the computation backend is appended as a local version specifier. For example:

torch==1.5.1+<computation backend>
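
For instance, the CPU-only and the CUDA 10.1 wheels of torch 1.5.1 differ only in this local specifier:

torch==1.5.1+cpu
torch==1.5.1+cu101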

Due to this restriction, only the wheels built against the most recent CUDA release are uploaded to PyPI and are thus easily installable with pip. For other CUDA versions, or for an installation without CUDA support, one has to resort to manual version specification:

pip install -f https://download.pytorch.org/whl/torch_stable.html torch==1.5.1+<computation backend>
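
For instance, a CPU-only installation of torch 1.5.1 along these lines would look like:

pip install -f https://download.pytorch.org/whl/torch_stable.html torch==1.5.1+cpu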

This is especially frustrating if one wants to install packages that depend on one or several PyTorch distributions: for each package, the required PyTorch distributions have to be tracked down, resolved, and installed manually before the remaining requirements can be installed.

light-the-torch offers a small, pip-based CLI to install PyTorch distributions from the stable releases. Just like the Python version and the platform, the computation backend is auto-detected from the available hardware, preferring CUDA over CPU.

Installation

The latest published version can be installed with

pip install light-the-torch

The latest, potentially unstable development version can be installed with

pip install git+https://github.com/pmeier/light-the-torch

Usage

light-the-torch is invoked through its shorthand ltt:

$ ltt --help

usage: ltt [-h] [-V] [--computation-backend COMPUTATION_BACKEND]
           [--full-install] [--install-cmd INSTALL_CMD] [--no-install]
           [args [args ...]]

Install PyTorch distributions from the stable releases. The computation
backend is autodetected from the available hardware preferring CUDA over CPU.

positional arguments:
  args                  arguments passed to pip install. Required PyTorch
                        distributions are extracted and installed. Optional
                        arguments for pip install have to be separated by '--'
                        (default: None)

optional arguments:
  -h, --help            show this help message and exit
  -V, --version         show version and exit (default: False)
  --computation-backend COMPUTATION_BACKEND
                        pin computation backend, e.g. 'cpu' or 'cu102'
                        (default: None)
  --full-install        install remaining requirements after PyTorch
                        distributions are installed (default: False)
  --install-cmd INSTALL_CMD
                        installation command. '{links}' is substituted for the
                        links. If present, '{opts}' is substituted for most
                        additional pip install options. Exceptions are -e /
                        --editable <path/url> and -r / --requirement <file>
                        (default: pip install {opts} {links})
  --no-install          print wheel links instead of installing (default:
                        False)

Example 1

ltt can be used to install PyTorch distributions without worrying about the computation backend:

$ ltt torch torchvision
[...]
Successfully installed future-0.18.2 numpy-1.19.0 pillow-7.2.0 torch-1.5.1+cu101 torchvision-0.6.1+cu101

Example 2

ltt extracts the required PyTorch distributions from the positional arguments and installs only those:

$ ltt kornia
[...]
Successfully installed torch-1.5.0+cu101

Example 3

The --full-install option can be used as a replacement for pip install:

$ ltt --full-install kornia
[...]
Successfully installed future-0.18.2 numpy-1.19.0 torch-1.5.0+cu101
[...]
Successfully installed kornia-0.3.1
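
Since the positional and optional arguments are forwarded to pip install, a requirements file should work as well. A minimal sketch, assuming a requirements.txt that lists kornia (note that pip options have to follow the '--' separator):

$ ltt --full-install -- -r requirements.txt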

Example 4

The --no-install option can be used to pipe or redirect the PyTorch wheel links, for example to generate a requirements.txt file:

$ ltt --no-install torchaudio > requirements.txt
$ cat requirements.txt
https://download.pytorch.org/whl/cu101/torch-1.5.1%2Bcu101-cp36-cp36m-linux_x86_64.whl
https://download.pytorch.org/whl/torchaudio-0.5.1-cp36-cp36m-linux_x86_64.whl
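
Because the file contains direct wheel links, it can later be installed with plain pip, for example on another machine with the same Python version, platform, and computation backend:

$ pip install -r requirements.txt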

Example 5

The --computation-backend option as well as the --platform and --python-version options of pip install can be used to override the auto-detection:

$ ltt \
  --no-install \
  --computation-backend cu92 \
  -- \
  --python-version 37 \
  --platform win_amd64 \
  torchtext
https://download.pytorch.org/whl/cu92/torch-1.5.1%2Bcu92-cp37-cp37m-win_amd64.whl
https://download.pytorch.org/whl/torchtext-0.6.0-py3-none-any.whl
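
The --install-cmd option replaces the default pip install {opts} {links} command with a custom one. As a sketch, assuming one wants to install into the user site-packages via pip's --user flag (the flag belongs to pip, not to ltt):

$ ltt --install-cmd "pip install --user {opts} {links}" torch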

