
Install PyTorch distributions with computation backend auto-detection

Project description

light-the-torch


light-the-torch offers a small CLI (and tox plugin) based on pip to install PyTorch distributions from the stable releases. Similar to the platform and Python version, the computation backend is auto-detected from the available hardware, preferring CUDA over CPU.

Motivation

With each release of a PyTorch distribution (torch, torchvision, torchaudio, torchtext), wheels are published for combinations of different computation backends (CPU, CUDA), platforms, and Python versions. Unfortunately, differentiation based on the computation backend is not supported by PEP 440. As a workaround, the computation backend is added as a local version specifier. For example:

torch==1.5.1+cpu
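The part after the + is a PEP 440 local version identifier; a minimal stdlib sketch of how it splits off from the release version:

```python
# PEP 440 local version identifiers: the segment after '+' (here the
# computation backend) is ignored by ordinary version comparison, so it
# cannot be selected through a normal requirement specifier.
version = "1.5.1+cpu"
release, _, local = version.partition("+")
print(release)  # 1.5.1
print(local)    # cpu
```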

Due to this restriction, only the wheels of the latest CUDA release are uploaded to PyPI and are thus easily installable with pip. For other CUDA versions, or for an installation without CUDA support, one has to resort to manual version specification:

pip install -f https://download.pytorch.org/whl/torch_stable.html torch==1.5.1+cu101

This is especially frustrating if one wants to install packages that depend on one or several PyTorch distributions: for each package the required PyTorch distributions have to be manually tracked down, resolved, and installed before the other requirements can be installed.

light-the-torch was developed to overcome this.

Installation

The latest published version can be installed with

pip install light-the-torch

The latest, potentially unstable development version can be installed with

pip install git+https://github.com/pmeier/light-the-torch

Usage

CLI

The CLI of light-the-torch is invoked with its shorthand ltt:

$ ltt --help
usage: ltt [-h] [-V] {install,extract,find} ...

optional arguments:
  -h, --help            show this help message and exit
  -V, --version         show light-the-torch version and path and exit

subcommands:
  {install,extract,find}

ltt install

$ ltt install --help
usage: ltt install [-h] [--force-cpu] [--pytorch-only]
                   [--install-cmd INSTALL_CMD] [--verbose]
                   [args [args ...]]

Install PyTorch distributions from the stable releases. The computation
backend is auto-detected from the available hardware preferring CUDA over CPU.

positional arguments:
  args                  arguments of 'pip install'. Optional arguments have to
                        be separated by '--'

optional arguments:
  -h, --help            show this help message and exit
  --force-cpu           disable computation backend auto-detection and use CPU
                        instead
  --pytorch-only        install only PyTorch distributions
  --install-cmd INSTALL_CMD
                        installation command for the PyTorch distributions and
                        additional packages. Defaults to 'python -m pip
                        install {packages}'
  --verbose             print more output to STDOUT. For fine control use -v /
                        --verbose and -q / --quiet of the 'pip install'
                        options

ltt install is a drop-in replacement for pip install that removes the need to worry about the computation backend:

$ ltt install torch torchvision
[...]
Successfully installed future-0.18.2 numpy-1.19.0 pillow-7.2.0 torch-1.5.1+cu101 torchvision-0.6.1+cu101
[...]
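The auto-detection that picked cu101 above can be sketched roughly as follows. This is a hypothetical illustration, not light-the-torch's actual implementation; the probed command and the returned backend string are placeholders:

```python
# Hypothetical sketch of computation backend auto-detection: probe for
# an NVIDIA driver and fall back to CPU. NOT the actual implementation.
import shutil
import subprocess


def detect_computation_backend() -> str:
    if shutil.which("nvidia-smi") is None:
        return "cpu"
    try:
        # A real detector would parse the supported CUDA version from
        # the 'nvidia-smi' output; 'cu101' is just a placeholder.
        subprocess.run(["nvidia-smi"], check=True, capture_output=True)
        return "cu101"
    except subprocess.CalledProcessError:
        return "cpu"


print(detect_computation_backend())
```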

ltt install is also able to handle packages that depend on PyTorch distributions:

$ ltt install kornia
[...]
Successfully installed future-0.18.2 numpy-1.19.0 torch-1.5.0+cu101
[...]
Successfully installed kornia-0.3.1

ltt extract

$ ltt extract --help
usage: ltt extract [-h] [--verbose] [args [args ...]]

Extract required PyTorch distributions

positional arguments:
  args        arguments of 'pip install'. Optional arguments have to be
              separated by '--'

optional arguments:
  -h, --help  show this help message and exit
  --verbose   print more output to STDOUT. For fine control use -v / --verbose
              and -q / --quiet of the 'pip install' options

ltt extract extracts the required PyTorch distributions out of packages:

$ ltt extract kornia
torch==1.5.0

ltt find

$ ltt find --help
usage: ltt find [-h] [--computation-backend COMPUTATION_BACKEND]
                [--platform PLATFORM] [--python-version PYTHON_VERSION]
                [--verbose]
                [args [args ...]]

Find wheel links for the required PyTorch distributions

positional arguments:
  args                  arguments of 'pip install'. Optional arguments have to
                        be separated by '--'

optional arguments:
  -h, --help            show this help message and exit
  --computation-backend COMPUTATION_BACKEND
                        Only use wheels compatible with COMPUTATION_BACKEND,
                        for example 'cu102' or 'cpu'. Defaults to the
                        computation backend of the running system, preferring
                        CUDA over CPU.
  --platform PLATFORM   Only use wheels compatible with <platform>. Defaults
                        to the platform of the running system.
  --python-version PYTHON_VERSION
                        The Python interpreter version to use for wheel and
                        "Requires-Python" compatibility checks. Defaults to a
                        version derived from the running interpreter. The
                        version can be specified using up to three dot-
                        separated integers (e.g. "3" for 3.0.0, "3.7" for
                        3.7.0, or "3.7.3"). A major-minor version can also be
                        given as a string without dots (e.g. "37" for 3.7.0).
  --verbose             print more output to STDOUT. For fine control use -v /
                        --verbose and -q / --quiet of the 'pip install'
                        options

ltt find finds the links to the wheels of the required PyTorch distributions:

$ ltt find torchaudio > requirements.txt
$ cat requirements.txt
https://download.pytorch.org/whl/cu101/torch-1.5.1%2Bcu101-cp36-cp36m-linux_x86_64.whl
https://download.pytorch.org/whl/torchaudio-0.5.1-cp36-cp36m-linux_x86_64.whl
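The wheel links above encode the distribution's properties in their filenames (PEP 427: name-version-pythontag-abitag-platformtag.whl); a small stdlib sketch of pulling them apart:

```python
# Decompose a PEP 427 wheel filename into its tags. '%2B' is the
# URL-encoded '+' of the local version specifier.
from urllib.parse import unquote

link = "https://download.pytorch.org/whl/cu101/torch-1.5.1%2Bcu101-cp36-cp36m-linux_x86_64.whl"
stem = unquote(link.rsplit("/", 1)[-1])[: -len(".whl")]
name, version, python_tag, abi_tag, platform_tag = stem.split("-")
print(name, version)                      # torch 1.5.1+cu101
print(python_tag, abi_tag, platform_tag)  # cp36 cp36m linux_x86_64
```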

The --computation-backend, --platform, and --python-version options can be used to pin wheel properties instead of auto-detecting them:

$ ltt find \
  --computation-backend cu92 \
  --platform win_amd64 \
  --python-version 3.7 \
  torchtext
https://download.pytorch.org/whl/cu92/torch-1.5.1%2Bcu92-cp37-cp37m-win_amd64.whl
https://download.pytorch.org/whl/torchtext-0.6.0-py3-none-any.whl
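The --python-version normalization described in the help text above (up to three dot-separated integers, or a dotless major-minor such as "37") can be sketched with a hypothetical helper; the function name is an assumption, not light-the-torch's API:

```python
# Hypothetical sketch of the version normalization from the help text:
# "3" -> (3, 0, 0), "3.7" -> (3, 7, 0), "37" -> (3, 7, 0).
def normalize_python_version(spec: str) -> tuple:
    if "." not in spec and len(spec) > 1:
        # dotless major-minor, e.g. "37" means Python 3.7
        parts = [int(spec[0]), int(spec[1:])]
    else:
        parts = [int(part) for part in spec.split(".")]
    # pad with zeros up to three components
    parts += [0] * (3 - len(parts))
    return tuple(parts)


print(normalize_python_version("3"))    # (3, 0, 0)
print(normalize_python_version("3.7"))  # (3, 7, 0)
print(normalize_python_version("37"))   # (3, 7, 0)
```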

Python

light-the-torch exposes two functions that can be used from Python:

import light_the_torch as ltt
help(ltt.extract_dists)
Help on function extract_dists in module light_the_torch._pip.extract:

extract_dists(pip_install_args:List[str], verbose:bool=False) -> List[str]
    Extract direct or indirect required PyTorch distributions.

    Args:
        pip_install_args: Arguments passed to ``pip install`` that will be searched for
            required PyTorch distributions
        verbose: If ``True``, print additional information to STDOUT.

    Returns:
        Resolved required PyTorch distributions.
import light_the_torch as ltt
help(ltt.find_links)
Help on function find_links in module light_the_torch._pip.find:

find_links(pip_install_args:List[str], computation_backend:Union[str, light_the_torch.computation_backend.ComputationBackend, NoneType]=None, platform:Union[str, NoneType]=None, python_version:Union[str, NoneType]=None, verbose:bool=False) -> List[str]
    Find wheel links for direct or indirect PyTorch distributions with given
    properties.

    Args:
        pip_install_args: Arguments passed to ``pip install`` that will be searched for
            required PyTorch distributions
        computation_backend: Computation backend, for example ``"cpu"`` or ``"cu102"``.
            Defaults to the available hardware of the running system preferring CUDA
            over CPU.
        platform: Platform, for example ``"linux_x86_64"`` or ``"win_amd64"``. Defaults
            to the platform of the running system.
        python_version: Python version, for example ``"3"`` or ``"3.7"``. Defaults to
            the version of the running interpreter.
        verbose: If ``True``, print additional information to STDOUT.

    Returns:
        Wheel links with given properties for all required PyTorch distributions.

Project details


Download files


Source Distribution

light_the_torch-0.2.1.tar.gz (18.0 kB)

Uploaded Source

Built Distribution


light_the_torch-0.2.1-py3-none-any.whl (17.2 kB)

Uploaded Python 3

File details

Details for the file light_the_torch-0.2.1.tar.gz.

File metadata

  • Download URL: light_the_torch-0.2.1.tar.gz
  • Upload date:
  • Size: 18.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.6.12

File hashes

Hashes for light_the_torch-0.2.1.tar.gz

  • SHA256: ea248f3338b1fbc522feefba02d15976bc436b988e5335f55153af0f20fb4e8a
  • MD5: e70939a81a3b0d8e433a99c654938232
  • BLAKE2b-256: 9e3f705a5aa58680a21d01273ba893940b33cc1909f3766e81869183b0e9fc27


File details

Details for the file light_the_torch-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: light_the_torch-0.2.1-py3-none-any.whl
  • Upload date:
  • Size: 17.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.6.12

File hashes

Hashes for light_the_torch-0.2.1-py3-none-any.whl

  • SHA256: 57db2d7d8f42a80e0177dd1c7aad1bb01e2d7eca87422a4bd45b040b9629f3b1
  • MD5: d03453bd7e440c3f3f194eed6f815cc3
  • BLAKE2b-256: f74185d7f00f3cda4c54c7bda11720b66ea6a9a7e87144ce149a1a908cb66779

