
SHARK Turbine Machine Learning Deployment Tools

Project description

IREE Turbine

Turbine is IREE's frontend for PyTorch.

Turbine provides a collection of tools:

  • AOT Export: For compiling one or more nn.Modules into deployment-ready artifacts. This operates via both a simple one-shot export API (already upstreamed to torch-mlir) for simple models and an underlying advanced API for complicated models and for accessing the full features of the runtime.
  • Eager Execution: A torch.compile backend is provided and a Turbine Tensor/Device is available for more native, interactive use within a PyTorch session.
  • Custom Ops: Integration for defining custom PyTorch ops and implementing them in terms of IREE's backend IR or a Pythonic kernel language.

Contact Us

Turbine is under active development. Feel free to reach out on one of IREE's communication channels (specifically, we monitor the #pytorch channel on the IREE Discord server).

Quick Start for Users

  1. Install release packages:
pip install iree-turbine
# Or for an editable install, see the instructions under "Developers" below.

The above installs some unnecessary CUDA/cuDNN packages for CPU-only use. To avoid this, install the CPU variant of PyTorch first:

pip install -r pytorch-cpu-requirements.txt
pip install iree-turbine

(or follow the "Developers" instructions below for installing from head/nightly)

  2. Try one of the samples:

Generally, we use Turbine to produce valid, dynamic-shaped Torch IR (from the torch-mlir torch dialect, with various approaches to handling globals). Depending on the use case and the status of the compiler, these should be compilable via IREE with --iree-input-type=torch for end-to-end execution. Dynamic shape support in torch-mlir is a work in progress, so not everything works at head with release binaries at present.
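As a concrete example of the end-to-end flow described above, the commands below compile exported Torch-dialect IR with IREE and then invoke it. The file name model.mlir, the llvm-cpu backend, and the function name and input shape passed to iree-run-module are all assumptions for illustration:

```shell
# Compile Torch-dialect IR (as produced by Turbine) to an IREE module.
iree-compile model.mlir \
  --iree-input-type=torch \
  --iree-hal-target-backends=llvm-cpu \
  -o model.vmfb

# Invoke the compiled module (function name and input are assumptions).
iree-run-module --module=model.vmfb --function=main --input="1x8xf32=0"
```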

Developers

Use this as a guide to get started developing the project using pinned, pre-release dependencies. You are welcome to deviate as you see fit, but these canonical directions mirror what the CI does.

Setup a venv

We recommend setting up a virtual environment (venv). The project is configured to ignore .venv directories, and editors like VSCode pick them up by default.

python -m venv --prompt iree-turbine .venv
source .venv/bin/activate

Install PyTorch for Your System

If no explicit action is taken, the default PyTorch version will be installed. This will give you a current CUDA-enabled build. To use a different variant, install it explicitly first:

CPU:

pip install -r pytorch-cpu-requirements.txt

ROCM:

pip install -r pytorch-rocm-requirements.txt

Install Development Packages

# Install editable local projects.
pip install -r requirements.txt -e .

Running Tests

pytest .

Optional: Pre-commits and developer settings

This project is set up to use the pre-commit tooling. To install it in your local repo, run: pre-commit install. From then on, hooks will run when you make commits locally. See https://pre-commit.com/ for details.

Using a development compiler

If doing native development of the compiler, it can be useful to switch to source builds for iree-compiler and iree-runtime.

In order to do this, check out IREE and follow the instructions to build from source, making sure to specify additional options for the Python bindings:

-DIREE_BUILD_PYTHON_BINDINGS=ON -DPython3_EXECUTABLE="$(which python)"
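Put together, a full configure-and-build invocation might look like the sketch below; the source/build directory paths and the Ninja generator are assumptions, and IREE's own building-from-source guide remains the authoritative reference:

```shell
# Hypothetical layout: IREE checked out at ../iree, building into ../iree-build.
cmake -G Ninja -S ../iree -B ../iree-build \
  -DCMAKE_BUILD_TYPE=RelWithDebInfo \
  -DIREE_BUILD_PYTHON_BINDINGS=ON \
  -DPython3_EXECUTABLE="$(which python)"
cmake --build ../iree-build
```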

Configuring Python

Uninstall existing packages:

pip uninstall iree-compiler
pip uninstall iree-runtime

Copy the .env file from iree/ into this source directory to get IDE support, and add it to your path for use from your shell:

source .env && export PYTHONPATH
