Pipeline Parallelism for PyTorch
Project description
The PiPPy project stands for Pipeline Parallelism for PyTorch. It consists of a compiler and runtime stack for automated parallelism and scaling of PyTorch models. PiPPy partitions the code of the model in a pipelined fashion and enables multiple micro-batches to execute different parts of the model code concurrently. For details, please visit PiPPy's GitHub page.
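To illustrate what pipelined execution of micro-batches means, here is a minimal, single-process sketch in plain PyTorch of the idea PiPPy automates. It does not use the PiPPy API; the toy model, the two-way split point, and the micro-batch count are made up for illustration, and in a real pipeline the stages would run concurrently on different devices or ranks.

```python
import torch
import torch.nn as nn

# Toy model; in practice PiPPy partitions a user's model automatically.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

# Hypothetical two-way split of the model into pipeline stages.
stage0 = model[:2]
stage1 = model[2:]

batch = torch.randn(8, 16)
micro_batches = batch.chunk(4)  # 4 micro-batches of 2 samples each

# In a real pipeline, stage0 starts micro-batch i+1 while stage1 is still
# processing micro-batch i. Here we only show the per-micro-batch data flow
# and reassemble the full output.
outputs = [stage1(stage0(mb)) for mb in micro_batches]
result = torch.cat(outputs)
assert torch.allclose(result, model(batch), atol=1e-6)
```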
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
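The wheel listed below can also be installed directly from PyPI, e.g. `pip install torchpippy==0.1.1` (the distribution name torchpippy is taken from the wheel filename on this page).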
Source Distributions
No source distribution files are available for this release.
Built Distribution
torchpippy-0.1.1-py3-none-any.whl (381.8 kB)
File details
Details for the file torchpippy-0.1.1-py3-none-any.whl.
File metadata
- Download URL: torchpippy-0.1.1-py3-none-any.whl
- Upload date:
- Size: 381.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | c9e36552bb1858face1081bbd83c14a7a5a53fd642b69e7760db88e96cdc18f8
MD5 | 2e3104abc20bb22e57751be8446eb4b7
BLAKE2b-256 | 59fe1b379453301c248669fd9274e9ae5755abd63e32142f3d92456b87d5bda8