Pipeline Parallelism for PyTorch
Project description
PiPPy stands for Pipeline Parallelism for PyTorch. It consists of a compiler and runtime stack for automated parallelism and scaling of PyTorch models. PiPPy partitions a model's code into pipeline stages and enables multiple micro-batches to execute different stages of the model concurrently. For details, please visit PiPPy's GitHub page.
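The micro-batch pipelining idea can be sketched in plain Python. This is a conceptual toy showing a GPipe-style forward schedule, not PiPPy's actual API; the stage and micro-batch counts are arbitrary:

```python
# Toy illustration of pipelined micro-batch execution: with S stages,
# stage s processes micro-batch m at time step s + m, so different
# micro-batches occupy different stages concurrently.
# This is NOT PiPPy's API -- just the scheduling idea it automates.

def pipeline_schedule(num_stages: int, num_microbatches: int):
    """Return, per time step, the (stage, microbatch) pairs running concurrently."""
    steps = []
    for t in range(num_stages + num_microbatches - 1):
        running = [(s, t - s) for s in range(num_stages)
                   if 0 <= t - s < num_microbatches]
        steps.append(running)
    return steps

if __name__ == "__main__":
    # Two pipeline stages, four micro-batches: steps 1-3 run both
    # stages at once, which is where the pipelining speedup comes from.
    for t, running in enumerate(pipeline_schedule(2, 4)):
        print(f"step {t}: {running}")
```

Note that the pipeline fills and drains: the first and last steps run only one stage, while the middle steps keep every stage busy.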
Download files
Download the file for your platform.
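Alternatively, the package can be installed directly from PyPI with pip (assuming pip and a compatible Python 3 environment):

```shell
# Install the version of the wheel described on this page
pip install torchpippy==0.2.0
```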
Built Distribution
torchpippy-0.2.0-py3-none-any.whl (79.3 kB)
File details
Details for the file torchpippy-0.2.0-py3-none-any.whl.
File metadata
- Download URL: torchpippy-0.2.0-py3-none-any.whl
- Upload date:
- Size: 79.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1908beff329a3d9af99e9fa8e1aecbaa110d16cba2d337bebbafee23c8ffc7c8 |
| MD5 | 5b09fd4141fe9b75316e0695298896f6 |
| BLAKE2b-256 | 2a2b794973e7d5619aa934b2a5bbf657fc9686030e7f0311ffe582c4bf64b1ed |
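A downloaded wheel can be checked against the published SHA256 digest with only the standard library; a minimal sketch (the file path in the usage comment is hypothetical):

```python
import hashlib

# SHA256 digest published for torchpippy-0.2.0-py3-none-any.whl
EXPECTED_SHA256 = "1908beff329a3d9af99e9fa8e1aecbaa110d16cba2d337bebbafee23c8ffc7c8"

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so large wheels are not read into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str) -> bool:
    """True if the file at `path` matches the published digest."""
    return sha256_of_file(path) == EXPECTED_SHA256

# Usage:
# verify("torchpippy-0.2.0-py3-none-any.whl")
```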