Alpa automatically parallelizes large tensor computation graphs and runs them on a distributed cluster.
Project description
Alpa
Alpa is a system for training large-scale neural networks. Scaling neural networks to hundreds of billions of parameters has enabled dramatic breakthroughs such as GPT-3, but training these large-scale neural networks requires complicated distributed training techniques. Alpa aims to automate large-scale distributed training with just a few lines of code.
The key features of Alpa include:
💻 Automatic Parallelization. Alpa automatically parallelizes users' single-device code on distributed clusters with data, operator, and pipeline parallelism (see the sketch after this list).
🚀 Excellent Performance. Alpa achieves linear scaling on training models with billions of parameters on distributed clusters.
✨ Tight Integration with Machine Learning Ecosystem. Alpa is backed by open-source, high-performance, and production-ready libraries such as Jax, XLA, and Ray.
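To make the first point concrete, below is roughly what the data-parallel part of that automation replaces when written by hand in plain JAX with pmap. The toy model, shapes, and names are illustrative only and use no Alpa API; Alpa's @alpa.parallelize picks and combines such strategies (together with operator and pipeline parallelism) automatically.

    import functools
    import jax
    import jax.numpy as jnp

    def loss_fn(w, x, y):
        # Toy linear model, purely for illustration.
        return jnp.mean((x @ w - y) ** 2)

    # Hand-written data parallelism: replicate the step over local devices,
    # give each device one shard of the batch, and average the gradients.
    @functools.partial(jax.pmap, axis_name="batch")
    def data_parallel_grads(w, x, y):
        grads = jax.grad(loss_fn)(w, x, y)
        return jax.lax.pmean(grads, axis_name="batch")

    n_dev = jax.local_device_count()
    w = jnp.zeros((8, 1))
    w_replicated = jnp.stack([w] * n_dev)  # one copy of the weights per device
    x = jnp.ones((n_dev, 4, 8))            # batch split into per-device shards
    y = jnp.ones((n_dev, 4, 1))
    grads = data_parallel_grads(w_replicated, x, y)

Operator and pipeline parallelism take substantially more manual effort (sharding individual operators, scheduling micro-batches across stages), which is what Alpa's compiler automates behind the decorator.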
Quick Start
Use Alpa's decorator @parallelize to scale your single-device training code to distributed clusters.
import alpa
import jax.numpy as jnp
from jax import grad

# Parallelize the training step in Jax by simply using a decorator
@alpa.parallelize
def train_step(model_state, batch):
    def loss_func(params):
        out = model_state.forward(params, batch["x"])
        return jnp.mean((out - batch["y"]) ** 2)

    grads = grad(loss_func)(model_state.params)
    new_model_state = model_state.apply_gradient(grads)
    return new_model_state

# The training loop now automatically runs on your designated cluster
model_state = create_train_state()
for batch in data_loader:
    model_state = train_step(model_state, batch)
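The designated cluster here is a Ray cluster. A minimal setup sketch is shown below, assuming a Ray cluster has already been launched (for example with ray start --head on the head node); note that alpa.init(cluster="ray") is the initialization call documented for later Alpa releases and may differ in alpa 0.1.0.

    import ray
    import alpa

    # Attach this driver process to an already-running Ray cluster.
    ray.init(address="auto")

    # Ask Alpa to discover and manage the cluster's GPU devices.
    # Assumption: alpa.init(cluster="ray") follows the API of later Alpa
    # releases; the exact initialization may differ in version 0.1.0.
    alpa.init(cluster="ray")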
Check out the Alpa Documentation site for installation instructions, tutorials, examples, and more.
More Information
- Alpa paper (OSDI'22)
- Google AI Blog
- Alpa talk slides
Getting Involved
- Please read the contributor guide if you are interested in contributing to Alpa.
- Please connect to Alpa contributors via the Alpa Slack.
License
Alpa is licensed under the Apache-2.0 license.
Download files
Download the file for your platform.
Built Distributions
Hashes for alpa-0.1.0-cp39-cp39-manylinux2014_x86_64.whl

Algorithm | Hash digest
---|---
SHA256 | 2e9cc45944202c0e39e784330931aa98adfcb3176472674d8f18ae23a7e9e40c
MD5 | ce3f73b9f53c2cdba9e049e98b596eff
BLAKE2b-256 | 70edffcb66ac26a0355f1e9fa567657b8772574b3fdc25269ca767f5358936b0
Hashes for alpa-0.1.0-cp38-cp38-manylinux2014_x86_64.whl

Algorithm | Hash digest
---|---
SHA256 | 4d47e1894545670f674c8b4370403b53009c14bc312f838151920306d5006141
MD5 | c500b3b2a76121c0d8288e06584e7cb5
BLAKE2b-256 | 391e76164a95825816589ef88c780f55a5d3053d19488135b541537f504ac3c7
Hashes for alpa-0.1.0-cp37-cp37m-manylinux2014_x86_64.whl

Algorithm | Hash digest
---|---
SHA256 | ff66900c9806054fc0f6c10e9386a276f11420eb9102ac504b3dc77caf457e7e
MD5 | 5ef1a9f7e3cf37202748342673ed1a55
BLAKE2b-256 | f35aef55dd2f0607864989364be1dc1261d10d67e9d5fd092aae317e402fc9eb
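The digests above can be used to check that a downloaded file arrived intact. A small verification example using Python's standard hashlib module, with the file name and SHA256 value taken from the cp39 wheel entry above:

    import hashlib

    path = "alpa-0.1.0-cp39-cp39-manylinux2014_x86_64.whl"
    expected_sha256 = "2e9cc45944202c0e39e784330931aa98adfcb3176472674d8f18ae23a7e9e40c"

    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    # A mismatch means the file is corrupted or has been tampered with.
    assert digest == expected_sha256, f"unexpected SHA256: {digest}"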