Alpa automatically parallelizes large tensor computation graphs and runs them on a distributed cluster.
Project description
Alpa
Alpa is a system for training large-scale neural networks. Scaling neural networks to hundreds of billions of parameters has enabled dramatic breakthroughs such as GPT-3, but training these large-scale neural networks requires complicated distributed training techniques. Alpa aims to automate large-scale distributed training with just a few lines of code.
The key features of Alpa include:
💻 Automatic Parallelization. Alpa automatically parallelizes users' single-device code on distributed clusters with data, operator, and pipeline parallelism.
🚀 Excellent Performance. Alpa achieves linear scaling when training models with billions of parameters on distributed clusters.
✨ Tight Integration with the Machine Learning Ecosystem. Alpa is backed by open-source, high-performance, and production-ready libraries such as Jax, XLA, and Ray.
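Because Alpa is backed by Ray, a typical workflow first connects to a Ray cluster before any parallelized function executes. The snippet below is a minimal sketch assuming the alpa.init entry point described in the Alpa documentation; the exact call may differ in this release.

import ray
import alpa

# Assumption: alpa.init(cluster="ray") attaches Alpa to a running Ray
# cluster so that functions decorated with @alpa.parallelize execute on
# the cluster's GPU workers.
ray.init(address="auto")   # connect to an existing Ray cluster
alpa.init(cluster="ray")   # hand the cluster's devices to Alpa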
Quick Start
Use Alpa's decorator @parallelize to scale your single-device training code to distributed clusters.
import alpa
import jax.numpy as jnp
from jax import grad

# Parallelize the training step in Jax by simply using a decorator
@alpa.parallelize
def train_step(model_state, batch):
    def loss_func(params):
        out = model_state.forward(params, batch["x"])
        return jnp.mean((out - batch["y"]) ** 2)

    grads = grad(loss_func)(model_state.params)
    new_model_state = model_state.apply_gradient(grads)
    return new_model_state

# The training loop now automatically runs on your designated cluster
model_state = create_train_state()
for batch in data_loader:
    model_state = train_step(model_state, batch)
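The Quick Start above relies on Alpa's fully automatic strategy. The Alpa documentation also describes passing an explicit parallelization method to the decorator; the sketch below assumes that method argument and the PipeshardParallel class, which may differ in this release.

import alpa
import jax.numpy as jnp
from jax import grad

# Assumed API: alpa.parallelize(method=...) selects the strategy explicitly.
# PipeshardParallel combines pipeline (inter-operator) parallelism with
# data/operator (intra-operator) parallelism across device meshes.
@alpa.parallelize(method=alpa.PipeshardParallel())
def pipelined_train_step(model_state, batch):
    def loss_func(params):
        out = model_state.forward(params, batch["x"])
        return jnp.mean((out - batch["y"]) ** 2)
    grads = grad(loss_func)(model_state.params)
    return model_state.apply_gradient(grads)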
Check out the Alpa Documentation site for installation instructions, tutorials, examples, and more.
More Information
- Alpa paper (OSDI'22)
- Google AI Blog
- Alpa talk slides
Getting Involved
- Please read the contributor guide if you are interested in contributing to Alpa.
- Please connect with Alpa contributors via the Alpa Slack.
License
Alpa is licensed under the Apache-2.0 license.
Download files
Download the file for your platform.
Source Distributions
Built Distributions
Hashes for alpa-0.1.1-cp39-cp39-manylinux2014_x86_64.whl
Algorithm | Hash digest
---|---
SHA256 | c77db43ed94a29871fc68243f6b97ef398072e7f43b31ebc2f24a13603b7936b
MD5 | d1115e39e524d4f2144d838dec85ce85
BLAKE2b-256 | cbc632e37c023714010be3e45fc553bbcd8ed415be59755bf2d40d6ffbb82f45
Hashes for alpa-0.1.1-cp38-cp38-manylinux2014_x86_64.whl
Algorithm | Hash digest
---|---
SHA256 | be35557dd8a70ce890f8d703910482b681a4663cc2e4dd5ffd665fd8568bd778
MD5 | 0883a6bb8ceb42aae4c56b7dfff58a06
BLAKE2b-256 | 67d329f1e9c9cc65384923df8edba8a0421a6f7ad3deee9e2713cc112e9b2b95
Hashes for alpa-0.1.1-cp37-cp37m-manylinux2014_x86_64.whl
Algorithm | Hash digest
---|---
SHA256 | 939056c0fb9a7fb019e33ceea7c6cd4271adf81ff0352019d4294ec99a1ed8c4
MD5 | 3c2d7a87f77df42a582a2521700d5139
BLAKE2b-256 | fb7190921ed6be1408a34ce02f6da99a70f948eed8989a5a5e334d3730d01718
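To check a downloaded wheel against the values above, recompute its digest locally. A minimal sketch using Python's standard hashlib; the filename shown is the cp39 wheel from the first table.

import hashlib

expected = "c77db43ed94a29871fc68243f6b97ef398072e7f43b31ebc2f24a13603b7936b"
with open("alpa-0.1.1-cp39-cp39-manylinux2014_x86_64.whl", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()
print("hash matches:", actual == expected)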