Project description
Red Coast (redco) is a lightweight and user-friendly tool that automates distributed training and inference for large models and simplifies ML pipeline development, without requiring MLSys expertise from users.
Check out our Tech Report for details! There is also a Quick Tutorial to help you become an expert in distributed training with Redco in just a few minutes!
- Redco allows for the simple implementation of distributed training and inference, eliminating the need for additional coding effort or complex configuration, while achieving efficiency comparable to the most advanced model-parallel tools.
- Redco enables customization of arbitrary ML pipelines within three functions, eliminating repetitive and boilerplate code such as multi-host processing (see the sketch below). We demonstrate that this mechanism is widely applicable to various ML algorithms.
- The backend of Redco is based on JAX, but users don't need to be JAX experts. Knowing numpy is good enough!
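To make the "three functions" idea concrete, below is a minimal, hypothetical sketch of what such user-defined functions could look like for a toy regression pipeline. The names `collate_fn`, `loss_fn`, and `pred_fn` and their signatures are illustrative assumptions, not the exact Redco API; see the Quick Tutorial for the real interface.

```python
import numpy as np
import jax.numpy as jnp

# Illustrative sketch only: the exact signatures Redco expects may differ.
# The point is that each piece is written with plain numpy-style operations.

def collate_fn(examples):
    # Turn a list of raw examples into a batch of arrays.
    return {
        'inputs': np.stack([ex['input'] for ex in examples]),
        'labels': np.array([ex['label'] for ex in examples]),
    }

def loss_fn(params, batch, apply_fn):
    # Scalar training loss, written with jax.numpy (numpy-compatible) ops.
    preds = apply_fn(params, batch['inputs']).squeeze(-1)
    return jnp.mean((preds - batch['labels']) ** 2)

def pred_fn(params, batch, apply_fn):
    # Per-batch predictions; distributing them across hosts/devices is left to the tool.
    return apply_fn(params, batch['inputs'])

if __name__ == '__main__':
    # Tiny linear model to show the three functions run end to end.
    apply_fn = lambda params, x: x @ params['w']
    params = {'w': jnp.ones((4, 1))}
    batch = collate_fn([{'input': np.ones(4), 'label': 1.0} for _ in range(8)])
    print(loss_fn(params, batch, apply_fn))        # 9.0
    print(pred_fn(params, batch, apply_fn).shape)  # (8, 1)
```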
Installation
Install RedCoast
pip install redco
Adjust JAX to the GPU/TPU version
The command above automatically installs the CPU version of JAX, so JAX needs to be adjusted to match your device. For example, on GPUs:
# for cuda-12.x
pip install --upgrade "jax[cuda12]"
# for cuda-11.x
pip install --upgrade "jax[cuda11_pip]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
If you are using TPU/CPU/AMD/Apple, see here for corresponding installation commands.
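As a quick sanity check after installation, you can confirm that JAX actually sees your accelerators. This uses only standard JAX calls and is independent of Redco:

```python
import jax

# Should list GPU/TPU devices rather than only CPU if the
# accelerated JAX build was installed correctly.
print(jax.devices())
print(jax.local_device_count(), "local devices on this host")
```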
Examples
Examples across a set of paradigms can be found in examples/, including
- Classification/regression (GLUE & MNIST)
- Federated learning (FedAvg)
- Image to text (Image captioning)
- Language modeling (Instruction Tuning of LLMs)
- Meta learning (MAML)
- Reinforcement learning (PPO & DDPG)
- Text to image (StableDiffusion)
- Text to text (Seq2seq)
Exemplar large model settings
The table below shows LLM finetuning settings that are runnable on different kinds of servers. Numbers inside the brackets are the maximum sequence length used in training. All settings use full precision (fp32) and the Adam optimizer.
2 $\times$ 1080Ti (2 $\times$ 10G) | 4 $\times$ A100 (4 $\times$ 40G) | 2 $\times$ TPU-v4 (2 hosts $\times$ 4 chips $\times$ 32G) | 16 $\times$ TPU-v4 (16 hosts $\times$ 4 chips $\times$ 32G)
---|---|---|---
BART-Large (1024) | LLaMA-7B (1024) | T5-XL-11B (512) | OPT-66B (512)
GPT2-Large (512) | GPT-J-6B (1024) | OPT-13B (1024) |
Go to examples/language_modeling and examples/text_to_text to try them out!
Reference
We now have a paper you can cite for the Red Coast library:
RedCoast: A Lightweight Tool to Automate Distributed Training of LLMs on Any GPU/TPUs
Bowen Tan, Yun Zhu, Lijuan Liu, Hongyi Wang, Yonghao Zhuang, Jindong Chen, Eric Xing, Zhiting Hu
NAACL 2024, Demo
MLSys Workshop @ NeurIPS 2023
@article{tan2023redco,
  title={RedCoast: A Lightweight Tool to Automate Distributed Training of LLMs on Any GPU/TPUs},
  author={Tan, Bowen and Zhu, Yun and Liu, Lijuan and Wang, Hongyi and Zhuang, Yonghao and Chen, Jindong and Xing, Eric and Hu, Zhiting},
  journal={arXiv preprint arXiv:2310.16355},
  year={2023}
}
Acknowledgement
The name of this package is inspired by Red Coast Base, a key location in the story of The Three-Body Problem. From Red Coast Base, humanity broadcast its first message into the vast universe. We thank Cixin Liu for such a masterpiece!
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file redco-0.4.17.tar.gz.
File metadata
- Download URL: redco-0.4.17.tar.gz
- Upload date:
- Size: 20.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.25.1 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | b82d9c769ff662df4d0c8dbd7718e8b13af5a4708dd94fc3badb195d0949e18f
MD5 | 986252bbf59b6559e4133af4e7b8c318
BLAKE2b-256 | d2368612f27e6c3f6c28825066032023a282e346d19c199e78426457d71cee18
File details
Details for the file redco-0.4.17-py3-none-any.whl.
File metadata
- Download URL: redco-0.4.17-py3-none-any.whl
- Upload date:
- Size: 26.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.25.1 rfc3986/1.5.0 tqdm/4.57.0 urllib3/1.26.5 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | bdcb41e21e1ab69960ccee685aec73c572c384f47c62aa69a01fabb4927b2542
MD5 | e1467cc6989eaa219b46861b9952cf65
BLAKE2b-256 | fce80c16e7bc977d84b7bff9bab27bfc686e896d53ae620c6893469229bbdede