OpenPilot: An intercloud broker for the clouds
Project description
Run LLMs and AI on Any Cloud
:fire: News :fire:
- [July, 2023] Self-Hosted LLaMA 2 Chatbot on Any Cloud: example
- [June, 2023] Serving LLM 24x Faster On the Cloud with vLLM and SkyPilot: example, blog post
- [June, 2023] Two new clouds supported: Samsung SCP and Oracle OCI!
- [April, 2023] SkyPilot YAMLs released for finetuning & serving the Vicuna model with a single command!
- [March, 2023] Vicuna LLM chatbot trained using SkyPilot for $300 on spot instances!
- [March, 2023] Serve your own LLaMA LLM chatbot (not finetuned) on any cloud: example, repo
SkyPilot is a framework for running LLMs, AI, and batch jobs on any cloud, offering maximum cost savings, highest GPU availability, and managed execution.
SkyPilot abstracts away cloud infra burdens:
- Launch jobs & clusters on any cloud
- Easy scale-out: queue and run many jobs, automatically managed
- Easy access to object stores (S3, GCS, R2)
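For example, the first two points map onto a handful of CLI commands. A minimal sketch, where mycluster and my_task.yaml are placeholder names:
sky launch -c mycluster my_task.yaml   # provision a cluster on the cheapest eligible cloud and run the task
sky exec mycluster my_task.yaml        # queue another job on the same cluster
sky queue mycluster                    # view the cluster's job queue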
SkyPilot maximizes GPU availability for your jobs:
- Provision in all zones/regions/clouds you have access to (the Sky), with automatic failover
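As a rough illustration (the cluster name and GPU count below are placeholders), a single request is retried across every location you have enabled until capacity is found:
sky launch -c a100-cluster --gpus A100:8 my_task.yaml   # fails over across zones, regions, and clouds on capacity errors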
SkyPilot cuts your cloud costs:
- Managed Spot: 3-6x cost savings using spot VMs, with auto-recovery from preemptions
- Optimizer: 2x cost savings by auto-picking the cheapest VM/zone/region/cloud
- Autostop: hands-free cleanup of idle clusters
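A minimal sketch of these cost-saving knobs, assuming the sky spot launch and sky autostop subcommands shipped with this release (the job and cluster names are placeholders):
sky spot launch -n my-spot-job my_task.yaml   # managed spot job with automatic recovery from preemptions
sky autostop mycluster -i 10                  # stop the cluster after 10 idle minutes
The optimizer needs no extra flags: every sky launch prints a table of candidate clouds/regions with estimated hourly cost before provisioning.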
SkyPilot supports your existing GPU, TPU, and CPU workloads, with no code changes.
Install with pip or from source:
pip install "skypilot[aws,gcp,azure,ibm,oci,scp,lambda]" # choose your clouds
Currently supported providers: AWS, Azure, GCP, Lambda Cloud, IBM, Samsung SCP, OCI, and Cloudflare (R2).
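After installing, a quick way to confirm which of these clouds your local credentials enable is the built-in checker:
sky check   # verifies cloud credentials and lists the enabled clouds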
Getting Started
You can find our documentation here.
SkyPilot in 1 Minute
A SkyPilot task specifies: resource requirements, data to be synced, setup commands, and the task commands.
Once written in this unified interface (YAML or Python API), the task can be launched on any available cloud. This avoids vendor lock-in, and allows easily moving jobs to a different provider.
Paste the following into a file my_task.yaml:
resources:
  accelerators: V100:1  # 1x NVIDIA V100 GPU

num_nodes: 1  # Number of VMs to launch

# Working directory (optional) containing the project codebase.
# Its contents are synced to ~/sky_workdir/ on the cluster.
workdir: ~/torch_examples

# Commands to be run before executing the job.
# Typical use: pip install -r requirements.txt, git clone, etc.
setup: |
  pip install torch torchvision

# Commands to run as a job.
# Typical use: launch the main program.
run: |
  cd mnist
  python main.py --epochs 1
Prepare the workdir by cloning:
git clone https://github.com/pytorch/examples.git ~/torch_examples
Launch with sky launch (note: access to GPU instances is needed for this example):
sky launch my_task.yaml
SkyPilot then performs the heavy lifting for you, including:
- Find the lowest priced VM instance type across different clouds
- Provision the VM, with auto-failover if the cloud returned capacity errors
- Sync the local workdir to the VM
- Run the task's setup commands to prepare the VM for running the task
- Run the task's run commands
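After the launch returns, a few follow-up commands cover the rest of the cluster lifecycle. A sketch, with mycluster standing in for the name shown by sky status:
sky status           # list clusters and their states
sky logs mycluster 1 # tail the logs of job 1 on the cluster
sky stop mycluster   # stop the cluster (its disk is preserved)
sky down mycluster   # terminate the cluster and release all resources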
Refer to Quickstart to get started with SkyPilot.
More Information
To learn more, see our Documentation and Tutorials.
Runnable examples:
- LLMs on SkyPilot
- Self-Hosted LLaMA 2 Chatbot
- Vicuna chatbots: Training & Serving (from official Vicuna team)
- vLLM: Serving LLM 24x Faster On the Cloud (from official vLLM team)
- QLoRA
- LLaMA-LoRA-Tuner
- Tabby: Self-hosted AI coding assistant
- LocalGPT
- Add yours here & see more in llm/!
- Framework examples: PyTorch DDP, DeepSpeed, JAX/Flax on TPU, Stable Diffusion, Detectron2, Distributed TensorFlow, programmatic grid search, Docker, and many more (examples/).
Read the research:
- SkyPilot paper and talk (NSDI 2023)
- Sky Computing whitepaper
- Sky Computing vision paper (HotOS 2021)
Support and Questions
We are excited to hear your feedback!
- For issues and feature requests, please open a GitHub issue.
- For questions, please use GitHub Discussions.
For general discussions, join us on the SkyPilot Slack.
Contributing
We welcome and value all contributions to the project! Please refer to CONTRIBUTING for how to get involved.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file opilot-1.0.0.dev0.tar.gz.
File metadata
- Download URL: opilot-1.0.0.dev0.tar.gz
- Upload date:
- Size: 556.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | d109a387a6db98f778a508acde7fc73add8fd3230d3fb324ef38005541af3b3d
MD5 | 4130144099057f7cf4abb06b0990ea13
BLAKE2b-256 | 9dbd09fa432629f9fb694bed5dd95e1539a8fa57944ee31a8596324666dc3419
File details
Details for the file opilot-1.0.0.dev0-py3-none-any.whl.
File metadata
- Download URL: opilot-1.0.0.dev0-py3-none-any.whl
- Upload date:
- Size: 614.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | f75bd44d6c53934a775aaed7c9c150f797e4f08361a8978e512f6b84b0c4876f
MD5 | aff555f1362caa9f6ed7e5727735a561
BLAKE2b-256 | ee7b191d471d17309d2c588e7e439f76e0f5daaef0fab4013c528e83ff2c74c5