dstack is an open-source orchestration engine for running AI workloads on any cloud or on-premises.
Project description
dstack is a streamlined alternative to Kubernetes and Slurm, specifically designed for AI. It simplifies container orchestration for AI workloads both in the cloud and on-prem, speeding up the development, training, and deployment of AI models.
dstack is easy to use with any cloud provider as well as on-prem servers.
Accelerators
dstack supports NVIDIA GPU, AMD GPU, and Google Cloud TPU out of the box.
Major news ✨
- [2024/10] dstack 0.18.17: on-prem AMD GPUs, AWS EFA, and more
- [2024/08] dstack 0.18.11: AMD, encryption, and more
- [2024/08] dstack 0.18.10: Control plane UI
- [2024/07] dstack 0.18.7: Fleets, RunPod volumes, dstack apply, and more
- [2024/05] dstack 0.18.4: Google Cloud TPU, and more
- [2024/05] dstack 0.18.2: On-prem clusters, private subnets, and more
Installation
Before using
dstack
through CLI or API, set up adstack
server. If you already have a runningdstack
server, you only need to set up the CLI.
(Optional) Configure backends
To use dstack with your own cloud accounts, create the ~/.dstack/server/config.yml file and configure backends. Alternatively, you can configure backends via the control plane UI after you start the server.
You can skip backend configuration if you intend to run containers only on your on-prem servers; use SSH fleets for that. A sketch of a backend configuration is shown below.
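For example, a minimal ~/.dstack/server/config.yml enabling a single cloud backend might look like the sketch below. The backend type and credentials are illustrative assumptions and should match your own cloud account:

```yaml
# ~/.dstack/server/config.yml — a minimal sketch, not a complete reference;
# the AWS backend and default credentials below are illustrative assumptions
projects:
  - name: main
    backends:
      - type: aws
        creds:
          type: default  # use the default credential chain on the server host
```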
Start the server
Once the backends are configured, proceed to start the server:
$ pip install "dstack[all]" -U
$ dstack server
Applying ~/.dstack/server/config.yml...
The admin token is "bbae0f28-d3dd-4820-bf61-8f4bb40815da"
The server is running at http://127.0.0.1:3000/
For more details on server configuration options, see the server deployment guide.
Set up the CLI
To point the CLI to the dstack server, configure it with the server address, user token, and project name:
$ pip install dstack
$ dstack config --url http://127.0.0.1:3000 \
--project main \
--token bbae0f28-d3dd-4820-bf61-8f4bb40815da
Configuration is updated at ~/.dstack/config.yml
How does it work?
1. Define configurations
dstack supports the following configurations:
- Dev environments — for interactive development using a desktop IDE
- Tasks — for scheduling jobs (incl. distributed jobs) or running web apps
- Services — for deployment of models and web apps (with auto-scaling and authorization)
- Fleets — for managing cloud and on-prem clusters
- Volumes — for managing persisted volumes
- Gateways — for configuring the ingress traffic and public endpoints
Configurations are defined as YAML files within your repo.
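As an illustration, a task configuration might look like the sketch below; the file name, commands, and resource values are placeholders rather than required values:

```yaml
# .dstack.yml — a hedged sketch of a task configuration;
# the commands and resources below are placeholders, adjust to your workload
type: task
name: train

python: "3.11"
commands:
  - pip install -r requirements.txt
  - python train.py

resources:
  gpu: 24GB
```

Such a file is then applied with the dstack apply command described in the next step.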
2. Apply configurations
Apply the configuration either via the dstack apply CLI command or through a programmatic API.
dstack automatically manages provisioning, job queuing, auto-scaling, networking, volumes, run failures, out-of-capacity errors, port forwarding, and more, across clouds and on-prem clusters.
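For on-prem clusters, hosts are typically described as an SSH fleet. The sketch below assumes password-less SSH access to the hosts; the user, identity file, and host addresses are placeholders:

```yaml
# fleet.dstack.yml — a hedged sketch of an SSH fleet configuration;
# the user, identity file, and host IPs below are placeholders
type: fleet
name: on-prem-fleet

ssh_config:
  user: ubuntu
  identity_file: ~/.ssh/id_rsa
  hosts:
    - 192.168.1.10
    - 192.168.1.11
```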
More information
For additional information and examples, see the dstack documentation and examples.
Contributing
You're very welcome to contribute to dstack.
Learn more about how to contribute to the project at CONTRIBUTING.md.
License
Download files
Download the file for your platform.
- Source Distribution: dstack-0.18.26.tar.gz
- Built Distribution: dstack-0.18.26-py3-none-any.whl
File details
Details for the file dstack-0.18.26.tar.gz.
File metadata
- Download URL: dstack-0.18.26.tar.gz
- Upload date:
- Size: 15.1 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ef991e34bbca52a7f4f62cc86c9a37c4885fc5f3d991d9221fb2ff695f150e11 |
| MD5 | ae256d3ede19b265afcff75a09312018 |
| BLAKE2b-256 | 0685b12a7482412a7dbb7dc0ccb45783f0f201e9869e3623641d19da8c38604d |
File details
Details for the file dstack-0.18.26-py3-none-any.whl.
File metadata
- Download URL: dstack-0.18.26-py3-none-any.whl
- Upload date:
- Size: 15.3 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | d6ebd255987a7f8d84c6d5a59989e77bcf9ca781f54eef694ecbaf8c451418dd |
| MD5 | 5f59b00edde0584e2ee92a5399d7f634 |
| BLAKE2b-256 | 0d414416855fd66ad0e9dd850a9bd8dabd1c113a742b01fe5bf8fc59973a7b90 |