dstack is an open-source orchestration engine for running AI workloads on any cloud or on-premises.

Project description

dstack is a lightweight alternative to Kubernetes, designed specifically for managing the development, training, and deployment of AI models at any scale.

dstack is easy to use with any cloud provider (AWS, GCP, Azure, OCI, Lambda, TensorDock, Vast.ai, RunPod, etc.) or with on-prem clusters.

If you already use Kubernetes, you can use dstack with it as well.

Accelerators

dstack supports NVIDIA GPUs, AMD GPUs, and Google Cloud TPUs out of the box.

Installation

Before using dstack via the CLI or API, set up a dstack server.

1. Configure backends

If you want the dstack server to run containers or manage clusters in your cloud accounts (or use Kubernetes), create the ~/.dstack/server/config.yml file and configure backends.
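As a sketch, a minimal config.yml that enables a single backend for a project might look roughly like the following; the project name, backend type, and credentials are placeholders, and the exact schema may vary by dstack version (see the server configuration reference):

# ~/.dstack/server/config.yml (sketch; schema may vary by dstack version)
projects:
- name: main
  backends:
  - type: aws
    creds:
      type: access_key
      access_key: <YOUR_ACCESS_KEY>
      secret_key: <YOUR_SECRET_KEY>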

2. Start the server

Once the ~/.dstack/server/config.yml file is configured, proceed to start the server:

$ pip install "dstack[all]" -U
$ dstack server

Applying ~/.dstack/server/config.yml...

The admin token is "bbae0f28-d3dd-4820-bf61-8f4bb40815da"
The server is running at http://127.0.0.1:3000/

Note It's also possible to run the server via Docker.
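For example, a rough sketch of running the server via Docker, assuming the dstackai/dstack image and the default port (check the server documentation for the exact image tag and options):

$ docker run -p 3000:3000 \
    -v $HOME/.dstack/server/:/root/.dstack/server \
    dstackai/dstack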

The dstack server can run anywhere: on your laptop, a dedicated server, or in the cloud. Once it's up, you can use either the CLI or the API.

3. Set up the CLI

To point the CLI to the dstack server, configure it with the server address, user token, and project name:

$ pip install dstack
$ dstack config --url http://127.0.0.1:3000 \
    --project main \
    --token bbae0f28-d3dd-4820-bf61-8f4bb40815da
    
Configuration is updated at ~/.dstack/config.yml

4. Create on-prem fleets

If you want the dstack server to run containers on your on-prem servers, use fleets.
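As a sketch, an SSH fleet that points dstack at existing on-prem hosts might look roughly like this; the user, identity file, and host addresses are placeholders (see the fleets documentation for the exact schema):

# fleet.dstack.yml (sketch)
type: fleet
name: on-prem-fleet
ssh_config:
  user: ubuntu
  identity_file: ~/.ssh/id_rsa
  hosts:
    - 192.0.2.10
    - 192.0.2.11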

How does it work?

Before using dstack, install the server and configure backends as described above.

1. Define configurations

dstack supports the following configurations:

  • Dev environments — for interactive development using a desktop IDE
  • Tasks — for scheduling jobs (incl. distributed jobs) or running web apps
  • Services — for deployment of models and web apps (with auto-scaling and authorization)
  • Fleets — for managing cloud and on-prem clusters
  • Volumes — for managing persistent volumes
  • Gateways — for configuring ingress traffic and public endpoints

Configurations are defined as YAML files within your repo.
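For example, a minimal task configuration (e.g. a .dstack.yml file in your repo) might look roughly like this; the Python version, commands, and GPU requirement are illustrative rather than prescriptive:

# .dstack.yml (sketch)
type: task
name: train
python: "3.11"
commands:
  - pip install -r requirements.txt
  - python train.py
resources:
  gpu: 24GB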

2. Apply configurations

Apply the configuration either via the dstack apply CLI command or through a programmatic API.
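Continuing the sketch above, applying the task configuration from the CLI might look like this, assuming it is saved as .dstack.yml (output omitted):

$ dstack apply -f .dstack.yml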

dstack automatically manages provisioning, job queuing, auto-scaling, networking, volumes, run failures, out-of-capacity errors, port-forwarding, and more — across clouds and on-prem clusters.

More information

For additional information and examples, see the dstack documentation and the examples in the repository.

Contributing

Contributions to dstack are very welcome. See CONTRIBUTING.md to learn how to get started.

License

Mozilla Public License 2.0


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dstack-0.18.11.tar.gz (14.9 MB, Source)

Built Distribution

dstack-0.18.11-py3-none-any.whl (15.1 MB, Python 3)

File details

Details for the file dstack-0.18.11.tar.gz.

File metadata

  • Download URL: dstack-0.18.11.tar.gz
  • Upload date:
  • Size: 14.9 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for dstack-0.18.11.tar.gz

  • SHA256: 21a64e1d2298a5586be5f930399f959a3560ca9f5f935a5a67530e12a55c3743
  • MD5: f6851c0d9e4166df3e4ebcf7170b0703
  • BLAKE2b-256: a8b68c51c9e7a5e9606f00941e6068aed8583922911130e36f748369e9683ac5


File details

Details for the file dstack-0.18.11-py3-none-any.whl.

File metadata

  • Download URL: dstack-0.18.11-py3-none-any.whl
  • Upload date:
  • Size: 15.1 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for dstack-0.18.11-py3-none-any.whl

  • SHA256: 9d1f1f8dcba11227abf98058c6e883a37508dbdf43518cfd6f1cbc872839fda2
  • MD5: c74de49d6740e1214aa77c668ee58370
  • BLAKE2b-256: 3f94e90d0f5d4745055f477ccb4dcdc86265bceaac0f3c2b03d7cc2564f650d2

