
dstack is an open-source orchestration engine for running AI workloads on any cloud or on-premises.

Project description

dstack is an open-source container orchestration engine designed for running AI workloads across any cloud or data center. It streamlines dev environments, running tasks on clusters, and deploying services.

The supported cloud providers include AWS, GCP, Azure, OCI, Lambda, TensorDock, Vast.ai, RunPod, and CUDO. You can also use dstack to run workloads on on-prem clusters.

dstack natively supports NVIDIA GPU and Google Cloud TPU accelerator chips.

Installation

Before using dstack through the CLI or API, set up a dstack server.

Install the server

The easiest way to install the server is via pip:

pip install "dstack[all]" -U

Configure backends

If you have default AWS, GCP, Azure, or OCI credentials on your machine, the dstack server will pick them up automatically.

Otherwise, you need to manually specify the cloud credentials in ~/.dstack/server/config.yml.

See the server/config.yml reference for details on how to configure backends for all supported cloud providers.
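
As an illustration, a minimal ~/.dstack/server/config.yml for an AWS backend with access-key credentials might look roughly like the sketch below; the key values are placeholders, and the reference above documents the authoritative schema:

projects:
  - name: main
    backends:
      - type: aws
        creds:
          type: access_key
          access_key: <your AWS access key>  # placeholder
          secret_key: <your AWS secret key>  # placeholder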

Start the server

To start the server, use the dstack server command:

$ dstack server

Applying ~/.dstack/server/config.yml...

The admin token is "bbae0f28-d3dd-4820-bf61-8f4bb40815da"
The server is running at http://127.0.0.1:3000/

Note: It's also possible to run the server via Docker.
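
As a rough sketch, assuming the dstackai/dstack image and the default port 3000, the invocation looks along these lines (check the server documentation for the exact image tag and volume layout):

$ docker run -p 3000:3000 -v $HOME/.dstack/server/:/root/.dstack/server dstackai/dstack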

CLI & API

Once the server is up, you can use either dstack's CLI or API to run workloads. The sections below walk through the main configuration types.

Dev environments

You specify the required environment and resources, then run it. dstack provisions the dev environment in the cloud and enables access via your desktop IDE.
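
As an illustration, a minimal dev environment configuration (a .dstack.yml file) might look like the sketch below; the Python version, IDE, and GPU size are example values, not defaults:

type: dev-environment
python: "3.11"
ide: vscode     # open the environment in VS Code on your desktop
resources:
  gpu: 24GB     # example resource requirement

The configuration is then launched with the dstack run CLI command; see the CLI reference for the exact arguments.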

Tasks

Tasks allow for convenient scheduling of any kind of batch job, such as training, fine-tuning, or data processing, as well as running web applications.

Specify the environment and resources, then run it. dstack executes the task in the cloud, enabling port forwarding to your local machine for convenient access.
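
For instance, a task configuration for a training script could look roughly like this; the commands, port, and GPU size are placeholders for your own project:

type: task
python: "3.11"
commands:
  - pip install -r requirements.txt  # hypothetical dependencies
  - python train.py                  # hypothetical training script
ports:
  - 6006                             # e.g. forward TensorBoard to your machine
resources:
  gpu: 80GB                          # example resource requirement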

Services

Services make it easy to deploy any kind of model or web application as a public endpoint.

Use any serving framework and specify the required resources. dstack deploys it in the configured backend, handles authorization, and provides an OpenAI-compatible interface if needed.
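
As a sketch, a service built on an off-the-shelf inference image might be configured along these lines; the image, model, port, and resources are illustrative assumptions rather than defaults:

type: service
image: ghcr.io/huggingface/text-generation-inference:latest
env:
  - MODEL_ID=mistralai/Mistral-7B-Instruct-v0.2  # example model
commands:
  - text-generation-launcher --port 8000
port: 8000
resources:
  gpu: 24GB  # example resource requirement

The OpenAI-compatible interface is enabled through an additional model mapping in the configuration; see the services documentation for the exact fields.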

Pools

Pools simplify managing the lifecycle of cloud instances and enable their efficient reuse across runs.

You can have instances provisioned in the cloud automatically, or add them manually, configuring the required resources, idle duration, etc.
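
As a rough sketch, an instance might be added to a pool from the CLI along these lines; the flags shown are assumptions based on the pools documentation, so check dstack pool --help for the exact options:

$ dstack pool add --gpu 24GB --idle-duration 1d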

Examples

Browse the examples section of the documentation for featured examples.

More information

For additional information and examples, see the dstack documentation.

Contributing

We welcome contributions to dstack! To learn more about getting involved in the project, please refer to CONTRIBUTING.md.

License

Mozilla Public License 2.0


Download files

Download the file for your platform.

Source Distribution

dstack-0.18.5.tar.gz (292.5 kB)

Uploaded Source

Built Distribution

dstack-0.18.5-py3-none-any.whl (440.8 kB)

Uploaded Python 3

File details

Details for the file dstack-0.18.5.tar.gz.

File metadata

  • Download URL: dstack-0.18.5.tar.gz
  • Upload date:
  • Size: 292.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.19

File hashes

Hashes for dstack-0.18.5.tar.gz

  Algorithm    Hash digest
  SHA256       4f1740c85ba9e41f4f352759032b28272917a504f0f9f1adf155e4363f5f96ad
  MD5          7182800f320a8f0fcf04065d8a740b72
  BLAKE2b-256  feddecd7d05a9648f66e3f49944a9ea5233ce6ce93abcf9325d40abe2f16ff01


File details

Details for the file dstack-0.18.5-py3-none-any.whl.

File metadata

  • Download URL: dstack-0.18.5-py3-none-any.whl
  • Upload date:
  • Size: 440.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.19

File hashes

Hashes for dstack-0.18.5-py3-none-any.whl

  Algorithm    Hash digest
  SHA256       c44e108221f865b8c407c45ad4795073fc3068bdc63f3cd1a329f4e5a28fdcb0
  MD5          db2cca3c59239727c241ba7d61d39715
  BLAKE2b-256  a8334a63842ea67e3e7f309f0ed67084a562049b93e465fe8094d768b0e881f8

