Project description

dstack

Orchestrate AI workloads in any cloud

Docs · Examples · Discord

dstack is an open-source container orchestration engine designed for AI workloads across any cloud or data center.

The supported cloud providers include AWS, GCP, Azure, Lambda, TensorDock, Vast.ai, CUDO, and RunPod. You can also use dstack to run workloads on on-prem servers.

Latest news ✨

Installation

Before using dstack through CLI or API, set up a dstack server.

Install the server

The easiest way to install the server is via pip:

pip install "dstack[all]" -U

Configure backends

If you have default AWS, GCP, or Azure credentials on your machine, the dstack server will pick them up automatically.

Otherwise, you need to manually specify the cloud credentials in ~/.dstack/server/config.yml.

See the server/config.yml reference for details on how to configure backends for all supported cloud providers.
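As an illustrative sketch, a minimal `~/.dstack/server/config.yml` with a single AWS backend might look like the following. The project name and credentials below are placeholders, not working values; consult the reference above for the exact fields each provider expects.

```yaml
# ~/.dstack/server/config.yml — illustrative sketch only
projects:
  - name: main                     # placeholder project name
    backends:
      - type: aws
        creds:
          type: access_key
          access_key: AKIA...            # placeholder access key ID
          secret_key: <your-secret-key>  # placeholder secret key
```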

Start the server

To start the server, use the dstack server command:

$ dstack server

Applying ~/.dstack/server/config.yml...

The admin token is "bbae0f28-d3dd-4820-bf61-8f4bb40815da"
The server is running at http://127.0.0.1:3000/

Note It's also possible to run the server via Docker.
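For instance, running the server via Docker might look like this. The image name, port, and volume path follow the dstack docs at the time of writing; double-check them against the official instructions before relying on them.

```shell
# Run the dstack server in a container, exposing its default port
# and persisting server state (including config.yml) on the host
docker run -p 3000:3000 \
  -v ~/.dstack/server/:/root/.dstack/server/ \
  dstackai/dstack
```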

CLI & API

Once the server is up, you can use either dstack's CLI or API to run workloads. Below is an overview of how it works with the CLI.

Dev environments

You specify the required environment and resources, then run it. dstack provisions the dev environment in the cloud and enables access via your desktop IDE.
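As a hedged sketch, a dev environment configuration might look like the following. The interpreter version, IDE, and GPU size are arbitrary choices for illustration; see the dstack docs for the full set of supported fields.

```yaml
type: dev-environment
python: "3.11"     # example interpreter version
ide: vscode        # open the environment in VS Code
resources:
  gpu: 24GB        # request a GPU with at least 24GB of memory
```

Such a configuration is typically launched with the `dstack run` CLI command, which provisions the environment and prints a link for attaching your IDE.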

Tasks

Tasks allow for convenient scheduling of any kind of batch jobs, such as training, fine-tuning, or data processing, as well as running web applications.

Specify the environment and resources, then run it. dstack executes the task in the cloud, enabling port forwarding to your local machine for convenient access.
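A task configuration might be sketched as follows. The training script and forwarded port are hypothetical stand-ins for whatever batch job you actually run.

```yaml
type: task
python: "3.11"
commands:
  - pip install -r requirements.txt
  - python train.py            # hypothetical training script
ports:
  - 6006                       # e.g., forward TensorBoard to localhost
resources:
  gpu: 24GB
```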

Services

Services make it easy to deploy any kind of model or web application as a public endpoint.

Use any serving frameworks and specify required resources. dstack deploys it in the configured backend, handles authorization, and provides an OpenAI-compatible interface if needed.
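As a sketch, a service configuration might look like this. The serving command is a stand-in; in practice you would run a real model server such as vLLM or TGI, and the exposed `port` becomes the public endpoint.

```yaml
type: service
python: "3.11"
commands:
  - python -m http.server 8000  # stand-in for an actual model server
port: 8000                      # the port dstack exposes as the endpoint
resources:
  gpu: 24GB
```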

Pools

Pools simplify managing the lifecycle of cloud instances and enable their efficient reuse across runs.

You can have instances provisioned in the cloud automatically, or add them manually, configuring the required resources, idle duration, etc.
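For illustration, adding an instance to a pool might look like the following. The flags are assumptions based on the description above (resources plus an idle duration); check `dstack pool add --help` for the actual options.

```shell
# Provision a cloud instance into the default pool,
# keeping it available for reuse until it has been idle for a day
dstack pool add --gpu 24GB --idle-duration 1d
```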

Examples

Here are some featured examples:

Browse the full list of examples for more.

More information

For additional information and examples, see the following links:

License

Mozilla Public License 2.0

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

chip_wallet_dstack-1.3.tar.gz (273.4 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

chip_wallet_dstack-1.3-py3-none-any.whl (427.0 kB)

Uploaded Python 3

File details

Details for the file chip_wallet_dstack-1.3.tar.gz.

File metadata

  • Download URL: chip_wallet_dstack-1.3.tar.gz
  • Upload date:
  • Size: 273.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.3

File hashes

Hashes for chip_wallet_dstack-1.3.tar.gz
Algorithm Hash digest
SHA256 41430b8fe89c544aee7de4cd6a803d2708499d10a6758274b571ff666a02bdd4
MD5 7ed34dad09aa1e30507b7ad83d68859e
BLAKE2b-256 cf9f1d18b5d4232090c931ff8380c8d467a6ebb546acc28de7da2b1d36054da0

See more details on using hashes here.

File details

Details for the file chip_wallet_dstack-1.3-py3-none-any.whl.

File metadata

File hashes

Hashes for chip_wallet_dstack-1.3-py3-none-any.whl
Algorithm Hash digest
SHA256 377c0385a1326b0f807d7121311568af08f1e36f05f609b3015a9084ea89a3be
MD5 afdb6091b420a2d5f461e16625f32213
BLAKE2b-256 adc6f78228908193ed71dd2a10ebd1cb671bf539d286083974b24f1f4a7e5c58

See more details on using hashes here.
