
A command-line utility to provision infrastructure for ML workflows

Project description

dstack is a lightweight command-line utility to provision infrastructure for ML workflows.

Features

  • Define your ML workflows declaratively, including their dependencies, environment, and required compute resources
  • Run workflows via the dstack CLI, with infrastructure provisioned automatically in a configured cloud account
  • Save output artifacts, such as data and models, and reuse them in other ML workflows
  • Use dstack to process data, train models, host apps, and launch dev environments

How does it work?

  1. Install dstack locally
  2. Define ML workflows in .dstack/workflows.yaml (within your existing Git repository)
  3. Run ML workflows via the dstack run CLI command
  4. Use other dstack CLI commands to manage runs, artifacts, etc.

When you run an ML workflow via the dstack CLI, it provisions the required compute resources (in a configured cloud account), sets up the environment (such as Python, Conda, and CUDA), fetches your code, downloads dependencies, saves output artifacts, and tears down the compute resources.

Installation

Use pip to install dstack locally:

pip install dstack

The dstack CLI needs your AWS account credentials to be configured locally (e.g. in ~/.aws/credentials or AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables).
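For example, a credentials file might look like the following (the key values below are placeholders for illustration, not real credentials):

```ini
# ~/.aws/credentials -- placeholder values, substitute your own
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFXEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```

Alternatively, export the same values as the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables.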

Before you can use the dstack CLI, you need to configure it:

dstack config

It will prompt you to select the AWS region where dstack will provision compute resources and the S3 bucket where dstack will save data.

? Choose AWS region
✓ Europe, Ireland [eu-west-1]
? Choose S3 bucket
✓ Default [dstack-142421590066-eu-west-1]
? Choose EC2 subnet
✓ Default [no preference]

Support for GCP and Azure is on the roadmap.

Usage example

Say you have a Python script that trains a model. It loads data from a local folder and saves checkpoints into another folder.
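Such a script might look roughly like this. It is only a hypothetical sketch: the training step is stubbed out, and the data/ and checkpoint/ folder names are illustrative assumptions that match the workflow definition below.

```python
# Hypothetical sketch of src/train.py. The actual model fitting is
# stubbed out; the sketch only illustrates the input/output layout:
# read inputs from a data folder, write checkpoints to another folder.
import json
from pathlib import Path


def train(data_dir="data", checkpoint_dir="checkpoint", epochs=5):
    data_files = sorted(Path(data_dir).glob("*"))  # load training data
    out = Path(checkpoint_dir)
    out.mkdir(parents=True, exist_ok=True)
    for epoch in range(epochs):
        # ... fit the model on data_files here ...
        state = {"epoch": epoch, "num_files": len(data_files)}
        (out / f"epoch-{epoch}.json").write_text(json.dumps(state))
    return out
```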

Now, to make it possible to run it via dstack, you have to create a .dstack/workflows.yaml file and define there how to run the script, where to load the data from, how to store output artifacts, and what compute resources are needed.

workflows: 
  - name: train
    provider: bash
    deps:
      - tag: mnist_data
    commands:
      - pip install -r requirements.txt
      - python src/train.py
    artifacts: 
      - path: checkpoint
    resources:
      interruptible: true
      gpu: 1

Now you can run it via the dstack CLI:

dstack run train

You'll see the output in real-time as your workflow is running.

Provisioning... It may take up to a minute. ✓

To interrupt, press Ctrl+C.

Epoch 4: 100%|██████████████| 1876/1876 [00:17<00:00, 107.85it/s, loss=0.0944, v_num=0, val_loss=0.108, val_acc=0.968]

`Trainer.fit` stopped: `max_epochs=5` reached.

Testing DataLoader 0: 100%|██████████████| 313/313 [00:00<00:00, 589.34it/s]

Test metric   DataLoader 0
val_acc       0.965399980545044
val_loss      0.10975822806358337

Use the dstack ps command to see the status of recent workflows.

dstack ps -a

RUN               TARGET    STATUS   ARTIFACTS   APPS  SUBMITTED    TAG
angry-elephant-1  download  Done     data              8 hours ago  mnist_data
wet-insect-1      train     Running  checkpoint        1 week ago

Other CLI commands let you manage runs, artifacts, tags, secrets, and more.

You can use dstack not only to process data and train models, but also to run applications and dev environments.

All the state and output artifacts are stored in a configured S3 bucket.


License

Mozilla Public License 2.0

Project details



Download files

Download the file for your platform.

Source Distribution

dstack-0.0.9rc3.tar.gz (53.1 kB)

Uploaded Source

Built Distribution

dstack-0.0.9rc3-py3-none-any.whl (6.7 MB)

Uploaded Python 3

File details

Details for the file dstack-0.0.9rc3.tar.gz.

File metadata

  • Download URL: dstack-0.0.9rc3.tar.gz
  • Upload date:
  • Size: 53.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.14

File hashes

Hashes for dstack-0.0.9rc3.tar.gz
Algorithm Hash digest
SHA256 ba758815f65f9e3727e87833a5849733fab7fff42df11d0fb9f6b5075f80bad8
MD5 cfbbbcac08149a0cadb4c85240cd3b54
BLAKE2b-256 15bf5ab1cf6048b374fe291ec1374172d0343abec91b253f5c980fbae5a7d7b3

See more details on using hashes here.
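As a sketch, you can check a downloaded file against the published SHA256 digest above with a few lines of Python (the helper name sha256_of is ours, not part of any dstack API):

```python
# Compute a file's SHA256 hex digest for comparison against a
# published digest, reading in chunks to handle large files.
import hashlib


def sha256_of(path, chunk_size=8192):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

For example, sha256_of("dstack-0.0.9rc3.tar.gz") should match the SHA256 value listed above if the download is intact.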

File details

Details for the file dstack-0.0.9rc3-py3-none-any.whl.

File metadata

  • Download URL: dstack-0.0.9rc3-py3-none-any.whl
  • Upload date:
  • Size: 6.7 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.14

File hashes

Hashes for dstack-0.0.9rc3-py3-none-any.whl
Algorithm Hash digest
SHA256 e44a1b4b8e514dbc17c2c3a0e2165d349cceb58dbe4e1c4a339835f15f615ac9
MD5 10d08cf5a58622fd2adeda9c7c96172c
BLAKE2b-256 99a59c4dba10e9c30649d0080e1fec8775b61098851d232e76e4a856dedb0f6f

