
dstack

ML workflows as code

The easiest way to run ML workflows on any cloud platform

Slack | Quick start | Docs | Tutorials | Blog


What is dstack?

dstack makes it very easy to define ML workflows and run them on any cloud platform. It provisions infrastructure, manages data, and monitors usage for you.

It's ideal for processing data, training models, running apps, and other ML development tasks.

Installation and setup

To use dstack, install it with pip and start the Hub application.

pip install dstack
dstack start

The dstack start command starts the Hub application and creates a default project for running workflows locally.

If you want to run workflows in the cloud (e.g., AWS or GCP), simply log into the Hub application and create a new project.
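
If you prefer an isolated setup, the same two steps work inside a virtual environment. This is a sketch using standard Python tooling only; nothing here is dstack-specific except the two commands already shown above:

# create and activate a virtual environment (standard Python tooling)
python -m venv .venv
source .venv/bin/activate

# install dstack and start the Hub, exactly as above
pip install dstack
dstack start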

Run your first workflows

Let's define our first ML workflow in .dstack/workflows/hello.yaml:

workflows:
  - name: train-mnist
    provider: bash
    commands:
      - pip install torchvision pytorch-lightning tensorboard
      - python examples/mnist/train_mnist.py
    artifacts:
      - path: ./lightning_logs

The YAML file lets you request hardware resources, run Python, save artifacts, use cache and dependencies, create dev environments, run apps, and more.
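
As an illustrative sketch using only the fields already shown above (name, provider, and commands), a minimal workflow needs just a few lines; the hello workflow here is hypothetical, not part of the tutorial:

workflows:
  - name: hello
    provider: bash
    commands:
      - echo "Hello, dstack!"

Fields such as artifacts and resources (shown above and in the Manage resources section below) are optional additions on top of this.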

Run it

Go ahead and run it:

dstack run train-mnist

RUN        WORKFLOW     SUBMITTED  STATUS     TAG  BACKENDS
penguin-1  train-mnist  now        Submitted       local

Provisioning... It may take up to a minute. ✓

To interrupt, press Ctrl+C.

GPU available: False, used: False

Epoch 1: [00:03<00:00, 280.17it/s, loss=1.35, v_num=0]

The dstack run command runs the workflow using the settings of the project configured in the Hub application.

Create a Hub project

As mentioned above, the default project runs workflows locally. However, you can log into the application and create other projects that allow you to run workflows in the cloud.

If you want the project to use the cloud, you'll need to provide cloud credentials and specify settings such as the artifact storage bucket and the region where the workflows will run.

Once a project is created, copy the CLI command from the project settings and execute it in your terminal.

dstack config --url http://127.0.0.1:3000 \
  --project gcp \
  --token b934d226-e24a-4eab-a284-eb92b353b10f

The dstack config command configures dstack to run workflows using the settings from the corresponding project.

You can configure multiple projects and use them interchangeably by passing the --project argument to the dstack run command. Any project can be set as the default by passing --default to the dstack config command.

Configuring multiple projects can be convenient if you want to run workflows both locally and in the cloud or if you would like to use multiple clouds.
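
For example (a sketch based on the commands above; combining --default with the other flags in one invocation is an assumption, and the URL, project name, and token must be replaced with the values from your own project settings):

# run the same workflow against a specific project
dstack run train-mnist --project gcp

# make that project the default for subsequent runs
dstack config --url http://127.0.0.1:3000 \
  --project gcp \
  --token b934d226-e24a-4eab-a284-eb92b353b10f \
  --default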

Manage resources

Suppose you have configured a project that gives you access to a GPU (e.g., a local backend if you have a GPU on your machine, or an AWS or GCP backend).

Let's update the workflow and add a resources section:

workflows:
  - name: train-mnist
    provider: bash
    commands:
      - pip install torchvision pytorch-lightning tensorboard
      - python examples/mnist/train_mnist.py
    artifacts:
      - path: ./lightning_logs
    resources:
      gpu:
        name: V100
        count: 1

Let's run the workflow:

dstack run train-mnist --project gcp

RUN        WORKFLOW     SUBMITTED  STATUS     TAG  BACKENDS
penguin-1  train-mnist  now        Submitted       gcp

Provisioning... It may take up to a minute. ✓

To interrupt, press Ctrl+C.

GPU available: True, used: True

Epoch 1: [00:03<00:00, 280.17it/s, loss=1.35, v_num=0]

If your project is configured to use the cloud, the Hub application will automatically create the necessary cloud resources to execute the workflow and tear them down once it is finished.
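
As a sketch reusing only the gpu keys shown above (name and count), you could request more GPUs of the same type by raising count; whether a given count is actually available depends on your cloud quota and the instance types offered in your region:

    resources:
      gpu:
        name: V100
        count: 4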

More information

For additional information and examples, see the Docs, Tutorials, and Blog linked above.

License

Mozilla Public License 2.0
