
An open-source tool for teams to build reproducible ML workflows

Project description

dstack

Reproducible ML workflows

dstack is an open-source tool for running reproducible ML workflows independently of the environment (locally or in the cloud) and collaborating on data and models.


Docs · Quick start · Basics · Slack


dstack is an open-source tool for running reproducible ML workflows independently of the environment, either locally or remotely (e.g. in a configured cloud account). Additionally, dstack facilitates versioning and reuse of artifacts, such as data and models, across teams.

In brief, dstack simplifies setting up ML training pipelines that are independent of any particular vendor and helps teams collaborate on data and models.

How does it work?

  • Define workflows via YAML
  • Run workflows locally via CLI
  • Track and reuse artifacts across workflows
  • Run workflows remotely (in any configured cloud) via CLI
  • Version and share artifacts across teams

Installation

Use pip to install the dstack CLI:

pip install dstack --upgrade
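
If the installation succeeded, the CLI should be on your PATH. A quick check (assuming the conventional --version flag; dstack --help lists the available commands):

dstack --version   # assumed conventional flag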

Example

Here's an example from the Quick start.

workflows:
  # Prepares the MNIST dataset and saves it as the ./data artifact
  - name: mnist-data
    provider: bash
    commands:
      - pip install torchvision
      - python mnist/mnist_data.py
    artifacts:
      - path: ./data

  # Trains a model using the artifacts of the mnist-data workflow
  - name: train-mnist
    provider: bash
    deps:
      - workflow: mnist-data
    commands:
      - pip install torchvision pytorch-lightning tensorboard
      - python mnist/train_mnist.py
    artifacts:
      - path: ./lightning_logs

With workflows defined this way, dstack can run them either locally or in a configured cloud account, and their artifacts can be reused across workflows.

Run locally

Use the dstack CLI to run workflows locally:

dstack run mnist-data
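
Because train-mnist declares mnist-data under deps, running it locally will reuse the ./data artifact produced by the run above:

dstack run train-mnist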

Configure a remote

To run workflows remotely (e.g. in the cloud) or share artifacts outside your machine, you must configure your remote settings using the dstack config command:

dstack config

This command will ask you to choose an AWS profile (which will be used for AWS credentials), an AWS region (where workflows will be run), and an S3 bucket (to store remote artifacts and metadata).

AWS profile: default
AWS region: eu-west-1
S3 bucket: dstack-142421590066-eu-west-1
EC2 subnet: none

For more details on how to configure a remote, check the installation guide.

Run remotely

Once a remote is configured, use the --remote flag with the dstack run command to run the workflow in the configured cloud:

dstack run mnist-data --remote
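
To keep an eye on runs from the terminal, the CLI also offers commands to list runs and stream their logs (a sketch, assuming the ps and logs commands; see the Docs for the exact reference):

dstack ps                # list runs and their statuses
dstack logs <run-name>   # stream the output of a specific run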

You can specify the resources required by a workflow either via the resources property in YAML or via the dstack run command's arguments, such as --gpu, --gpu-name, etc.:

dstack run train-mnist --remote --gpu 1
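
Alternatively, the same requirement can be declared per workflow in YAML via the resources property. Below is a rough sketch, assuming a shorthand gpu field; the exact schema is described in the Basics guide:

  - name: train-mnist
    provider: bash
    deps:
      - workflow: mnist-data
    commands:
      - pip install torchvision pytorch-lightning tensorboard
      - python mnist/train_mnist.py
    artifacts:
      - path: ./lightning_logs
    resources:
      gpu: 1  # assumed shorthand for requesting one GPU; see the Basics guide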

When you run a workflow remotely, dstack automatically creates resources in the configured cloud, and releases them once the workflow is finished.
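
If a remote run needs to be interrupted before it finishes, there is also a stop command (again an assumption; check the Docs for exact usage):

dstack stop <run-name>   # assumed command; stops the run so its cloud resources are released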

More information

For additional information and examples, see the Docs, the Quick start, and the Basics guide, or join the Slack channel.

License

Mozilla Public License 2.0

Project details



Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distribution

dstack-0.1.1-py3-none-any.whl (13.1 MB, Python 3)

File details

Details for the file dstack-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: dstack-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 13.1 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for dstack-0.1.1-py3-none-any.whl:

  • SHA256: 0f48dabb218a54e9214bfde505f4127515249beb509a2f8e5cf2824512b1fd73
  • MD5: 3769e2549a26bd6c8c3529382d19bb3a
  • BLAKE2b-256: 4054b318853a3dc582929887d2c47bc6ff9bdb74fdd6fb88ec28ca77d3bc79d4

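To verify a downloaded wheel against the SHA256 digest above, any standard checksum utility will do, for example:

sha256sum dstack-0.1.1-py3-none-any.whl   # compare the output to the SHA256 value above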
