
dstack

Easy-to-run & reproducible ML pipelines on any cloud


dstack is an open-source tool that streamlines the process of creating reproducible ML training pipelines that are independent of any specific vendor. It also promotes collaboration on data and models.

Highlighted features

  • Define ML pipelines via YAML and run either locally or on any cloud
  • Have cloud instances created and destroyed automatically
  • Use spot instances efficiently to save costs
  • Save artifacts and reuse them conveniently across workflows
  • Use interactive dev environments, such as notebooks or IDEs
  • Use any frameworks or experiment trackers, with no code changes required
  • No need for Kubernetes or custom Docker images

Installation

Use pip to install the dstack CLI:

pip install dstack --upgrade
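
To verify the installation, you can inspect the installed package with pip:

pip show dstack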

Example

Here's an example from the Quick start.

workflows:
  - name: mnist-data
    provider: bash
    commands:
      - pip install torchvision
      - python mnist/mnist_data.py
    artifacts:
      - path: ./data

  - name: train-mnist
    provider: bash
    deps:
      - workflow: mnist-data
    commands:
      - pip install torchvision pytorch-lightning tensorboard
      - python mnist/train_mnist.py
    artifacts:
      - path: ./lightning_logs

Once workflows are defined this way, dstack can run them locally or in a configured cloud account, and the artifacts of one workflow can be reused by another.
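
The scripts the workflows invoke are ordinary Python files. As a purely hypothetical sketch (not the project's actual script), mnist/mnist_data.py could be as small as the following, downloading MNIST into ./data, the directory that the mnist-data workflow declares as an artifact:

# mnist/mnist_data.py (hypothetical sketch)
# Download MNIST into ./data, which the mnist-data workflow saves as an artifact.
from torchvision.datasets import MNIST

if __name__ == "__main__":
    MNIST(root="./data", train=True, download=True)
    MNIST(root="./data", train=False, download=True)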

Run locally

Use the dstack CLI to run workflows locally:

dstack run mnist-data
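
Because train-mnist lists mnist-data under deps, you can run it next, and dstack will reuse the ./data artifacts produced by the previous run:

dstack run train-mnist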

Configure a remote

To run workflows remotely (e.g. in the cloud) or share artifacts outside your machine, you must configure your remote settings using the dstack config command:

dstack config

This command will ask you to choose an AWS profile (which will be used for AWS credentials), an AWS region (where workflows will be run), and an S3 bucket (to store remote artifacts and metadata).

AWS profile: default
AWS region: eu-west-1
S3 bucket: dstack-142421590066-eu-west-1
EC2 subnet: none

For more details on how to configure a remote, check the installation guide.

Run remotely

Once a remote is configured, use the --remote flag with the dstack run command to run the workflow in the configured cloud:

dstack run mnist-data --remote

You can configure the resources required to run a workflow either via the resources property in YAML or via the dstack run command's arguments, such as --gpu, --gpu-name, etc.:

dstack run train-mnist --remote --gpu 1
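
For reference, here is a sketch of how the same requirement might be expressed via the resources property in YAML. The field layout below is an assumption that mirrors the --gpu argument; consult the docs for the authoritative schema:

  - name: train-mnist
    provider: bash
    deps:
      - workflow: mnist-data
    commands:
      - pip install torchvision pytorch-lightning tensorboard
      - python mnist/train_mnist.py
    resources:
      gpu: 1  # assumed field, mirroring the --gpu 1 CLI argument
    artifacts:
      - path: ./lightning_logs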

When you run a workflow remotely, dstack automatically creates resources in the configured cloud, and releases them once the workflow is finished.

More information

For additional information and examples, check the Docs, the Quick start guide, and the Basics tutorials.

License

Mozilla Public License 2.0
