Automate your ML workflows on any cloud
The hassle-free tool for managing ML workflows on any cloud platform.
Docs • Quick start • Playground • Setup • Usage • Examples
What is dstack?
dstack is an open-source tool that automates ML workflows, enabling effective management on any cloud platform.
It empowers your team to prepare data, train, and fine-tune models using their preferred frameworks and dev environments without spending time on engineering and infrastructure.
Install the CLI
Use pip to install dstack:
pip install dstack
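To check that the CLI installed correctly, you can print its version (this is a sketch assuming the usual --version flag; dstack --help lists the available subcommands):
dstack --version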
Configure a remote
By default, workflows run locally. To run workflows remotely (e.g. in a configured cloud account), configure a remote using the dstack config command.
dstack config
? Choose backend. Use arrows to move, type to filter
> [aws]
[gcp]
[hub]
Choose hub if you prefer to manage cloud credentials and settings through a user interface while working in a team. To run remote workflows with local cloud credentials, select aws or gcp.
Define workflows
Define ML workflows, their output artifacts, hardware requirements, and dependencies via YAML.
workflows:
- name: train-mnist
provider: bash
commands:
- pip install torchvision pytorch-lightning tensorboard
- python examples/mnist/train_mnist.py
artifacts:
- path: ./lightning_logs
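The hardware requirements mentioned above can also be declared per workflow in the YAML. Below is a minimal sketch that requests one GPU via a resources section; the exact schema may differ between dstack versions, so treat the keys as an assumption:
workflows:
  - name: train-mnist-gpu
    provider: bash
    commands:
      - pip install torchvision pytorch-lightning tensorboard
      - python examples/mnist/train_mnist.py
    artifacts:
      - path: ./lightning_logs
    resources:
      gpu:
        count: 1   # assumed key; requests a single GPU when the workflow runs remotely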
Run locally
By default, workflows run locally on your machine.
dstack run train-mnist
RUN WORKFLOW SUBMITTED STATUS TAG BACKENDS
penguin-1 train-mnist now Submitted local
Provisioning... It may take up to a minute. ✓
To interrupt, press Ctrl+C.
GPU available: True, used: True
Epoch 1: [00:03<00:00, 280.17it/s, loss=1.35, v_num=0]
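From another terminal, you can list submitted runs and stream the logs of a specific run; a sketch reusing the run name from above:
dstack ps
dstack logs penguin-1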
Run remotely
To run a workflow remotely (e.g. in a configured cloud account), add the --remote flag to the dstack run command.
The necessary hardware resources can be configured either via YAML or through arguments in the dstack run command, such as --gpu and --gpu-name:
dstack run train-mnist --remote --gpu 1
RUN WORKFLOW SUBMITTED STATUS TAG BACKENDS
turtle-1 train-mnist now Submitted aws
Provisioning... It may take up to a minute. ✓
To interrupt, press Ctrl+C.
GPU available: True, used: True
Epoch 1: [00:03<00:00, 280.17it/s, loss=1.35, v_num=0]
Upon running a workflow remotely, dstack automatically creates resources in the configured cloud account and destroys them once the workflow is complete.
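If you need to abort a remote run early, you can stop it from the CLI, which should also tear down its cloud resources; a sketch using the run name from above (assuming the stop subcommand in this version):
dstack stop turtle-1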
Providers
dstack supports multiple providers to set up environments, run scripts, and launch interactive development environments and applications.
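For example, alongside the bash provider used above, a workflow can use the code provider to launch an interactive dev environment instead of running a script; a sketch, assuming the code provider is available in this version (keys may differ):
workflows:
  - name: ide-mnist
    provider: code
    artifacts:
      - path: ./lightning_logs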
Artifacts
dstack allows you to save output artifacts and conveniently reuse them across workflows.
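For example, a downstream workflow can declare the training workflow as a dependency so that its saved artifacts are available before the commands run; a sketch, assuming a deps section with a workflow reference (the evaluation script name is hypothetical):
workflows:
  - name: eval-mnist
    provider: bash
    deps:
      - workflow: train-mnist
    commands:
      - pip install torchvision pytorch-lightning
      - python examples/mnist/eval_mnist.py   # hypothetical evaluation script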
More information
For additional information and examples, see the Docs, Setup, Usage, and Examples links above.
Licence