Develop ML faster. Easily and cost-effectively run dev environments, pipelines, and apps on any cloud.
dstack makes it easy for ML engineers to run dev environments, pipelines, and apps cost-effectively on any cloud.
Installation and setup
To use dstack, install it with pip and start the Hub application:
pip install dstack
dstack start
The dstack start command starts the Hub server and creates a default project that runs everything locally.
To enable Hub to run dev environments, pipelines, and apps in your preferred cloud account (AWS, GCP, Azure, etc.), log in to Hub and configure the corresponding project.
Running a dev environment
A dev environment is a virtual machine that includes the environment and an interactive IDE or notebook, set up based on a pre-defined configuration.
Go ahead and define this configuration via YAML (under the .dstack/workflows folder).
workflows:
  - name: code-gpu
    provider: code
    setup:
      - pip install -r dev-environments/requirements.txt
    resources:
      gpu:
        count: 1
The YAML file allows you to configure hardware resources, set up the Python environment, expose ports, configure the cache, and more.
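As a sketch of what those extra options might look like in one file: the ports option is taken from the app example later in this README, while the cache key layout below is an assumption that should be verified against the dstack documentation.

```yaml
workflows:
  - name: code-gpu
    provider: code
    setup:
      - pip install -r dev-environments/requirements.txt
    # expose one port to the local machine (mirrors the app example below)
    ports: 1
    # cache pip downloads between runs; the key layout here is an
    # assumption -- check the dstack docs for the exact syntax
    cache:
      - path: ~/.cache/pip
    resources:
      gpu:
        count: 1
```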
Now, you can start it using the dstack run command:
$ dstack run code-gpu
RUN WORKFLOW SUBMITTED STATUS TAG
shady-1 code-gpu now Submitted
Starting SSH tunnel...
To exit, press Ctrl+C.
Web UI available at http://127.0.0.1:51845/?tkn=4d9cc05958094ed2996b6832f899fda1
If you configure a project to run dev environments in the cloud, dstack will automatically provision the required cloud resources and forward the dev environment's ports to your local machine.
When you stop the dev environment, dstack will automatically clean up the cloud resources.
Running a pipeline
A pipeline is a set of pre-defined configurations that allow you to process data, train or fine-tune models, run batch inference, or perform other tasks.
Go ahead and define such a configuration via YAML (under the .dstack/workflows folder).
workflows:
  - name: train-mnist-gpu
    provider: bash
    commands:
      - pip install -r pipelines/requirements.txt
      - python pipelines/train.py
    artifacts:
      - ./lightning_logs
    resources:
      gpu:
        count: 1
The YAML file allows you to configure hardware resources and output artifacts, set up the Python environment, expose ports, configure the cache, and more.
Now, you can run the pipeline using the dstack run command:
$ dstack run train-mnist-gpu
RUN WORKFLOW SUBMITTED STATUS TAG
shady-1 train-mnist-gpu now Submitted
Provisioning... It may take up to a minute. ✓
GPU available: True, used: True
Epoch 1: [00:03<00:00, 280.17it/s, loss=1.35, v_num=0]
If you configure a project to run pipelines in the cloud, the dstack run command will automatically provision the required cloud resources.
After the pipeline stops or finishes, dstack will save the output artifacts and clean up the cloud resources.
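The pipelines/train.py script referenced above isn't shown in this README; judging by the ./lightning_logs artifact path and the run output, the real script presumably uses PyTorch Lightning. As a dependency-free stand-in that only illustrates the artifact contract (train something, then write outputs under ./lightning_logs for dstack to collect), one might write:

```python
import json
import os

def train(epochs=10, lr=0.01):
    """Toy "training": fit y = 2x with per-sample gradient descent."""
    w = 0.0
    data = [(x, 2.0 * x) for x in range(1, 5)]
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

def main(log_dir="./lightning_logs"):
    os.makedirs(log_dir, exist_ok=True)
    w = train()
    # anything written under ./lightning_logs is collected by dstack
    # as an output artifact, per the workflow configuration above
    with open(os.path.join(log_dir, "metrics.json"), "w") as f:
        json.dump({"weight": w}, f)
    return w

if __name__ == "__main__":
    print(f"learned weight: {main():.3f}")
```

The file name metrics.json and the toy model are purely illustrative; only the output directory matters to dstack.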
Running an app
An app can be either a web application (such as Streamlit, Gradio, etc.) or an API endpoint (like FastAPI, Flask, etc.), set up based on a pre-defined configuration.
Go ahead and define this configuration via YAML (under the .dstack/workflows folder).
workflows:
  - name: fastapi-gpu
    provider: bash
    ports: 1
    commands:
      - pip install -r apps/requirements.txt
      - uvicorn apps.main:app --port $PORT_0 --host 0.0.0.0
    resources:
      gpu:
        count: 1
The configuration allows you to customize hardware resources, set up the Python environment, configure cache, and more.
Now, you can run the app using the dstack run command:
$ dstack run fastapi-gpu
RUN WORKFLOW SUBMITTED STATUS TAG
silly-dodo-1 fastapi-gpu now Submitted
Starting SSH tunnel...
To interrupt, press Ctrl+C.
INFO: Started server process [1]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:63475 (Press CTRL+C to quit)
If you configure a project to run apps in the cloud, dstack will automatically provision the required cloud resources and forward the app's ports to your local machine.
When you stop the app, dstack will automatically clean up the cloud resources.
More information
For additional information and examples, see the dstack documentation.