Launcher for Hypha services

Project description

hypha-launcher

Run Triton server on HPC

Install from PyPI · MIT license

Work In Progress

Features

  • CLI/API for:
    • downloading models from S3 and pulling the Triton server Docker image
    • launching an S3 server
    • launching a Triton server
    • ...
  • Supports different container engines
    • Docker
    • Apptainer
  • Supports different compute environments
    • Local
    • Slurm

Installation

pip install hypha-launcher

CLI Usage

hypha-launcher --help

Launch the BioEngine Worker on HPC

BioEngine is a set of services for serving AI models from bioimage.io. The model test-run feature is accessible from https://bioimage.io, and a dedicated BioEngine web client is available at https://bioimage-io.github.io/bioengine-web-client/. While the public instance is openly accessible for testing and evaluation, you can run your own instance of the BioEngine worker to serve the models, e.g. on your own HPC computing resources.

The worker downloads all models from S3 and launches a Triton server.

To launch on an HPC cluster, set the job command template for your cluster via the HYPHA_HPC_JOB_TEMPLATE environment variable.

For example, to launch the BioEngine worker on a Slurm cluster:

# Please replace the job command with your own settings
export HYPHA_HPC_JOB_TEMPLATE="srun -A Your-Slurm-Account -t 03:00:00 --gpus-per-node A100:1 {cmd}"
python -m hypha_launcher launch_bioengine_worker --hypha-server-url https://ai.imjoy.io --triton-service-id my-triton

In the above example, the job command template uses the Slurm scheduler with the specified account, time limit, and GPU allocation. The {cmd} placeholder will be replaced with the actual command used to launch jobs.
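
Concretely, {cmd} is filled in by hypha-launcher itself; the placeholder below is only a hypothetical stand-in for whatever job command the launcher generates (for example, the command that starts the Triton server):

# Hypothetical expansion of the template above; <cmd-generated-by-hypha-launcher>
# is a placeholder, not a literal command.
srun -A Your-Slurm-Account -t 03:00:00 --gpus-per-node A100:1 <cmd-generated-by-hypha-launcher>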

Optionally, you can set the store path used for the models and the Triton server configuration via the HYPHA_LAUNCHER_STORE_DIR environment variable. By default, the store path is .hypha-launcher.

export HYPHA_LAUNCHER_STORE_DIR=".hypha-launcher"

Download models from S3

python -m hypha_launcher - download_models_from_s3 "bioengine-model-runner.*" --n_parallel=5
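
The pattern is quoted so the shell does not expand the wildcard itself; --n_parallel presumably controls how many downloads run concurrently (five in this example).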

Pull the Docker image of the Triton server

python -m hypha_launcher - pull_image

TODO

  • Download models from S3
  • Pull the Docker image of the Triton server
  • Run the Triton server
  • Register the service on Hypha
  • Communicate with the Triton server
  • Test on HPC
  • Support running on a local machine without a GPU
  • Support launching containers inside a container (to support running inside Podman Desktop)
  • Job management (auto stop and restart)
  • Load balancing
  • Documentation

Development

Install the package in editable mode, along with the development requirements, with the following commands:

pip install -e .
pip install -r requirements-dev.txt
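
If the development requirements include a test runner such as pytest (an assumption; requirements-dev.txt is not shown here), the test suite can then be run from the project root:

pytest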

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hypha_launcher-0.1.2.tar.gz (15.6 kB)

Uploaded Source

Built Distribution

hypha_launcher-0.1.2-py3-none-any.whl (15.1 kB)

Uploaded Python 3

File details

Details for the file hypha_launcher-0.1.2.tar.gz.

File metadata

  • Download URL: hypha_launcher-0.1.2.tar.gz
  • Upload date:
  • Size: 15.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for hypha_launcher-0.1.2.tar.gz
Algorithm Hash digest
SHA256 49efb74d6213b78e8ea1fd1da739ee472a05e7beeb7f2f90c124a1924b9af296
MD5 4c34b66b2475ab73e5cef85abb39c4b0
BLAKE2b-256 fe69646bc5db454b8e4b8ff30d3c10d8bdddc91774ed85321d972f9fff46c97f

File details

Details for the file hypha_launcher-0.1.2-py3-none-any.whl.

File metadata

File hashes

Hashes for hypha_launcher-0.1.2-py3-none-any.whl
Algorithm Hash digest
SHA256 2ce45dbee0b0b049aa7843ec9dec6940009b9ea7e9248e1dec843b8aa2e7e0e2
MD5 b1be57488cb0a514fab7f626fbd438fe
BLAKE2b-256 74fb44229fd9ccbe9e20dd5c4c7e96d5d5b76f0fba779050727f8b6ee4e138ca
