
A library for model training in multi-cloud environments.


cascade

Cascade is a library for submitting and managing jobs across multiple cloud environments. It is designed to integrate seamlessly into existing Prefect workflows or can be used as a standalone library.

Getting Started

Installation

poetry add block-cascade

or

pip install block-cascade

Example Usage

from block_cascade import remote
from block_cascade import GcpEnvironmentConfig, GcpMachineConfig, GcpResource

# Machine shape for the chief node: machine type and machine count
machine_config = GcpMachineConfig("n2-standard-4", 1)
# Environment the job runs in: project, region, service account, image, and network
environment_config = GcpEnvironmentConfig(
    project="example-project",
    region="us-west1",
    service_account="example-project@vertex.iam.gserviceaccount.com",
    image="us.gcr.io/example-project/cascade/cascade-test",
    network="projects/123456789123/global/networks/shared-vpc",
)
gcp_resource = GcpResource(
    chief=machine_config,
    environment=environment_config,
)

@remote(resource=gcp_resource)
def addition(a: int, b: int) -> int:
    return a + b

result = addition(1, 2)
assert result == 3

Configuration

Cascade supports defining different resource requirements via a configuration file named either cascade.yaml or cascade.yml. This file must be located in the working directory where your code runs so that it can be discovered at runtime.

calculate:
  type: GcpResource
  chief:
    type: n1-standard-1
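
A task picks up a named block via the config_name argument to remote, as in the persistent-resource example later in this README. A minimal sketch, assuming the calculate block above (the function itself is illustrative):

from block_cascade import remote

# Resolves the "calculate" block from cascade.yaml in the working directory
@remote(config_name="calculate")
def calculate(a: int, b: int) -> int:
    return a + b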
You can even define a default configuration that can be overridden by specific tasks to eliminate redundant definitions.

default:
  GcpResource:
    environment:
      project: example-project
      service_account: example-project@vertex.iam.gserviceaccount.com
      region: us-central1
    chief:
      type: n1-standard-4
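
A task-specific block then only needs to state what differs from the default. A minimal sketch, assuming per-task blocks are merged over the default (the train block name is illustrative):

train:
  type: GcpResource
  chief:
    type: n1-standard-16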

Authorization

Cascade requires authorization both to submit jobs to GCP or Databricks and to stage pickled code in a cloud storage bucket. For GCP, an authorization token can be obtained via IAM by running the following command:

gcloud auth login --update-adc

No additional configuration is required in your application's code to use this token.

However, to authenticate to Databricks and AWS you will need to provide a token and a secret key, respectively. These can be passed directly to the DatabricksResource object, provided in the configuration file, or set as environment variables.
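
For the environment-variable route, a minimal sketch, assuming cascade reads the standard Databricks and AWS SDK variables (these variable names come from those SDKs and are not confirmed by cascade's documentation):

export DATABRICKS_TOKEN="dapi-example-token"        # Databricks personal access token
export AWS_ACCESS_KEY_ID="AKIA-example"             # AWS access key ID
export AWS_SECRET_ACCESS_KEY="example-secret-key"   # AWS secret access key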

Persistent Resources in GCP

Cascade supports creating persistent resources in GCP. These resources can be reused across multiple tasks and will persist until deleted manually by the user. This can be useful for debugging tasks that involve large images that take a long time to be loaded onto a node or for reserving scarce resources like A100 GPUs.

You can create a persistent resource using the cascade CLI by supplying a cascade.yml with a configuration block that contains a persistent_resource_id field. This field identifies the persistent resource when tasks are submitted to it. It is recommended that you use the configuration file to define both the resource and the tasks that will be submitted to it; this ensures that the resources specified for your tasks are compatible with the shape of the persistent resource.

persistent-resource:
  type: GcpResource
  environment:
    project: example-project
    service_account: example-project@example-project.iam.gserviceaccount.com
    region: us-west1
    image: us.gcr.io/example-project/cascade/block-cascade
  chief:
    type: n1-standard-4
  persistent_resource_id: my-persistent-resource

Create the persistent resource:

cascade create-persistent-resource --config persistent-resource

You can then submit cascade tasks to this persistent resource:

from block_cascade import remote


@remote(config_name="persistent-resource", job_name="hello-world")
def test_job():
    print("Hello World")


test_job()

Don't forget to delete the persistent resource when you are done with it:

cascade delete-persistent-resource -i my-persistent-resource

Note: persistent resource IDs cannot be reused. If you delete a persistent resource, you will need to create a new one with a different ID.

For Developers

Using hermit for managing Python

When developing cascade, you can optionally use hermit to manage the Python executable used by cascade. Together with poetry for managing dependencies, this ensures that your development environment matches that of other contributors. Follow hermit's installation instructions, then create a virtualenv with Python 3.9 by running:

. ./bin/activate-hermit

Then, install the dependencies with poetry:

poetry install

