Python SDK for Cascade workflow orchestration

Cascade SDK

The Cascade SDK is a standards-first Python library for workflow orchestration. It acts as a "Babel for Workflows," allowing you to define logic in Python and execute it across diverse orchestrators while maintaining strict compliance with industry standards like W3C PROV and CloudEvents.

Quick Start

Installation

pip install noirstack-cascade-sdk

1. Define Your Flow (Capture Mode)

Cascade uses Capture Mode to record workflow structures without executing them locally. Task calls are intercepted to build a serializable DAG.

In 0.1.x, data exchanged between tasks should be JSON-serializable primitives and containers (str, int, float, bool, dict, list, None). Avoid passing live runtime objects such as DB connections, open file handles, or custom class instances across task boundaries.

from cascade_sdk import task, flow

@task
def extract_metadata(file_path: str):
    return {"status": "processing", "path": file_path}

@flow
def ingestion_flow(path: str):
    return extract_metadata(path)
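
Because task boundaries in 0.1.x only accept JSON-serializable values (see the note above), it can be useful to sanity-check payloads before registering a flow. The helper below is our own illustration built on the standard library, not part of the SDK:

```python
import json

def is_json_safe(value) -> bool:
    """Return True if `value` can be serialized to JSON.

    This mirrors the 0.1.x task-boundary contract: only str, int, float,
    bool, dict, list, and None are safe to pass between tasks.
    """
    try:
        json.dumps(value)
        return True
    except (TypeError, ValueError):
        return False

# Primitives and containers are fine...
assert is_json_safe({"status": "processing", "path": "/data/source.csv"})
# ...but live runtime objects are not.
assert not is_json_safe(open)       # a function object
assert not is_json_safe({1, 2, 3})  # a set is not JSON-serializable
```

Running a check like this before `client.register_flow(...)` surfaces serialization problems locally instead of at trigger time.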

2. Register and Execute

from cascade_sdk import CascadeClient, wait_for_completion
from cascade_sdk.compiler import build_dag_from_flow

# Compile to a deterministic DAG
dag = build_dag_from_flow(ingestion_flow)

# Initialize the thin client
client = CascadeClient(base_url="http://localhost:3000", api_key="your_key")

# Register and trigger
flow_id = client.register_flow("data_pipeline", dag)
run_id = client.trigger_flow(flow_id, {"path": "/data/source.csv"})

# Wait for result
result = wait_for_completion(client, run_id)
print(f"Workflow Result: {result['result']}")
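
Under the hood, `wait_for_completion` presumably polls the run status until it reaches a terminal state. The loop below sketches that pattern against a stubbed client; the status names, the `get_run` method, and the stub itself are our assumptions for illustration, not the SDK's actual implementation:

```python
import time

# Hypothetical terminal states; the real control plane may use different names.
TERMINAL = {"succeeded", "failed", "cancelled"}

def poll_until_done(client, run_id, interval=1.0, timeout=60.0):
    """Poll `client.get_run(run_id)` until the run reaches a terminal state."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        run = client.get_run(run_id)
        if run["status"] in TERMINAL:
            return run
        time.sleep(interval)
    raise TimeoutError(f"run {run_id} did not finish within {timeout}s")

class StubClient:
    """Reports 'running' twice, then 'succeeded' -- for illustration only."""
    def __init__(self):
        self._calls = 0
    def get_run(self, run_id):
        self._calls += 1
        status = "succeeded" if self._calls >= 3 else "running"
        return {"run_id": run_id, "status": status, "result": {"ok": True}}

result = poll_until_done(StubClient(), "run-123", interval=0.01)
print(result["status"])  # → succeeded
```

A timeout like the one above is worth wrapping around any blocking wait in production code, since 0.1.x polling is synchronous (see the considerations below).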

Standards and Compliance

Cascade is designed for regulated environments (financial, healthcare, federal) where auditability is non-negotiable.

  • W3C PROV: Generate structured lineage describing agents, activities, and entities.
  • CloudEvents: Interoperable event envelopes for system-wide triggers.
  • NIST SP 800-204: Security guidance profile for microservice boundaries.
  • OpenTelemetry: Native distributed tracing context propagation.
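
For context, a minimal CloudEvents 1.0 envelope for a flow trigger might look like the following. The `source` and `type` values are invented for illustration; the SDK's actual event schema may differ:

```python
import json
import uuid
from datetime import datetime, timezone

# Required CloudEvents 1.0 context attributes: specversion, id, source, type.
event = {
    "specversion": "1.0",
    "id": str(uuid.uuid4()),
    "source": "/cascade/flows/data_pipeline",     # illustrative source URI
    "type": "com.example.cascade.run.triggered",  # illustrative event type
    "time": datetime.now(timezone.utc).isoformat(),
    "datacontenttype": "application/json",
    "data": {"path": "/data/source.csv"},
}

print(json.dumps(event, indent=2))
```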

Human-in-the-Loop (HITL) Provenance

Capture manual interventions with the same rigor as automated tasks:

from cascade_sdk import build_prov_bundle

def log_approval(manager_email, run_id):
    # Generates a W3C-compliant audit trail for a manual decision
    return build_prov_bundle(
        agent={f"agent:{manager_email}": {"prov:type": "prov:Person"}},
        activity={"activity:approval": {"prov:type": "cascade:human_intervention"}},
        wasAssociatedWith={"activity:approval": f"agent:{manager_email}"}
    )

Note: build_prov_bundle() emits a minimal PROV-JSON document with the standard prov prefix. Keep prefixed keys consistent (for example prov:type) when adding domain-specific attributes.
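
Concretely, a minimal PROV-JSON document of the shape described above can be written as a plain dictionary. The `cascade` namespace URI and the manager email below are placeholders of ours, not values emitted by the SDK:

```python
import json

# A hand-built PROV-JSON document matching the approval bundle above.
# Keys inside each record keep the standard `prov:` prefix consistently.
prov_doc = {
    "prefix": {
        "prov": "http://www.w3.org/ns/prov#",
        "cascade": "https://example.com/cascade#",  # illustrative namespace
    },
    "agent": {
        "agent:manager@example.com": {"prov:type": "prov:Person"},
    },
    "activity": {
        "activity:approval": {"prov:type": "cascade:human_intervention"},
    },
    "wasAssociatedWith": {
        "_:assoc1": {
            "prov:activity": "activity:approval",
            "prov:agent": "agent:manager@example.com",
        },
    },
}

print(json.dumps(prov_doc, indent=2))
```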

Ecosystem Adapters

Migrate legacy workloads to Cascade without rewriting your logic. 0.1.0 supports:

  • Airflow: airflow_dag_to_dag(dag)
  • Argo: argo_workflow_to_dag(dict)
  • BPMN 2.0: bpmn_xml_to_dag(xml_str)
  • Others: Support for Kestra, Dagster, Mage, and more.
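
To illustrate what these adapters produce conceptually, here is a toy conversion from a task/dependency mapping into a generic node-and-edge DAG dict. This is our own sketch of the idea, not the SDK's actual adapter code or output format:

```python
def toy_dag_from_tasks(tasks):
    """Convert {task_id: [upstream_ids]} into a generic nodes/edges DAG dict."""
    nodes = [{"id": task_id} for task_id in tasks]
    edges = [
        {"from": upstream, "to": task_id}
        for task_id, upstreams in tasks.items()
        for upstream in upstreams
    ]
    return {"nodes": nodes, "edges": edges}

# A small extract -> transform -> load shape.
dag = toy_dag_from_tasks({
    "extract": [],
    "transform": ["extract"],
    "load": ["transform"],
})
print(dag["edges"])
# → [{'from': 'extract', 'to': 'transform'}, {'from': 'transform', 'to': 'load'}]
```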

0.1.0 Considerations (Beta Status)

This is an early public release; please note the following:

  • Synchronous Polling: wait_for_completion(...) is blocking. Async-native polling helpers are planned for 0.2.0.

  • Serialization: All data passed between tasks must be JSON-serializable.

  • Thin Client: The SDK contains zero orchestration logic (no retries/caching); these are handled by the Cascade control plane.

  • Optional Extras: Install specific adapters using extras: pip install "noirstack-cascade-sdk[airflow,standards]".

  • HITL Resume Path: 0.1.x does not yet expose a first-class submit_task_output(...) helper. Resume/approval handoff is currently control-plane API specific.

Contributing

We are actively seeking feedback on the DAG Compiler and standards integrations.

Want to influence 0.2.0? If you encounter compiler edge cases or unclear DAG build errors on valid Python constructs, open an issue with a minimal code snippet and expected DAG behavior.

Created by Noir Stack LLC.
