Stageflow

A DAG-based pipeline orchestration framework for building observable, composable stage pipelines in Python.

Features

  • DAG Execution: Stages execute as soon as dependencies resolve, maximizing parallelism
  • Fluent Pipeline Builder: Type-safe, composable pipeline definitions
  • Interceptor Framework: Middleware pattern for cross-cutting concerns (timeouts, circuit breakers, tracing, metrics)
  • Observable by Design: Structured events, correlation IDs, and logging built-in
  • Protocol-Based Extension: Clean abstractions for persistence, configuration, and events
  • Async-First: Built on asyncio for high-performance concurrent execution
  • Zero Dependencies: Core library has no external dependencies

Installation

Latest release: v1.2.0

pip install stageflow-core

For development:

pip install stageflow-core[dev]

Quick Start

import asyncio
from stageflow.api import (
    Pipeline,
    StageContext,
    StageKind,
    stage,
    stage_metadata,
)

# Define a stage
@stage_metadata(name="greet", kind=StageKind.TRANSFORM)
class GreetStage:
    async def execute(self, ctx: StageContext) -> dict[str, str]:
        name = ctx.snapshot.input_text or "World"
        return {"greeting": f"Hello, {name}!"}

# Define another stage that depends on the first
@stage_metadata(name="shout", kind=StageKind.TRANSFORM)
class ShoutStage:
    async def execute(self, ctx: StageContext) -> dict[str, str]:
        # Access output from dependency via StageInputs
        greeting = ctx.inputs.get_from("greet", "greeting", default="Hello!")
        return {"shouted": greeting.upper()}

# Build the pipeline
pipeline = Pipeline.from_stages(
    stage("greet", GreetStage),
    stage("shout", ShoutStage, after="greet"),
)

# Use `after=` for simple chains like this. Switch to `dependencies=` when a
# stage waits on multiple upstream stages or when you want the full DAG edges
# spelled out explicitly.

# Execute
async def main():
    results = await pipeline.run(
        input_text="World",
        topology="readme_quickstart",
        execution_mode="default",
    )
    print(results.data("shout")["shouted"])  # "HELLO, WORLD!"

asyncio.run(main())

Core Concepts

For the smallest practical import surface, start with stageflow.api. Reach for stageflow.advanced or the root package only when you need runtime internals.

Stages

A Stage is a unit of work with a defined input/output contract:

class Stage(Protocol):
    name: str
    kind: StageKind

    async def execute(self, ctx: StageContext) -> StageOutput: ...

Stage kinds categorize behavior:

  • TRANSFORM - Change input form (STT, TTS, LLM)
  • ENRICH - Add context (profile, memory, skills)
  • ROUTE - Select execution path
  • GUARD - Validate (guardrails, policy); see the sketch after this list
  • WORK - Side effects (persist, assess)
  • AGENT - Main interaction logic
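
As a rough illustration of the GUARD kind, the sketch below reuses only the decorator, context, and return shape from the Quick Start; the stage name, class, and the word list it checks are made up for the example.

@stage_metadata(name="policy_guard", kind=StageKind.GUARD)
class PolicyGuardStage:
    async def execute(self, ctx: StageContext) -> dict[str, bool]:
        # Inspect the run input and report whether it passes the policy check
        text = (ctx.snapshot.input_text or "").lower()
        blocked = any(word in text for word in ("forbidden", "secret"))
        return {"allowed": not blocked}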

Pipelines

Pipelines compose stages into a DAG using a fluent builder:

pipeline = (
    Pipeline()
    .with_stage("stt", SttStage, StageKind.TRANSFORM)
    .with_stage("enrich", EnrichStage, StageKind.ENRICH, dependencies=("stt",))
    .with_stage("llm", LlmStage, StageKind.TRANSFORM, dependencies=("enrich",))
    .with_stage("tts", TtsStage, StageKind.TRANSFORM, dependencies=("llm",))
)
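
Because execution is DAG-driven, stages that share the same dependencies run concurrently. The following fan-out/fan-in sketch uses only the with_stage signature shown above; ProfileStage and MemoryStage are hypothetical stage classes used for illustration.

pipeline = (
    Pipeline()
    .with_stage("stt", SttStage, StageKind.TRANSFORM)
    # Both enrichment stages depend only on "stt", so they can run in parallel
    .with_stage("profile", ProfileStage, StageKind.ENRICH, dependencies=("stt",))
    .with_stage("memory", MemoryStage, StageKind.ENRICH, dependencies=("stt",))
    # "llm" waits for both branches to complete (fan-in)
    .with_stage("llm", LlmStage, StageKind.TRANSFORM, dependencies=("profile", "memory"))
)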

Pipelines can be composed:

core_pipeline = Pipeline().with_stage(...)
voice_pipeline = core_pipeline.compose(
    Pipeline().with_stage("tts", TtsStage, StageKind.TRANSFORM, dependencies=("llm",))
)

Interceptors

Interceptors wrap stage execution for cross-cutting concerns:

from stageflow.interceptors import BaseInterceptor, InterceptorResult

class AuthInterceptor(BaseInterceptor):
    name = "auth"
    priority = 5  # Lower = runs first

    async def before(self, stage_name: str, ctx: PipelineContext) -> InterceptorResult | None:
        if not ctx.data.get("authenticated"):
            return InterceptorResult(stage_ran=False, error="Not authenticated")
        return None

    async def after(self, stage_name: str, result: StageResult, ctx: PipelineContext) -> None:
        pass
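
The before/after pair can also carry state across a stage run. The timing sketch below assumes only the hook signatures used by AuthInterceptor above and that ctx.data is a mutable dict; the priority value and the print call are placeholders, not library behavior.

import time

class TimingInterceptor(BaseInterceptor):
    name = "timing"
    priority = 100  # Lower runs first, so this wraps after higher-priority interceptors

    async def before(self, stage_name: str, ctx: PipelineContext) -> InterceptorResult | None:
        # Stash the start time on the shared context data (assumed mutable)
        ctx.data[f"_start_{stage_name}"] = time.monotonic()
        return None

    async def after(self, stage_name: str, result: StageResult, ctx: PipelineContext) -> None:
        started = ctx.data.pop(f"_start_{stage_name}", None)
        if started is not None:
            print(f"{stage_name} took {time.monotonic() - started:.3f}s")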

Built-in interceptors:

  • TimeoutInterceptor - Per-stage timeouts
  • CircuitBreakerInterceptor - Failure isolation
  • TracingInterceptor - OpenTelemetry spans
  • MetricsInterceptor - Stage duration/success metrics
  • LoggingInterceptor - Structured JSON logging

Event Sinks

EventSink is a protocol for event persistence:

from stageflow import EventSink

class MyEventSink(EventSink):
    async def emit(self, *, type: str, data: dict | None) -> None:
        # Persist to your storage
        await db.insert("events", {"type": type, "data": data})

    def try_emit(self, *, type: str, data: dict | None) -> None:
        # Fire-and-forget variant; in real code, keep a reference to the task
        # so it is not garbage-collected before it runs
        asyncio.create_task(self.emit(type=type, data=data))
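
For tests or local experiments, a sink can simply collect events in memory. This minimal sketch implements the same two methods as the protocol above; the class name is illustrative.

class InMemoryEventSink(EventSink):
    def __init__(self) -> None:
        self.events: list[dict] = []

    async def emit(self, *, type: str, data: dict | None) -> None:
        self.events.append({"type": type, "data": data})

    def try_emit(self, *, type: str, data: dict | None) -> None:
        # Nothing awaits here, so the fire-and-forget variant can append directly
        self.events.append({"type": type, "data": data})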

Architecture

Stageflow follows SOLID principles with a clear separation:

┌─────────────────────────────────────────────────────────────┐
│                        Your Application                     │
├─────────────────────────────────────────────────────────────┤
│  Adapters (implement protocols)                             │
│  - DatabaseEventSink                                        │
│  - PostgresRunStore                                         │
│  - EnvConfigProvider                                        │
├─────────────────────────────────────────────────────────────┤
│                     stageflow (core)                        │
│  ┌─────────┐  ┌─────────┐  ┌─────────────┐  ┌─────────┐     │
│  │ Pipeline│  │  Graph  │  │ Interceptors│  │ Events  │     │
│  └─────────┘  └─────────┘  └─────────────┘  └─────────┘     │
│  ┌─────────────────────────────────────────────────────┐    │
│  │                    Ports (protocols)                │    │
│  │  EventSink | RunStore | ConfigProvider              │    │
│  └─────────────────────────────────────────────────────┘    │
└─────────────────────────────────────────────────────────────┘

Event Taxonomy

Stageflow emits structured events for observability:

Event Type              When
pipeline.created        Pipeline run initialized
pipeline.started        Execution begins
pipeline.completed      All stages finished
pipeline.failed         Unrecoverable error
pipeline.cancelled      Graceful termination
stage.{name}.started    Stage execution begins
stage.{name}.completed  Stage finished successfully
stage.{name}.failed     Stage raised an error
stage.{name}.skipped    Conditional stage skipped
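
Because event types follow this naming scheme, a sink can dispatch on the type string alone. A small sketch, reusing the EventSink shape from the Event Sinks section; the logging target and class name are illustrative.

class StageEventLogger(EventSink):
    async def emit(self, *, type: str, data: dict | None) -> None:
        # Forward only per-stage events; ignore pipeline lifecycle events
        if type.startswith("stage."):
            print(f"[{type}] {data}")

    def try_emit(self, *, type: str, data: dict | None) -> None:
        if type.startswith("stage."):
            print(f"[{type}] {data}")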

License

MIT

Contributing

Contributions welcome! Please read the contributing guide first.
