
improvado-pipeline-sdk

SDK for authoring custom data pipeline workflows on Temporal.

Provides type definitions, helper functions, and constants for building ETL/ELT pipelines that run on Improvado's Temporal infrastructure.

Install

pip install improvado-pipeline-sdk

Usage

Workflow code

import dataclasses
from temporalio import workflow

with workflow.unsafe.imports_passed_through():
    from pipeline_sdk.types import (
        DataRef,
        DateRangeRequest,
        DateRangeType,
    )
    from pipeline_sdk.activities import (
        les_extract,
        ch_load,
        prepare_credentials,
        cleanup_credentials,
        CUSTOM_ACTIVITY_QUEUE,
    )


@dataclasses.dataclass(frozen=True)
class MyPipelineParams:
    connector_id: int


@workflow.defn(sandboxed=False, name="MyPipeline")
class MyPipelineWorkflow:
    @workflow.run
    async def run(self, params: MyPipelineParams) -> dict:
        result = await les_extract(
            connector_id=params.connector_id,
            data_source="facebook",
            report_type="ad_insights",
            date_range=DateRangeRequest(
                date_range_type=DateRangeType.MANUAL,
                params={"date_from": "2026-01-01", "date_to": "2026-03-31"},
            ),
        )
        return {"data_ref": result.data_ref.uri}

Activity code

from temporalio import activity
from pipeline_sdk.runtime import (
    read_pipeline_secret,
    get_current_tenant,
    PipelineState,
)
from pipeline_sdk.types import PipelineCredentials


@activity.defn
async def my_activity(creds: PipelineCredentials) -> dict:
    tenant = get_current_tenant()
    secret = await read_pipeline_secret(creds)
    # PipelineSecret fields: secret.s3, secret.storage, secret.connections
    return {"tenant": tenant.agency_uuid}

Worker / client setup

from pipeline_sdk.tenant import (
    TenantInterceptor,       # worker side
    TenantClientInterceptor, # client side
    TenantID,
)

Type checking and linting

pyright my_workflow.py
ruff check my_workflow.py
ruff format my_workflow.py

Both pyright and ruff are included as dependencies.

Package structure

pipeline_sdk.types
    Pure, Temporal-serializable data types and enums — DataRef, TenantID, Cluster, DateRangeRequest, DateRangeType, PipelineCredentials, PipelineSecret, S3Credentials, StorageCredentials, LesActivityWithS3Result, PipelineLoadResult. Safe to import from either workflow or activity code.

pipeline_sdk.activities
    Predefined workflow-side wrappers — les_extract, ch_load, prepare_credentials, cleanup_credentials — plus CUSTOM_ACTIVITY_QUEUE. Import inside workflow.unsafe.imports_passed_through().

pipeline_sdk.runtime
    Activity-side helpers — read_pipeline_secret, PipelineState, get_current_tenant — plus VAULT_ADDR / VAULT_MOUNT_POINT for Vault config overrides.

pipeline_sdk.tenant
    Tenant propagation infrastructure — TenantInterceptor, TenantClientInterceptor, build_tenant_headers — plus TenantID / get_current_tenant re-exports.
