Run dbt Python models locally and materialize results back to Postgres

dbt-pybridge

dbt-pybridge is a dbt adapter that enables Python models in a normal dbt run against Postgres.

It works by:

  • compiling .py models through dbt
  • executing Python locally (developer laptop or CI runner)
  • loading dbt.ref() / dbt.source() data into pandas/polars
  • writing the returned dataframe back into Postgres

Status

MVP scope for Python table + incremental + view materializations is implemented.

  • Supported: materialized='table'
  • Supported: materialized='incremental' (strategies: append, merge, delete+insert)
  • Supported: materialized='view' (implemented as a managed backing table + SQL view)
  • Supported DAG: sql -> python -> sql
  • Supported return types: pandas DataFrame, polars DataFrame, or iterable/generator of dataframes
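
As a sketch of the supported incremental path, a merge-strategy Python model might look like the following (the `stg_orders` relation and its columns are illustrative, not part of this project):

```python
import pandas as pd

def model(dbt, session):
    # Illustrative incremental model using the merge strategy;
    # "stg_orders" and its columns are hypothetical.
    dbt.config(
        materialized="incremental",
        incremental_strategy="merge",
        unique_key="order_date",
    )
    orders = dbt.ref("stg_orders")
    # One row per day, upserted into the target on order_date.
    return orders.groupby("order_date", as_index=False)["amount"].sum()
```

On the second `dbt run`, rows with an existing `order_date` are merged rather than appended.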

Install

pip install -e .

Use a supported Python version (3.11/3.12 recommended).

Profile

Set your profile type to pybridge:

my_profile:
  target: dev
  outputs:
    dev:
      type: pybridge
      host: localhost
      user: postgres
      password: postgres
      port: 5432
      dbname: analytics
      schema: public
      threads: 1
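
The profile fields compose into an ordinary Postgres connection string; a rough sketch of that mapping (illustrative only, not the adapter's actual connection code):

```python
# Sketch: how the profile's connection fields compose into a Postgres DSN.
# The adapter's real connection handling may differ.
profile = {
    "host": "localhost",
    "user": "postgres",
    "password": "postgres",
    "port": 5432,
    "dbname": "analytics",
}

def build_dsn(p: dict) -> str:
    return (
        f"postgresql://{p['user']}:{p['password']}"
        f"@{p['host']}:{p['port']}/{p['dbname']}"
    )
```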

Example model

def model(dbt, session):
    df = dbt.ref("stg_orders")
    df["double_amount"] = df["amount"] * 2
    return df

How to create Python models

  1. Create models/<name>_python.py.
  2. Define exactly one callable entrypoint: def model(dbt, session): ....
  3. Set materialization inside the function:
    • dbt.config(materialized="table")
  4. Read upstream inputs using standalone ref/source assignments (important for dbt parser):
    • orders = dbt.ref("stg_orders")
    • raw_orders = dbt.source("raw", "orders")
  5. Return one of:
    • pandas DataFrame
    • polars DataFrame
    • iterable/generator that yields pandas/polars DataFrames

Parser-safe pattern:

def model(dbt, session):
    dbt.config(materialized="table")
    orders = dbt.ref("stg_orders")
    result = orders.copy()
    result["double_amount"] = result["amount"] * 2
    return result

Chunked mode:

def model(dbt, session):
    for batch in dbt.ref("stg_orders").iter_batches(batch_size=100_000):
        yield transform(batch)
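
A fuller version of the chunked pattern, with the transform inlined and the chunked-mode config set explicitly (batch size and column names are illustrative):

```python
def model(dbt, session):
    # Chunked mode: process the upstream relation in batches so the whole
    # table never has to fit in memory at once. Column names are illustrative.
    dbt.config(materialized="table", localpy_chunked_mode=True)
    for batch in dbt.ref("stg_orders").iter_batches(batch_size=100_000):
        batch["double_amount"] = batch["amount"] * 2
        yield batch
```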

Runtime configs

Set model-level configs via dbt.config(...) in your python model:

  • localpy_dataframe_backend: pandas (default) or polars
  • localpy_max_rows: hard limit before failure (default 1_000_000)
  • localpy_warn_rows: warning threshold (default 200_000)
  • localpy_max_bytes: hard estimated table-size limit before failure (default 536870912, 512MB)
  • localpy_warn_bytes: warning estimated table-size threshold (default 134217728, 128MB)
  • localpy_allow_large_tables: bypass hard row limit (default false)
  • localpy_chunked_mode: allow oversized input only when using iter_batches (default false)
  • localpy_batch_size: default batch size for iter_batches (default 100_000)
  • localpy_column_types: optional explicit type map for created target tables, for example:
    • {"id": "numeric(18,0)", "created_at": "timestamp", "payload": "jsonb"}
  • localpy_categorical_types: optional categorical-column enum type map, for example:
    • {"status": "status_enum", "tier": "tier_enum"}
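
Put together, a model setting several of these configs might look like this sketch (the values are examples, not recommendations):

```python
def model(dbt, session):
    # Illustrative combination of runtime configs; values are examples only.
    dbt.config(
        materialized="table",
        localpy_dataframe_backend="polars",
        localpy_warn_rows=500_000,
        localpy_max_rows=2_000_000,
        localpy_allow_large_tables=False,
        localpy_column_types={"id": "numeric(18,0)", "payload": "jsonb"},
    )
    return dbt.ref("stg_orders")
```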

Type inference details

Default inferred target types now include:

  • Numeric widths:
    • smallint / integer / bigint / numeric (for wide unsigned integers)
    • real / double precision
  • Temporal:
    • date, time, timetz, timestamp, timestamptz, interval
  • Structured / special:
    • uuid, bytea, jsonb
  • Arrays (homogeneous scalar list/tuple object columns):
    • boolean[], bigint[], double precision[], text[], uuid[], date[], time[], timetz[], timestamp[], timestamptz[], numeric[]
    • mixed or nested list structures fall back to jsonb

Notes:

  • Decimal object columns infer numeric(precision,scale) from sampled values.
  • Empty or ambiguous object columns fall back to text (or jsonb for ambiguous list structures).
  • You can always override with localpy_column_types.
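
As an illustration of the Decimal rule, inferring a `numeric(precision,scale)` wide enough for all sampled values could work roughly like this (a sketch, not the adapter's actual implementation):

```python
from decimal import Decimal

def infer_numeric_type(samples):
    # Sketch: derive numeric(precision, scale) wide enough for every sample.
    # Not the adapter's actual code.
    max_scale = 0
    max_int_digits = 1
    for d in samples:
        exp = d.as_tuple().exponent
        scale = max(0, -exp)                          # digits after the point
        int_digits = max(1, len(d.as_tuple().digits) - scale)
        max_scale = max(max_scale, scale)
        max_int_digits = max(max_int_digits, int_digits)
    return f"numeric({max_int_digits + max_scale},{max_scale})"
```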

Honest limitations

  • Not Snowpark
  • Not Spark
  • Python runs on local machine / CI runner
  • Not intended for huge tables
  • Best for small/medium transforms

First milestone command

dbt run -s customer_features

More examples

The examples/mvp_project/ directory has runnable models for each major feature:

  • customer_features.py — minimal pandas table model
  • orders_polars.py — polars backend (localpy_dataframe_backend='polars')
  • daily_revenue_incremental.py — incremental + merge strategy with unique_key
  • orders_with_jsonb.py — localpy_column_types overrides for jsonb, text[], and numeric(18,4)

cd examples/mvp_project
dbt run -s orders_polars
dbt run -s daily_revenue_incremental
dbt run -s daily_revenue_incremental                # second run exercises merge
dbt run -s daily_revenue_incremental --full-refresh # rebuild from scratch
dbt run -s orders_with_jsonb
