
Standardizing models

Project description

bollhav

Model definition framework for data pipeline targets.

Implementations

Postgres
Parquet


Installation

pip install bollhav

Model

from bollhav import Model, ModelType, WriteMode, Database, PostgresColumn, PostgresType

model = Model(
    name="orders",
    source_entity="raw.orders",
    table="orders",
    schema="public",
    database=Database.POSTGRES,
    columns=[
        PostgresColumn(name="id", data_type=PostgresType.BIGINT, primary_key=True, nullable=False, order=0),
        PostgresColumn(name="created_at", data_type=PostgresType.TIMESTAMPTZ, nullable=False, order=1),
        PostgresColumn(name="email", data_type=PostgresType.TEXT, nullable=True, order=2, sensitive=True),
    ],
    write_mode=WriteMode.APPEND,
    cron="0 3 * * *",
    partitioned_by=["created_at"],
)

Parameters

name (str, required)
    Unique identifier for the model.
source_entity (str, required)
    Source table or view to read from.
table (str, default "")
    Destination table name.
schema (str, default "")
    Destination schema name.
database (Database, default None)
    Target database. Required if columns is set.
columns (list[PostgresColumn | ParquetColumn], default None)
    Column definitions. Required if database is set.
model_type (ModelType, default TABLE)
    TABLE or VIEW.
write_mode (WriteMode, default APPEND)
    How data is written. WriteMode.VIEW requires ModelType.VIEW.
tags (list[str], default None)
    Labels for filtering.
cron (str, default None)
    Cron expression. batch_size is inferred from it automatically.
enabled (bool, default True)
    Whether the model is active.
debug (bool, default False)
    Enables debug mode.
description (str, default None)
    Human-readable description.
source_dsn (str, default None)
    DSN for the source connection.
source_query (str, default None)
    Optional query to use instead of source_entity.
partitioned_by (list[str], default None)
    Column names to partition by. Must exist in columns.
**kwargs
    Extra metadata. Callable values are resolved using the non-callable kwargs as arguments.
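
The database/columns pairing above (each required whenever the other is set) can be sketched as a small check in plain Python. This is an illustrative sketch of the documented rule, not bollhav's actual validation code; validate_target is a hypothetical helper.

```python
# Sketch of the mutual-requirement rule: database and columns must be
# provided together or omitted together (hypothetical helper).

def validate_target(database, columns):
    """Raise if exactly one of database/columns is provided."""
    if (database is None) != (columns is None):
        missing = "columns" if columns is None else "database"
        raise ValueError(f"'{missing}' is required when the other is set")

validate_target(None, None)                # neither set: OK
validate_target("postgres", [{"name": "id"}])  # both set: OK
```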

Computed attributes

batch_size
    Inferred from cron if set; otherwise None.
sensitive
    True if any column has sensitive=True.
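
The sensitive attribute is a simple aggregate over the columns. A plain-Python sketch of that rule, using a hypothetical Column stand-in rather than bollhav's real column classes (which carry more fields such as data_type, nullable, and order):

```python
from dataclasses import dataclass

# Hypothetical stand-in for a column definition.
@dataclass
class Column:
    name: str
    sensitive: bool = False

def model_is_sensitive(columns):
    """A model is sensitive if any of its columns is marked sensitive."""
    return any(col.sensitive for col in columns)

cols = [Column("id"), Column("email", sensitive=True)]
print(model_is_sensitive(cols))  # True
```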

Databases

from bollhav import Database

Database.POSTGRES
Database.PARQUET

Write modes

from bollhav import WriteMode

WriteMode.APPEND
WriteMode.VIEW     # Must be used with ModelType.VIEW
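
The WriteMode.VIEW / ModelType.VIEW pairing is a hard constraint. A sketch of that check using stand-in enums that mirror the documented names (not bollhav's own classes):

```python
from enum import Enum

# Stand-in enums mirroring the names documented above.
class ModelType(Enum):
    TABLE = "table"
    VIEW = "view"

class WriteMode(Enum):
    APPEND = "append"
    VIEW = "view"

def check_write_mode(model_type: ModelType, write_mode: WriteMode) -> None:
    """WriteMode.VIEW is only valid together with ModelType.VIEW."""
    if write_mode is WriteMode.VIEW and model_type is not ModelType.VIEW:
        raise ValueError("WriteMode.VIEW requires ModelType.VIEW")

check_write_mode(ModelType.TABLE, WriteMode.APPEND)  # OK
check_write_mode(ModelType.VIEW, WriteMode.VIEW)     # OK
```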

Extra kwargs

Non-reserved keyword arguments are stored in model.extra. Callable values are resolved at init time using the non-callable kwargs as arguments.

model = Model(
    name="orders",
    source_entity="raw.orders",
    static="production",
    env=lambda static: f"env={static}",
)

model.extra  # {"static": "production", "env": "env=production"}
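
The resolution rule above can be sketched in plain Python: split the kwargs into callables and plain values, then call each callable with the plain values its signature names. resolve_extra is a hypothetical helper illustrating the behavior, not bollhav's internals.

```python
import inspect

def resolve_extra(**kwargs):
    """Resolve callable kwargs using the non-callable kwargs as arguments."""
    static = {k: v for k, v in kwargs.items() if not callable(v)}
    resolved = dict(static)
    for key, value in kwargs.items():
        if callable(value):
            # Pass only the static kwargs that the callable's signature asks for.
            params = inspect.signature(value).parameters
            resolved[key] = value(**{p: static[p] for p in params if p in static})
    return resolved

extra = resolve_extra(static="production", env=lambda static: f"env={static}")
print(extra)  # {'static': 'production', 'env': 'env=production'}
```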

Project details


Release history

This version

1.2.1

Download files

Download the file for your platform.

Source Distribution

bollhav-1.2.1.tar.gz (9.4 kB)

Uploaded Source

Built Distribution


bollhav-1.2.1-py3-none-any.whl (8.4 kB)

Uploaded Python 3

File details

Details for the file bollhav-1.2.1.tar.gz.

File metadata

  • Download URL: bollhav-1.2.1.tar.gz
  • Upload date:
  • Size: 9.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

Hashes for bollhav-1.2.1.tar.gz
Algorithm Hash digest
SHA256 ce06c782a3146431bc1a5a695284e7800da5cbac448ce7aa90f38765cd4d0e44
MD5 988dde6ab1362347c3da722ed8f11f93
BLAKE2b-256 97b928a1859ca4fd7e1abbe286f7155bd8ea108fa66e67b91b1ffc0017021309


File details

Details for the file bollhav-1.2.1-py3-none-any.whl.

File metadata

  • Download URL: bollhav-1.2.1-py3-none-any.whl
  • Upload date:
  • Size: 8.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

Hashes for bollhav-1.2.1-py3-none-any.whl
Algorithm Hash digest
SHA256 89e2254b7410323ef71f6f94d4f68de0239db75b1867b9da4c4102f0beb41d49
MD5 e749cfe7ae3b38463ad5721ae16872da
BLAKE2b-256 b1ec286887e44a61f8e44ef320e8529d5337a7baea64df497b090c814faa9402

