GlassFlow ClickHouse ETL Python SDK: Create GlassFlow pipelines between Kafka and ClickHouse

ClickHouse ETL Python SDK


A Python SDK for creating and managing data pipelines between Kafka and ClickHouse.

Features

  • Create and manage data pipelines between Kafka and ClickHouse
  • Deduplication of events during a time window based on a key
  • Temporal joins between topics based on a common key with a given time window
  • Schema validation and configuration management
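Temporal joins are configured declaratively, in the same style as the deduplication block shown in the Quick Start below. The fragment that follows is an illustrative sketch only; the field names (`type`, `sources`, `join_key`, `time_window`, `orientation`) are assumptions, and the authoritative schema is in the GlassFlow docs:

```python
# Hypothetical sketch of a "join" section in a pipeline configuration.
# Field names here are assumptions, not the confirmed schema -- consult
# the GlassFlow docs for the real structure.
join_config = {
    "enabled": True,
    "type": "temporal",
    "sources": [
        # Both sides join on a common key within the configured time window.
        {"source_id": "orders", "join_key": "user_id",
         "time_window": "1h", "orientation": "left"},
        {"source_id": "users", "join_key": "user_id",
         "time_window": "1h", "orientation": "right"},
    ],
}

# Sanity check: both sides use the same join key.
assert all(s["join_key"] == "user_id" for s in join_config["sources"])
```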

Installation

pip install glassflow-clickhouse-etl

Quick Start

from glassflow_clickhouse_etl import Pipeline


pipeline_config = {
  "pipeline_id": "test-pipeline",
  "source": {
    "type": "kafka",
    "provider": "aiven",
    "connection_params": {
      "brokers": ["localhost:9092"],
      "protocol": "SASL_SSL",
      "mechanism": "SCRAM-SHA-256",
      "username": "user",
      "password": "pass"
    },
    "topics": [
      {
        "consumer_group_initial_offset": "earliest",
        "id": "test-topic",
        "name": "test-topic",
        "schema": {
          "type": "json",
          "fields": [
            {"name": "id", "type": "string" },
            {"name": "email", "type": "string"}
          ]
        },
        "deduplication": {
          "id_field": "id",
          "id_field_type": "string",
          "time_window": "1h",
          "enabled": True
        }
      }
    ],
  },
  "sink": {
    "type": "clickhouse",
    "host": "localhost",
    "port": 8443,
    "database": "test",
    "username": "default",
    "password": "pass",
    "table_mapping": [
      {
        "source_id": "test-topic",
        "field_name": "id",
        "column_name": "user_id",
        "column_type": "UUID"
      },
      {
        "source_id": "test-topic",
        "field_name": "email",
        "column_name": "email",
        "column_type": "String"
      }
    ]
  }
}

# Build a Pipeline object from the configuration dict
pipeline = Pipeline(pipeline_config)

# Create the pipeline
pipeline.create()

Pipeline Configuration

For detailed information about the pipeline configuration, see the GlassFlow docs.
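A common pattern is to keep the pipeline configuration in a JSON file rather than inline Python. The sketch below (stdlib only; the file path and validation are illustrative assumptions) writes a minimal configuration to disk, loads it back, and sanity-checks it before it would be handed to `Pipeline(loaded)`:

```python
import json
import tempfile

# A minimal configuration, written to a temp file to stand in for a real
# config file checked into your project.
config = {
    "pipeline_id": "test-pipeline",
    "source": {"type": "kafka", "topics": []},
    "sink": {"type": "clickhouse"},
}

with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(config, f)
    path = f.name

# Load the configuration back from disk.
with open(path) as f:
    loaded = json.load(f)

# Light sanity checks before constructing Pipeline(loaded).
assert {"pipeline_id", "source", "sink"} <= loaded.keys()
```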

Tracking

The SDK includes anonymous usage tracking to help improve the product. Tracking is enabled by default but can be disabled in two ways:

  1. Using an environment variable:

export GF_TRACKING_ENABLED=false

  2. Programmatically, using the disable_tracking method:

pipeline = Pipeline(pipeline_config)
pipeline.disable_tracking()
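The environment variable can also be set in-process; presumably it must be set before the pipeline is created for it to take effect (an assumption, the docs only specify the variable name):

```python
import os

# Disable anonymous usage tracking via the documented environment
# variable, set from within the process itself.
os.environ["GF_TRACKING_ENABLED"] = "false"

assert os.environ["GF_TRACKING_ENABLED"] == "false"
```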

The tracking collects anonymous information about:

  • SDK version
  • Platform (operating system)
  • Python version
  • Pipeline ID
  • Whether joins or deduplication are enabled
  • Kafka security protocol and auth mechanism used, and whether authentication is disabled
  • Errors during pipeline creation and deletion

Development

Setup

  1. Clone the repository
  2. Create a virtual environment
  3. Install dependencies:
uv venv
source .venv/bin/activate
uv pip install -e .[dev]

Testing

pytest


