
airflow-clickhouse-plugin - Airflow plugin to execute ClickHouse commands and queries

Project description

Airflow ClickHouse Plugin

Provides ClickHouseHook and ClickHouseOperator for Apache Airflow based on mymarilyn/clickhouse-driver.

Features

  1. SQL queries are templated.
  2. Can run multiple SQL queries per single ClickHouseOperator.
  3. Result of the last query of ClickHouseOperator instance is pushed to XCom.
  4. Executed queries are logged in a pretty form.
  5. Uses the efficient native ClickHouse TCP protocol thanks to clickhouse-driver. Does not support the HTTP protocol.
  6. Supports extra ClickHouse connection parameters such as various timeouts, compression, secure, etc., through the Airflow Connection.extra property.

Installation and dependencies

pip install -U airflow-clickhouse-plugin

Requires apache-airflow and clickhouse-driver. Primarily supports Airflow 1.10.6 since it is the latest version supported by Google Cloud Composer; later versions are expected to work properly but may not be fully tested.

Usage

See examples below.

ClickHouseOperator Reference

To import ClickHouseOperator use: from airflow.operators.clickhouse_operator import ClickHouseOperator

Supported kwargs:

  • sql: templated query (if the argument is a single str) or queries (if an iterable of str).
  • clickhouse_conn_id: connection id. Connection schema is described below.
  • parameters: passed to clickhouse-driver execute method.
    • If multiple queries are provided via sql then the parameters are passed to all of them.
    • Parameters are not templated.
  • database: if present, overrides database defined by connection.
  • Other kwargs (including the required task_id) are inherited from Airflow BaseOperator.

The result of the last query is pushed to XCom.
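
For illustration only, a minimal instantiation might look like the sketch below; the table names and connection id are made-up placeholders, and a complete DAG example appears in the Examples section:

from airflow.operators.clickhouse_operator import ClickHouseOperator

# Hypothetical task: runs two templated queries; the result of the
# second (last) one is pushed to XCom under this task's id.
aggregate_daily = ClickHouseOperator(
    task_id='aggregate_daily',
    clickhouse_conn_id='clickhouse_test',  # assumed connection id
    database='default',
    sql=(
        "INSERT INTO daily_income "
            "SELECT eventDt, sum(price) AS income FROM sales "
            "WHERE eventDt = '{{ ds }}' GROUP BY eventDt",
        "SELECT sum(income) FROM daily_income",
    ),
)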

ClickHouseHook Reference

To import ClickHouseHook use: from airflow.hooks.clickhouse_hook import ClickHouseHook

Supported kwargs of constructor (__init__ method):

  • clickhouse_conn_id: connection id. Connection schema is described below.
  • database: if present, overrides database defined by connection.

Supports all of the methods of the Airflow BaseHook including:

  • get_records(sql: str, parameters: dict=None): returns the result of the query as a list of tuples. Materializes all the records in memory.
  • get_first(sql: str, parameters: dict=None): returns the first row of the result. Does not load the whole dataset into memory because it uses execute_iter. If the dataset is empty, returns None, following fetchone semantics.
  • run(sql, parameters): runs a single query (if sql is a str) or multiple queries (if an iterable of str). parameters can have any form supported by the execute method of clickhouse-driver.
    • If a single query is run, its result is returned. If multiple queries are run, the result of the last one is returned.
    • If multiple queries are given, parameters are passed to all of them.
    • Materializes all the records in memory (uses plain execute, not execute_iter).
      • To stream results with execute_iter, use it directly via hook.get_conn().execute_iter(…) (see the execute_iter reference and the sketch after this list).
    • Every run call uses a new connection which is closed when finished.
  • get_conn(): returns the underlying clickhouse_driver.Client instance.
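
A minimal sketch of these methods; the connection id, table, and column names are made-up placeholders:

from airflow.hooks.clickhouse_hook import ClickHouseHook

hook = ClickHouseHook(clickhouse_conn_id='clickhouse_test')  # assumed connection id

# All records materialized in memory, as a list of tuples.
rows = hook.get_records('SELECT eventDt, price FROM sales LIMIT 10')

# Only the first row (or None for an empty result); streamed via execute_iter.
latest = hook.get_first('SELECT max(eventDt) FROM sales')

# Single query with clickhouse-driver style parameters; its result is returned.
total = hook.run(
    'SELECT sum(price) FROM sales WHERE eventDt = %(dt)s',
    {'dt': '2021-01-01'},
)

# Streaming a large result: use the underlying clickhouse_driver.Client directly.
for row in hook.get_conn().execute_iter('SELECT * FROM sales'):
    ...  # process one row at a time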

ClickHouse Connection schema

clickhouse_driver.Client is initialized with attributes stored in the Airflow Connection. The mapping of the attributes is listed below:

Airflow Connection attribute    Client.__init__ argument
host                            host
port                            port
schema                          database
login                           user
password                        password

If you pass database argument to ClickHouseOperator or ClickHouseHook explicitly then it is passed to the Client instead of the schema attribute of the Airflow connection.

Extra arguments

You may also pass additional arguments, such as timeouts, compression, secure, etc., through the Connection.extra attribute. The attribute should contain a JSON object which is deserialized, and all of its properties are passed as-is to the Client.

For example, if Airflow connection contains extra={"secure":true} then the Client.__init__ will receive secure=True keyword argument in addition to other non-empty connection attributes.
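
As an illustration, the sketch below defines such a connection programmatically; the connection id, host, credentials, and extra values are made-up, and in practice the same fields are usually set through the Airflow UI or an AIRFLOW_CONN_* environment variable:

from airflow.models import Connection

# Hypothetical connection: host/port/login/password map to the Client
# arguments of the same meaning, schema maps to database, and extra is
# deserialized into additional keyword arguments (here secure=True and
# send_receive_timeout=300).
conn = Connection(
    conn_id='clickhouse_test',
    conn_type='clickhouse',
    host='clickhouse.example.com',
    port=9440,
    schema='default',
    login='reader',
    password='secret',
    extra='{"secure": true, "send_receive_timeout": 300}',
)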

Default values

If the Airflow connection attribute is not set then it is not passed to the Client at all. In that case the default value of the corresponding clickhouse_driver.Connection argument is used (e.g. user defaults to 'default').

This means that Airflow ClickHouse Plugin does not itself define any default values for the ClickHouse connection. You may fully rely on default values of the clickhouse-driver version you use. The only exception is host: if the attribute of Airflow connection is not set then 'localhost' is used.

Examples

ClickHouseOperator

from airflow import DAG
from airflow.operators.clickhouse_operator import ClickHouseOperator
from airflow.operators.python_operator import PythonOperator
from airflow.utils.dates import days_ago

with DAG(
        dag_id='update_income_aggregate',
        start_date=days_ago(2),
) as dag:
    ClickHouseOperator(
        task_id='update_income_aggregate',
        database='default',
        sql=(
            "INSERT INTO aggregate "
                "SELECT eventDt, sum(price * qty) AS income FROM sales "
                "WHERE eventDt = '{{ ds }}' GROUP BY eventDt",
            "OPTIMIZE TABLE aggregate ON CLUSTER {{ var.value.cluster_name }} "
                "PARTITION toDate('{{ execution_date.format('%Y-%m-01') }}')",
            "SELECT sum(income) FROM aggregate "
                "WHERE eventDt BETWEEN "
                "'{{ execution_date.start_of('month').to_date_string() }}' "
                "AND '{{ execution_date.end_of('month').to_date_string() }}'",
            # result of the last query is pushed to XCom
        ),
        clickhouse_conn_id='clickhouse_test',
    ) >> PythonOperator(
        task_id='print_month_income',
        provide_context=True,
        python_callable=lambda task_instance, **_:
            # pulling XCom value and printing it
            print(task_instance.xcom_pull(task_ids='update_income_aggregate')),
    )

ClickHouseHook

from airflow import DAG
from airflow.hooks.clickhouse_hook import ClickHouseHook
from airflow.hooks.mysql_hook import MySqlHook
from airflow.operators.python_operator import PythonOperator
from airflow.utils.dates import days_ago


def mysql_to_clickhouse():
    mysql_hook = MySqlHook()
    ch_hook = ClickHouseHook()
    records = mysql_hook.get_records('SELECT * FROM some_mysql_table')
    ch_hook.run('INSERT INTO some_ch_table VALUES', records)


with DAG(
        dag_id='mysql_to_clickhouse',
        start_date=days_ago(2),
) as dag:
    PythonOperator(
        task_id='mysql_to_clickhouse',
        python_callable=mysql_to_clickhouse,
    )

Important note: don't try to insert values using the literal form ch_hook.run('INSERT INTO some_ch_table VALUES (1)'). clickhouse-driver requires values for an INSERT query to be provided via parameters due to specifics of the native ClickHouse protocol.
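
A hedged sketch of the difference (table and columns are made-up):

# Correct: the query text ends at VALUES; rows travel as parameters.
ch_hook.run(
    'INSERT INTO some_ch_table (id, name) VALUES',
    [(1, 'alice'), (2, 'bob')],
)

# Not supported by clickhouse-driver over the native protocol:
# ch_hook.run("INSERT INTO some_ch_table (id, name) VALUES (1, 'alice')")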

Default connection

By default the hook and operator use clickhouse_conn_id='clickhouse_default'.

How to run tests

Unit tests

From the root project directory: python -m unittest discover -s tests/unit

Integration tests

Integration tests require access to a ClickHouse server. Tests use the connection URI defined via the environment variable AIRFLOW_CONN_CLICKHOUSE_DEFAULT, with clickhouse://localhost as the default.

Run from the project root: python -m unittest discover -s tests/integration
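
For example, to point the tests at a non-default server (host and credentials below are placeholders): AIRFLOW_CONN_CLICKHOUSE_DEFAULT=clickhouse://user:password@some-host python -m unittest discover -s tests/integration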

All tests

From the root project directory: python -m unittest discover -s tests

How to upload to PyPI

See https://packaging.python.org/tutorials/packaging-projects/#uploading-your-project-to-pypi for details.

python3 setup.py sdist bdist_wheel
twine upload dist/*

Test PyPI: twine upload --repository testpypi dist/* (username __token__, password: a PyPI API token).

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

airflow-clickhouse-plugin-0.5.7.tar.gz (14.1 kB)

Uploaded Source

Built Distribution

airflow_clickhouse_plugin-0.5.7-py3-none-any.whl (16.9 kB)

Uploaded Python 3

File details

Details for the file airflow-clickhouse-plugin-0.5.7.tar.gz.

File metadata

  • Download URL: airflow-clickhouse-plugin-0.5.7.tar.gz
  • Upload date:
  • Size: 14.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.44.0 CPython/3.7.7

File hashes

Hashes for airflow-clickhouse-plugin-0.5.7.tar.gz
Algorithm Hash digest
SHA256 3cab754294cf942d04b8e7a253d61c2df0a065641b1e1ab88698b7c628fe27a7
MD5 580b9e1a05a9115b03a46432bea96176
BLAKE2b-256 336b457ba59a0586d3ace18c9314dd76dae84bd883e2a36c31ac4f4f92894454


File details

Details for the file airflow_clickhouse_plugin-0.5.7-py3-none-any.whl.

File metadata

  • Download URL: airflow_clickhouse_plugin-0.5.7-py3-none-any.whl
  • Upload date:
  • Size: 16.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.44.0 CPython/3.7.7

File hashes

Hashes for airflow_clickhouse_plugin-0.5.7-py3-none-any.whl
Algorithm Hash digest
SHA256 033a40e8288b9142c14cfc88964d92d7244c7cea00054ed12ce5451ce39fd6e2
MD5 0c1af012c56ebb0a029303ddaed7cb3c
BLAKE2b-256 bc49d8fb90c3d06d4bff97712f076611315509ac40f52c0d5c28f40d88bd056c

