A simple Apache Airflow Kettle operator that can invoke PDI jobs and transformations on Linux-based systems.

Apache Airflow Kettle Operator

The KettleOperator package consists of KettleRunJobOperator and KettleRunTransformationOperator, which are responsible for running Hitachi Vantara's PDI (Pentaho Data Integration) jobs and transformations from .kjb and .ktr files.

Currently there is no Carte server support: PDI must be deployed within the same container as your Airflow installation, and the operators execute files locally (there is no repository support yet, either).

Requirements

  • Python 3.7+
  • Apache Airflow 2.0+
  • Hitachi Vantara PDI (within the same container)

Setup

Installation

Installation is done via pip, and the package is installed in your site-packages directory like any other pip package.

Run python -m site to find out where your site-packages directory is; the package will be located in its kettle_provider subdirectory.

pip install apache-airflow-providers-kettle

Usage

To use the operators, first import them in your Airflow .py files (typically your DAG file).

from kettle_provider.operators.kettle_operator import KettleRunJobOperator, KettleRunTransformationOperator

The Operators can then be used just like any other.

run_job = KettleRunJobOperator(
    task_id='kettle-run-job',
    file='test-job.kjb',
    params={
        'test-parameter-1': 'test-value-1',
        'test-parameter-2': 'test-value-2',
    },
)
run_transformation = KettleRunTransformationOperator(
    task_id='kettle-run-transformation',
    file='test-transformation.ktr',
    params={
        'test-parameter-1': 'test-value-1',
        'test-parameter-2': 'test-value-2',
    },
)
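Under the hood, PDI's kitchen.sh (jobs) and pan.sh (transformations) accept the file, log level, and parameters as command-line flags (-file, -level, -param:NAME=VALUE). Below is a hypothetical sketch of how a params dict like the one above could map onto such an invocation; the function and variable names are illustrative only, not the provider's actual code.

```python
# Illustrative only: how a params dict might become kitchen.sh/pan.sh flags.
from __future__ import annotations
import os


def build_pdi_command(script: str, filepath: str, file: str,
                      loglevel: str = 'Basic',
                      params: dict[str, str] | None = None) -> list[str]:
    """Assemble a kitchen.sh/pan.sh argument list for a job or transformation."""
    command = [
        script,
        f'-file={os.path.join(filepath, file)}',
        f'-level={loglevel}',
    ]
    # Each entry in params becomes a -param:name=value flag.
    for name, value in (params or {}).items():
        command.append(f'-param:{name}={value}')
    return command


command = build_pdi_command(
    '/opt/pentaho/data-integration/kitchen.sh',
    '/opt/pentaho/data-integration/jobs/',
    'test-job.kjb',
    params={'test-parameter-1': 'test-value-1'},
)
print(' '.join(command))
```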

Available parameters

Below are the parameters you can use when defining the tasks, with their default values. The list excludes base parameters inherited from the BaseOperator class (such as task_id).

KettleRunJobOperator(
    pdipath: str = '/opt/pentaho/data-integration/',  # PDI installation directory
    filepath: str = '/opt/pentaho/data-integration/jobs/',  # PDI jobs directory
    file: str | None = None,  # .kjb file to run
    logfile: str = '/opt/pentaho/data-integration/logs/pdi.kitchen.log',  # logfile for kitchen runs
    maxloglines: int = 0,  # max log lines for kitchen logfile (0 = no limit)
    maxlogtimeout: int = 0,  # max log age in seconds for kitchen logfile (0 = no limit)
    loglevel: str = 'Basic',  # log level (Basic, Detailed, Debug, Rowlevel, Error, Nothing)
    params: dict[str, str] | None = None,  # dictionary of parameters
    output_encoding: str = 'utf-8',  # output encoding for exit commands
    **kwargs
)
KettleRunTransformationOperator(
    pdipath: str = '/opt/pentaho/data-integration/',  # PDI installation directory
    filepath: str = '/opt/pentaho/data-integration/transformations/',  # PDI transformations directory
    file: str | None = None,  # .ktr file to run
    logfile: str = '/opt/pentaho/data-integration/logs/pdi.pan.log',  # logfile for pan runs
    maxloglines: int = 0,  # max log lines for pan logfile (0 = no limit)
    maxlogtimeout: int = 0,  # max log age in seconds for pan logfile (0 = no limit)
    loglevel: str = 'Basic',  # log level (Basic, Detailed, Debug, Rowlevel, Error, Nothing)
    params: dict[str, str] | None = None,  # dictionary of parameters
    output_encoding: str = 'utf-8',  # output encoding for exit commands
    **kwargs
)
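The maxloglines and maxlogtimeout settings cap the PDI logfile by line count and by age, with 0 disabling each limit. An illustrative pure-Python sketch of those semantics follows; it is not the provider's actual implementation, just a model of how the two limits combine.

```python
# Illustrative model of the maxloglines / maxlogtimeout limits.
from __future__ import annotations


def trim_log(entries: list[tuple[float, str]], now: float,
             maxloglines: int = 0, maxlogtimeout: int = 0) -> list[tuple[float, str]]:
    """Apply the maxloglines / maxlogtimeout limits to (timestamp, line) entries.

    0 means 'no limit', matching the operator defaults above.
    """
    if maxlogtimeout > 0:
        # Drop entries older than maxlogtimeout seconds.
        entries = [(ts, line) for ts, line in entries if now - ts <= maxlogtimeout]
    if maxloglines > 0:
        # Keep only the most recent maxloglines entries.
        entries = entries[-maxloglines:]
    return entries
```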

To-dos

  • CarteServer support (for PDI deployed in a different container)
  • PDI Repository support
  • Better exit codes (currently only bash exit codes are surfaced)
  • Support for Windows machines (currently, the commands are all executed with bash)
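As the to-dos note, the commands run through bash and only the bash exit code comes back. A minimal sketch of that pattern, assuming a subprocess-based runner (the helper name is hypothetical, not the provider's actual API):

```python
# Hypothetical sketch: run a command through bash and surface its exit code.
from __future__ import annotations
import subprocess


def run_via_bash(command: str, output_encoding: str = 'utf-8') -> int:
    """Run a command through bash, decode its output with output_encoding,
    and return the bash exit code (all the operator currently surfaces)."""
    result = subprocess.run(['bash', '-c', command], capture_output=True)
    output = result.stdout.decode(output_encoding)
    if output:
        print(output, end='')
    return result.returncode
```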
