
A package to simplify connecting to the TM1 REST API from Apache Airflow


airflow-provider-tm1


A package by Knowledgeseed and Cubewise that provides a hook to simplify the connection to the IBM Cognos TM1 / Planning Analytics REST API.

https://github.com/airflow-provider-tm1/airflow-provider-tm1

This repository builds on https://github.com/MariusWirtz/airflow-tm1 and https://github.com/scrambldchannel/airflow-tm1, which offer only Airflow 1.x compatibility, and upgrades the provider to ensure Airflow 2.x compatibility. Some parts have also been reused from https://github.com/scrambldchannel/airflow-provider-tm1.

Requirements

  • Python 3.7+
  • Airflow 2.3+
  • TM1py 2.0+

Development

python -m venv .env
source .env/bin/activate
python -m pip install -r requirements.txt
python -m build

Installation

Install with pip:

pip install airflow-provider-tm1

Usage

Create a connection in Airflow with at least the following parameters set:

  • Host
  • Login
  • Password
  • Port
  • Extras
    • ssl

Any other parameter accepted by the TM1py RestService constructor (e.g. base_url, namespace, etc.) can also be added as a key in the Extras field of the connection.
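To illustrate, the Extras field holds a JSON object. Only ssl is taken from the text above; namespace and base_url are hypothetical examples of further RestService keyword arguments, and all values are placeholders:

```python
import json

# Sketch of an Extras payload for the TM1 connection. "ssl" is the key
# mentioned above; "namespace" and "base_url" are illustrative examples of
# additional RestService keyword arguments (values are placeholders).
extras = {
    "ssl": True,
    "namespace": "LDAP",
    "base_url": "https://tm1.example.com:8010",
}
print(json.dumps(extras))
```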


In your DAG file:

from airflow_provider_tm1.operators.tm1_run_ti import TM1RunTIOperator

...

t1 = TM1RunTIOperator(
        task_id='t1',
        tm1_conn_id='tm1_conn',
        process_name='airflow_test_params_success_dag',
        tm1_params={'testParam1': 'testParamValue'},
        timeout=20,
        cancel_at_timeout=True
    )

This will attempt to connect to the TM1 server using the details provided and initialise an instance of the TM1Service class, which can be accessed via airflow_provider_tm1.hooks.tm1.TM1Hook.

See TM1py for more details.

It's important to mention that TM1py executes TI processes in asynchronous mode. The operator submits the request, receives an async_id from TM1, and polls for the result using that async_id until the response is retrieved or the timeout elapses. The timeout is defined in seconds and defaults to 300. Note also that the Airflow-side timeout does not automatically cancel the TI process; if cancel_at_timeout is set to True, Airflow will attempt to cancel the long-running TI process.
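The submit-then-poll behaviour described above can be sketched generically. This is not the operator's actual implementation, just a minimal illustration of polling with a timeout and optional cancellation; all names here are made up:

```python
import time

def poll_until_done(check, timeout=300, interval=1.0, cancel=None):
    """Poll check() until it returns a non-None result or timeout (seconds)
    elapses. On timeout, optionally invoke cancel() — mirroring the effect
    of cancel_at_timeout=True — and raise TimeoutError."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = check()  # in the operator, this retrieves the result for the async_id
        if result is not None:
            return result
        time.sleep(interval)
    if cancel is not None:
        cancel()
    raise TimeoutError(f"process did not finish within {timeout}s")
```

In the real operator, the check corresponds to asking TM1 for the outcome associated with the async_id it returned at submission time.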

For further examples, please see the tests_integration/dags folder.

Manual integration testing

Use tests_integration/docker-compose.yaml as a baseline, which spins up an Airflow instance including the TM1 provider and a base TM1 database to test against. Please note that the tm1-docker image is a proprietary IBM product wrapped in Docker by Knowledgeseed and is therefore only available internally to Knowledgeseed developers.

To obtain a licensed IBM TM1 database for testing or production purposes, please see https://www.ibm.com/topics/tm1 for further details.

License

See LICENSE
