
A plugin for Apache Airflow to interact with Microsoft Fabric items


Apache Airflow Plugin for Microsoft Fabric 🚀

Introduction

A Python package that helps Data and Analytics engineers trigger on-demand job runs of Microsoft Fabric items from Apache Airflow DAGs.

Microsoft Fabric is an end-to-end analytics and data platform designed for enterprises that require a unified solution. It encompasses data movement, processing, ingestion, transformation, real-time event routing, and report building. It offers a comprehensive suite of services including Data Engineering, Data Factory, Data Science, Real-Time Analytics, Data Warehouse, and Databases.

How to Use

Install the Plugin

PyPI package: https://pypi.org/project/apache-airflow-microsoft-fabric/

pip install apache-airflow-microsoft-fabric

Prerequisites

Before diving in: since custom connection forms aren't feasible in Apache Airflow plugins, use the Generic connection type. Here's what you need to store:

  1. Connection Id: A name for the connection.
  2. Connection Type: Generic
  3. Login: The Client ID of your service principal.
  4. Password: The refresh token fetched using Microsoft OAuth.
  5. Extra: { "tenantId": "The Tenant Id of your service principal", "clientSecret": "(optional) The Client Secret for your Entra ID App", "scopes": "(optional) Scopes you used to fetch the refresh token" }

NOTE: Default scopes applied are: https://api.fabric.microsoft.com/Item.Execute.All, https://api.fabric.microsoft.com/Item.ReadWrite.All, offline_access, openid, profile
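As a convenience, the connection can also be supplied through an environment variable instead of the UI. The sketch below builds the Generic connection as the JSON form Airflow (2.3+) accepts in `AIRFLOW_CONN_<CONN_ID>` variables; all Ids and secrets are placeholders, not real values:

```python
import json
import os

# Placeholder values -- substitute your own service principal details.
fabric_conn = {
    "conn_type": "generic",
    "login": "<service-principal-client-id>",   # Client ID
    "password": "<refresh-token>",              # refresh token from Microsoft OAuth
    "extra": {
        "tenantId": "<tenant-id>",
        "clientSecret": "<client-secret>",      # optional
        # "scopes" is optional; the default scopes listed above apply otherwise.
    },
}

# Airflow resolves connections from AIRFLOW_CONN_<CONN_ID> (uppercased).
os.environ["AIRFLOW_CONN_FABRIC_CONN_ID"] = json.dumps(fabric_conn)
```

With this set, `fabric_conn_id="fabric_conn_id"` in the operator resolves to the connection above without touching the Airflow UI.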

Operators

MSFabricRunItemOperator

This operator is the core of the plugin. It triggers the Fabric item run and pushes the run details to XCom. It accepts the following parameters:

  • workspace_id: The workspace Id.
  • item_id: The Id of the item to run, e.g. a Notebook or Pipeline.
  • fabric_conn_id: Connection Id for Fabric.
  • job_type: "RunNotebook" or "Pipeline".
  • wait_for_termination: Boolean (Default value: True). Wait until the item run terminates.
  • timeout: int (Default value: 60 * 60 * 24 * 7). Time in seconds to wait for the pipeline or notebook run. Used only if wait_for_termination is True.
  • check_interval: int (Default value: 60s). Time in seconds to wait before rechecking the run status.
  • max_retries: int (Default value: 5 retries). Max number of times to poll the API for a valid response after starting a job.
  • retry_delay: int (Default value: 1s). Polling retry delay.
  • deferrable: Boolean. Use the operator in deferrable mode.
  • job_params: Dict. Parameters to pass into the job.
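To make the interplay of timeout and check_interval concrete, here is a minimal sketch of the kind of polling loop the operator performs while waiting for termination. `get_run_status` is a stand-in for the operator's internal status call, not part of the plugin's public API:

```python
import time


def wait_for_run(get_run_status, timeout=60 * 60 * 24 * 7, check_interval=60):
    """Poll until the item run reaches a terminal state or `timeout` elapses.

    Illustrative only: `get_run_status` stands in for the operator's
    internal call to the Fabric job instance API.
    """
    terminal = {"Completed", "Failed", "Disabled"}
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_run_status()
        if status in terminal:
            return status
        # Wait check_interval seconds before re-checking the run status.
        time.sleep(check_interval)
    raise TimeoutError("Item run did not finish within the timeout")
```

In deferrable mode the same wait happens in the triggerer rather than blocking a worker slot.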

Features

  • Refresh token rotation:

    Refresh token rotation is a security mechanism that involves replacing the refresh token each time it is used to obtain a new access token. This process enhances security by reducing the risk of stolen tokens being reused indefinitely.

  • Xcom Integration:

    The Fabric run item enriches the Xcom with essential fields for downstream tasks:

    1. run_id: Run Id of the Fabric item.
    2. run_status: Fabric item run status.
      • In Progress: Item run is in progress.
      • Completed: Item run successfully completed.
      • Failed: Item run failed.
      • Disabled: Item run is disabled by a selective refresh.
    3. run_location: The URL where the item run status can be polled.
  • External Monitoring link:

    The operator conveniently provides a redirect link to the Microsoft Fabric item run.

  • Deferrable Mode:

    The operator can run in deferrable mode: it defers itself and resumes once the item run reaches the target status, freeing the worker slot in the meantime.
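Downstream tasks can pull the XCom fields above (for example via `ti.xcom_pull(task_ids="run_fabric_notebook", key="run_id")` in a templated field). The helper below is a hypothetical sketch, not part of the plugin, showing how those fields might be turned into a human-readable summary:

```python
def summarize_run(run_id: str, run_status: str) -> str:
    """Hypothetical helper: format the XCom fields pushed by the operator.

    `run_status` takes the values documented above: In Progress,
    Completed, Failed, or Disabled.
    """
    outcomes = {
        "In Progress": "is still in progress",
        "Completed": "completed successfully",
        "Failed": "failed",
        "Disabled": "was disabled by a selective refresh",
    }
    outcome = outcomes.get(run_status, "has an unknown status")
    return f"Fabric item run {run_id} {outcome}"
```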

Sample DAG

Ready to give it a spin? Check out the sample DAG code below:

from __future__ import annotations

from airflow import DAG
from airflow.providers.microsoft.fabric.operators.run_item import MSFabricRunItemOperator
from airflow.utils.dates import days_ago

default_args = {
    "owner": "airflow",
    "start_date": days_ago(1),
}

with DAG(
    dag_id="fabric_items_dag",
    default_args=default_args,
    schedule_interval="@daily",
    catchup=False,
) as dag:

    run_notebook = MSFabricRunItemOperator(
        task_id="run_fabric_notebook",
        workspace_id="<workspace_id>",
        item_id="<item_id>",
        fabric_conn_id="fabric_conn_id",
        job_type="RunNotebook",
        wait_for_termination=True,
        deferrable=True,
    )

    run_notebook

Feel free to tweak and tailor this DAG to suit your needs!

Contributing

We welcome any contributions:

  • Report all enhancements, bugs, and tasks as GitHub issues.
  • Provide fixes or enhancements by opening pull requests in GitHub.
