
gke-taskflow

Adds TaskFlow API support to the GKEStartPodOperator in Airflow.
This allows you to write cleaner, more Pythonic DAGs.

Installation

This package must be installed into the same environment as Airflow in order to function correctly. For Cloud Composer, follow the docs for installing PyPI packages into your environment.

pip

Execute the following:

pip install gke-taskflow

Use

After the package is installed, you can define your task using the @task.gke_pod decorator:

from datetime import datetime
from airflow.decorators import dag, task

@dag(
        schedule=None,
        start_date=datetime(2023, 7, 15),
        tags=["testing"]
)
def example_dag_taskflow():
    @task.gke_pod(
            image="python:3.10-slim",
            task_id="test_flow",
            name="test_flow",
            cluster_name="test-cluster",
            namespace="composer-internal",
            location="us-central1",
            project_id="test-cluster-123abc",
    )
    def hello_world_from_container():
        print("hello world from container")

    hello_world_from_container()

example_dag_taskflow()

The keyword arguments supplied to @task.gke_pod are identical to those supplied to Google's GKEStartPodOperator, on which this work is based. The docs for that class are scant, but the source code is available online for review.
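For example, because the decorator forwards its keyword arguments to the operator, parameters that GKEStartPodOperator inherits from KubernetesPodOperator (such as env_vars, labels, and get_logs) should also pass through. The sketch below assumes that pass-through behavior and reuses the placeholder cluster and project names from the example above:

@task.gke_pod(
    image="python:3.10-slim",
    task_id="process_data",
    name="process_data",
    cluster_name="test-cluster",
    namespace="composer-internal",
    location="us-central1",
    project_id="test-cluster-123abc",
    # Standard KubernetesPodOperator keywords, assumed to pass through unchanged:
    env_vars={"ENVIRONMENT": "staging"},   # injected into the pod's environment
    labels={"team": "data-platform"},      # attached to the pod as Kubernetes labels
    get_logs=True,                         # stream container logs back to Airflow
)
def process_data():
    import os
    print(f"running in {os.environ['ENVIRONMENT']}")

Define such a task inside a @dag-decorated function, exactly as in the example above.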
