
Apache Airflow operator for running Google Cloud Run Jobs using green energy


VertFlow

Run Docker containers on Airflow using green energy


📖 About

VertFlow is an Airflow operator for running Cloud Run Jobs on Google Cloud Platform in green data centres.
Cloud Run is a serverless container runtime: you bring your own Docker image and emit carbon only while the job is running. This is easier, cheaper and greener than keeping a Kubernetes cluster spinning 24/7.

Not all data centres are created equal.
Data centres run on electricity generated from various sources, including fossil fuels, which produce harmful carbon emissions. Some data centres are greener than others, using electricity from renewable sources such as wind and hydro.
When you deploy a container on Airflow using the VertFlow operator, it will run your container in the greenest GCP data centre possible.

ℹ️ Use VertFlow on Cloud Composer 2 to save even more money and CO2.

🔧 How to install

  1. pip install VertFlow on your Airflow instance.
  2. Ensure your Airflow scheduler has outbound access to the public internet and the roles/run.developer Cloud IAM role.
  3. Get an API key for CO2 Signal, free for non-commercial use. Store it in an Airflow variable called VERTFLOW_API_KEY (see the sketch below).
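
If you prefer to set the variable programmatically rather than through the Airflow UI or CLI, a minimal sketch using Airflow's Variable model might look like the following. The placeholder key is illustrative, and this must run somewhere with access to your Airflow metadata database.

    # A minimal sketch: store the CO2 Signal API key where VertFlow expects it.
    from airflow.models import Variable

    # Hypothetical placeholder value; substitute the key issued by CO2 Signal.
    Variable.set("VERTFLOW_API_KEY", "<your-co2-signal-api-key>")

    # Sanity check: the scheduler and workers must be able to read it back.
    assert Variable.get("VERTFLOW_API_KEY")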

ℹ️ If you're using Cloud Composer, these instructions may be helpful.

🖱 How to use

Use the VertFlowOperator to instantiate a task in your DAG. Provide:

  • The address of the Docker image to run.
  • A runtime specification, e.g. timeout and memory limits.
  • A set of allowed regions to run the job in, based on latency, data governance and other considerations. VertFlow picks the greenest one.

from VertFlow.operator import VertFlowOperator
from airflow import DAG

with DAG(
        dag_id="hourly_dag_in_green_region",
        schedule_interval="@hourly"
) as dag:
    task = VertFlowOperator(
        image_address="us-docker.pkg.dev/cloudrun/container/job:latest",
        name="hello-world",
        allowed_regions=["europe-west1", "europe-west4"],
        command="echo",
        arguments=["Hello World"],
        service_account_email_address="my-service-account@embroidered-elephant-739.iam.gserviceaccount.com",
        ...
    )

🔌🗺 Shout out to CO2 Signal

VertFlow works thanks to real-time global carbon intensity data, gifted to the world for non-commercial use by CO2 Signal.
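
For the curious, the sketch below shows roughly the kind of request this data comes from: a query to CO2 Signal's public /v1/latest endpoint for the live carbon intensity of one zone (Belgium, home of europe-west1). The endpoint, auth-token header and response fields follow CO2 Signal's published API, but this is an illustration, not VertFlow's internal code.

    # A rough sketch of a CO2 Signal query (not VertFlow's internals).
    import requests

    API_KEY = "<your-co2-signal-api-key>"  # the key stored in VERTFLOW_API_KEY

    response = requests.get(
        "https://api.co2signal.com/v1/latest",
        params={"countryCode": "BE"},
        headers={"auth-token": API_KEY},
        timeout=10,
    )
    response.raise_for_status()
    data = response.json()["data"]

    # carbonIntensity is reported in gCO2eq/kWh; lower is greener.
    print(data["carbonIntensity"], data.get("fossilFuelPercentage"))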

🤝 How to contribute

Found a bug or fancy resolving an issue? We welcome Pull Requests!

