
Apache Airflow operator for running Google Cloud Run Jobs using green energy

Project description

VertFlow

Run Docker containers on Airflow using green energy

Video Demo

📖 About

VertFlow is an Airflow operator for running Cloud Run Jobs on Google Cloud Platform in green data centres.
Cloud Run is a serverless container runtime: you bring your own Docker image and emit carbon only while the job is running. This is easier, cheaper and greener than keeping a Kubernetes cluster spinning 24/7.

Not all data centres are created equal.
Data centres run on electricity generated from various sources, including fossil fuels, which produce harmful carbon emissions. Some data centres are greener than others, drawing their electricity from renewable sources such as wind and hydro.
When you deploy a container on Airflow using the VertFlow operator, it will run your container in the greenest GCP data centre possible.
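
Conceptually, the region selection boils down to querying real-time carbon intensity for each candidate region and picking the lowest. The sketch below is illustrative only (it is not VertFlow's internal code) and assumes a hypothetical REGION_TO_ZONE mapping from GCP regions to CO2 Signal zone codes:

import requests

# Hypothetical mapping from GCP regions to CO2 Signal zone codes
# (europe-west1 is in Belgium, europe-west4 in the Netherlands).
REGION_TO_ZONE = {"europe-west1": "BE", "europe-west4": "NL"}


def greenest_region(allowed_regions, api_key):
    """Return the allowed region with the lowest real-time carbon intensity."""

    def carbon_intensity(region):
        response = requests.get(
            "https://api.co2signal.com/v1/latest",
            params={"countryCode": REGION_TO_ZONE[region]},
            headers={"auth-token": api_key},
            timeout=10,
        )
        response.raise_for_status()
        return response.json()["data"]["carbonIntensity"]  # gCO2eq/kWh

    return min(allowed_regions, key=carbon_intensity)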

ℹ️ Use VertFlow on Cloud Composer 2 to save even more money and CO2.

🔧 How to install

  1. Run pip install VertFlow on your Airflow instance.
  2. Ensure your Airflow scheduler has outbound access to the public internet and the roles/run.developer Cloud IAM role.
  3. Get an API key for CO2 Signal (free for non-commercial use) and store it in an Airflow variable called VERTFLOW_API_KEY, as shown in the sketch after this list.
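
Once you have a key, you can store it with the Airflow CLI (airflow variables set VERTFLOW_API_KEY <key>) or programmatically. The snippet below is a minimal sketch using Airflow's Variable model; the key value shown is a placeholder:

from airflow.models import Variable

# Store the CO2 Signal API key where VertFlow expects to find it.
Variable.set("VERTFLOW_API_KEY", "<your-co2-signal-api-key>")

# Confirm the scheduler can read it back.
print(Variable.get("VERTFLOW_API_KEY"))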

ℹ️ If you're using Cloud Composer, these instructions may be helpful.

🖱 How to use

Use the VertFlowOperator to instantiate a task in your DAG. Provide:

  • The address of the Docker image to run.
  • A runtime specification, e.g. timeout and memory limits.
  • A set of allowed regions to run the job in, based on latency, data governance and other considerations. VertFlow picks the greenest one.
from datetime import datetime

from airflow import DAG
from VertFlow.operator import VertFlowOperator

with DAG(
    dag_id="hourly_dag_in_green_region",
    schedule_interval="@hourly",
    start_date=datetime(2022, 1, 1),  # placeholder; set a start date appropriate for your DAG
) as dag:
    task = VertFlowOperator(
        # Docker image to run as a Cloud Run Job.
        image_address="us-docker.pkg.dev/cloudrun/container/job:latest",
        name="hello-world",
        # VertFlow runs the job in whichever of these regions is currently greenest.
        allowed_regions=["europe-west1", "europe-west4"],
        command="echo",
        arguments=["Hello World"],
        service_account_email_address="my-service-account@embroidered-elephant-739.iam.gserviceaccount.com",
        # ... plus any further runtime settings, e.g. timeout and memory limits.
    )

🔌🗺 Shout out to CO2 Signal

VertFlow works thanks to real-time global carbon intensity data, gifted to the world for non-commercial use by CO2 Signal.

🤝 How to contribute

Found a bug or fancy resolving an issue? We welcome Pull Requests!

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

VertFlow-0.1.6.tar.gz (15.3 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

VertFlow-0.1.6-py3-none-any.whl (18.9 kB)

Uploaded Python 3

File details

Details for the file VertFlow-0.1.6.tar.gz.

File metadata

  • Download URL: VertFlow-0.1.6.tar.gz
  • Upload date:
  • Size: 15.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.9

File hashes

Hashes for VertFlow-0.1.6.tar.gz
  • SHA256: 01e97310f2df4656902adfc2e1011ed89f761df196066dac944579f918031f68
  • MD5: a7c5a641fe4cef6279920487af1c8478
  • BLAKE2b-256: 6bd54d0135604c4a08f0fdafff948db706793f1358fc1350f78e0df8d6e6d360


File details

Details for the file VertFlow-0.1.6-py3-none-any.whl.

File metadata

  • Download URL: VertFlow-0.1.6-py3-none-any.whl
  • Upload date:
  • Size: 18.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.9

File hashes

Hashes for VertFlow-0.1.6-py3-none-any.whl
  • SHA256: 8419654c1c1153983c40505a155e2e3d22a0517a53d8540cf8b645efbac3c21c
  • MD5: 03d95e15b603d128a5c539b326877070
  • BLAKE2b-256: 061f1af4e41de6a5ff61b016e9080f00c2f5b439c320f790f4ae6e9653a7a5ff

