Easily deploy Airflow for local development

Dockflow CLI

This package requires Docker to be installed and running on your machine.

Getting started

  1. Install dockflow:
    pip install dockflow

  2. Authenticate Docker to pull from the private Artifact Registry repository:
    2.1. First, make sure gcloud is authenticated: gcloud auth application-default login --impersonate-service-account=<service-account>.
    2.2. Ensure you are in the correct project: gcloud config set project <project-name>.
    2.3. Run gcloud auth configure-docker <location>-docker.pkg.dev to configure gcloud as the credential helper for the Artifact Registry domain associated with your repository's location. For example, gcloud auth configure-docker europe-west1-docker.pkg.dev.
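
Putting steps 2.1 to 2.3 together, a typical authentication session looks like the following. The service account, project, and region shown are placeholders; substitute your own values.

```shell
# Authenticate gcloud, impersonating a service account that can read
# from the registry (placeholder account name shown).
gcloud auth application-default login \
  --impersonate-service-account=dockflow-puller@my-project.iam.gserviceaccount.com

# Point gcloud at the project that hosts the registry (placeholder).
gcloud config set project my-project

# Register gcloud as Docker's credential helper for the registry's region.
gcloud auth configure-docker europe-west1-docker.pkg.dev
```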

Quickstart

  1. Ensure that your Docker file-sharing settings allow access to your development directory.
  2. Navigate to the root directory of your dags folder.
  3. Use dockflow config and enter your container repo URL, excluding the version tag.
    • This saves the URL in a config file.
    • E.g. <location>-docker.<domain>/<project-id>/<repo-id>/<image>
    • If you get permission errors, ensure that you have the Artifact Registry Reader permission on this registry.
    • This step is only needed when the container repo changes.
  4. If the image version tag is not composer-2.9.8-airflow-2.9.3 specify the tag using dockflow start -iv <version>. Alternatively, use the --check-images flag to list the available images at the provided container repo.
  5. Use dockflow start (This will mount the dag folder and start Airflow).
  6. Use the UI to add connections.
    • Admin -> Connections -> Create
    • If you need to connect to a GCP resource, provide values for the following fields:
      • Project Id, e.g. research-se-de
      • Credential configuration file, located at /usr/local/airflow/gcp_credentials.json
      • Impersonation chain, e.g. local-file-ingestor@research-se-de.iam.gserviceaccount.com
  7. Use dockflow refresh to refresh the configs cache or to bundle configs.
  8. Remember to use dockflow stop to shut down the instance to save local machine resources.
    • The state will be persisted in the same directory as the dags folder.
  9. To stop and remove the container, use dockflow stop --rm.
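
Taken together, the quickstart steps above amount to a session like this, run from the root of the dags folder. The image tag shown is the default from step 4; any tag listed by --check-images works with -iv.

```shell
# One-time setup: store the container repo URL (without a version tag).
dockflow config

# Optional: list the image tags available at the configured repo.
dockflow start --check-images

# Mount the dag folder and start Airflow with a specific image tag.
dockflow start -iv composer-2.9.8-airflow-2.9.3

# Refresh the configs cache while the instance is running.
dockflow refresh

# Pause the instance; its state persists next to the dags folder.
dockflow stop

# Or stop and remove the container entirely.
dockflow stop --rm
```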

CloudSQL Proxy

# Run the Cloud SQL Auth Proxy on the dockflow network so that Airflow can
# reach it at hostname cloudsql. <PATH_TO_KEY_FILE> and
# <INSTANCE_CONNECTION_NAME> are placeholders for your own values.
docker run -d \
  -v <PATH_TO_KEY_FILE>:/config \
  -p 127.0.0.1:5432:5432 \
  --network='dockflow' \
  --name='cloudsql' \
  gcr.io/cloudsql-docker/gce-proxy:1.17 /cloud_sql_proxy \
  -instances=<INSTANCE_CONNECTION_NAME>=tcp:0.0.0.0:5432 -credential_file=/config
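
Because the proxy joins the dockflow network under the name cloudsql, Airflow inside the dockflow container can reach the database at that hostname. As a sketch, the matching connection can be created from the Airflow CLI instead of the UI; the connection id, database name, and credentials below are placeholders.

```shell
# Create a Postgres connection pointing at the proxy container
# (placeholder id, schema, and credentials).
airflow connections add cloudsql_default \
  --conn-type postgres \
  --conn-host cloudsql \
  --conn-port 5432 \
  --conn-schema my_database \
  --conn-login my_user \
  --conn-password my_password
```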
