
Dataverk airflow

A simple wrapper library around KubernetesPodOperator that creates Airflow tasks running in a Kubernetes pod.

Our operators

All our operators let you clone a repo in advance; just add it with repo="navikt/<repo>". We also support installing Python packages when the Airflow task starts up; point to your requirements.txt file with requirements_path="/path/to/requirements.txt".
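
A minimal sketch combining these two options with the Python operator described below; the script and requirements paths are placeholders:

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import python_operator


with DAG('navn-dag', start_date=days_ago(1), schedule_interval="0 5 * * *") as dag:
    # Clones navikt/<repo> and installs the packages listed in the repo's
    # requirements.txt before the script runs (illustrative paths).
    t1 = python_operator(dag=dag,
                         name="<navn-på-task>",
                         repo="navikt/<repo>",
                         script_path="/path/to/script.py",
                         requirements_path="/path/to/requirements.txt",
                         slack_channel="<#slack-alarm-kanal>")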

Quarto operator

This runs Quarto render for you.

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import quarto_operator


with DAG('navn-dag', start_date=days_ago(1), schedule_interval="*/10 * * * *") as dag:
    t1 = quarto_operator(dag=dag,
                         name="<navn-på-task>",
                         repo="navikt/<repo>",
                         quarto={
                             "path": "/path/to/index.qmd",
                             "env": "dev/prod",
                             "id":"uuid",
                             "token":
                             "quarto-token"
                         },
                         slack_channel="<#slack-alarm-kanal>")

Notebook operator

This lets you run a Jupyter notebook.

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import notebook_operator


with DAG('navn-dag', start_date=days_ago(1), schedule_interval="*/10 * * * *") as dag:
    t1 = notebook_operator(dag=dag,
                           name="<navn-på-task>",
                           repo="navikt/<repo>",
                           nb_path="/path/to/notebook.ipynb",
                           slack_channel="<#slack-alarm-kanal>")

Python operator

This lets you run arbitrary Python scripts.

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import python_operator


with DAG('navn-dag', start_date=days_ago(1), schedule_interval="*/10 * * * *") as dag:
    t1 = python_operator(dag=dag,
                         name="<navn-på-task>",
                         repo="navikt/<repo>",
                         script_path="/path/to/script.py",
                         slack_channel="<#slack-alarm-kanal>")

Kubernetes operator

We also offer our own Kubernetes operator, which clones a chosen repo into the container.

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import kubernetes_operator


with DAG('navn-dag', start_date=days_ago(1), schedule_interval="*/10 * * * *") as dag:
    t1 = kubernetes_operator(dag=dag,
                             name="<navn-på-task>",
                             repo="navikt/<repo>",
                             cmds=["/path/to/bin/", "script-name.sh", "argument1", "argument2"],
                             image="europe-north1-docker.pkg.dev/nais-management-233d/ditt-team/ditt-image:din-tag",
                             slack_channel="<#slack-alarm-kanal>")
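
Since these helpers wrap KubernetesPodOperator, the objects they return can be combined like ordinary Airflow tasks. A small sketch, assuming the functions return the underlying task objects; the task names and script path are placeholders:

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import notebook_operator, python_operator


with DAG('navn-dag', start_date=days_ago(1), schedule_interval="0 6 * * *") as dag:
    # Hypothetical task that prepares data before the notebook runs.
    prepare = python_operator(dag=dag,
                              name="forbered-data",
                              repo="navikt/<repo>",
                              script_path="/path/to/prepare.py",
                              slack_channel="<#slack-alarm-kanal>")
    report = notebook_operator(dag=dag,
                               name="lag-rapport",
                               repo="navikt/<repo>",
                               nb_path="/path/to/notebook.ipynb",
                               slack_channel="<#slack-alarm-kanal>")
    # Run the preparation script before the notebook.
    prepare >> report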
