
Project description

Dataverk airflow

A simple wrapper library around KubernetesPodOperator that creates Airflow tasks running in a Kubernetes pod.

Our operators

All our operators let you clone a repo beforehand; just add it with repo="navikt/<repo>". We also support installing Python packages when the Airflow task starts; specify your requirements.txt file with requirements_path="/path/to/requirements.txt".
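
For example, a minimal sketch combining both parameters with the python_operator documented below; the repo name, script path, and requirements path are placeholders.

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import python_operator


with DAG('dag-name', start_date=days_ago(1), schedule_interval="@daily") as dag:
    # The repo is cloned before the task starts, and the packages listed in
    # requirements.txt are installed in the pod before the script runs.
    t1 = python_operator(dag=dag,
                         name="<task-name>",
                         repo="navikt/<repo>",
                         script_path="/path/to/script.py",
                         requirements_path="/path/to/requirements.txt",
                         slack_channel="<#slack-alert-channel>")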

Quarto operator

This runs Quarto render for you.

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import quarto_operator


with DAG('dag-name', start_date=days_ago(1), schedule_interval="*/10 * * * *") as dag:
    t1 = quarto_operator(dag=dag,
                         name="<task-name>",
                         repo="navikt/<repo>",
                         quarto={
                             "path": "/path/to/index.qmd",
                             "env": "dev/prod",
                             "id": "uuid",
                             "token": "quarto-token"
                         },
                         slack_channel="<#slack-alert-channel>")
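
Rather than hard-coding the token in the DAG file, you can fetch it with Airflow's standard Variable mechanism. A sketch assuming you have created a Variable named quarto_token (a hypothetical name) in the Airflow UI or your secrets backend:

from airflow import DAG
from airflow.models import Variable
from airflow.utils.dates import days_ago
from dataverk_airflow import quarto_operator


with DAG('dag-name', start_date=days_ago(1), schedule_interval="0 6 * * *") as dag:
    t1 = quarto_operator(dag=dag,
                         name="<task-name>",
                         repo="navikt/<repo>",
                         quarto={
                             "path": "/path/to/index.qmd",
                             "env": "prod",
                             "id": "uuid",
                             # quarto_token is an assumed Variable name; it is read
                             # when the DAG file is parsed.
                             "token": Variable.get("quarto_token"),
                         },
                         slack_channel="<#slack-alert-channel>")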

Notebook operator

This lets you run a Jupyter notebook.

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import notebook_operator


with DAG('dag-name', start_date=days_ago(1), schedule_interval="*/10 * * * *") as dag:
    t1 = notebook_operator(dag=dag,
                           name="<task-name>",
                           repo="navikt/<repo>",
                           nb_path="/path/to/notebook.ipynb",
                           slack_channel="<#slack-alert-channel>")

Python operator

This lets you run arbitrary Python scripts.

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import python_operator


with DAG('dag-name', start_date=days_ago(1), schedule_interval="*/10 * * * *") as dag:
    t1 = python_operator(dag=dag,
                         name="<task-name>",
                         repo="navikt/<repo>",
                         script_path="/path/to/script.py",
                         slack_channel="<#slack-alert-channel>")
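
Each operator function returns a task object (as the t1 assignments above suggest), so dependencies can be chained with Airflow's standard >> syntax. A sketch assuming two scripts in the same repo; the script names are placeholders.

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import python_operator


with DAG('dag-name', start_date=days_ago(1), schedule_interval="@daily") as dag:
    extract = python_operator(dag=dag,
                              name="extract",
                              repo="navikt/<repo>",
                              script_path="/path/to/extract.py",
                              slack_channel="<#slack-alert-channel>")
    load = python_operator(dag=dag,
                           name="load",
                           repo="navikt/<repo>",
                           script_path="/path/to/load.py",
                           slack_channel="<#slack-alert-channel>")

    # Standard Airflow dependency: run load only after extract has succeeded.
    extract >> load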

Kubernetes operator

We also offer our own Kubernetes operator, which clones a chosen repo into the container.

from airflow import DAG
from airflow.utils.dates import days_ago
from dataverk_airflow import kubernetes_operator


with DAG('dag-name', start_date=days_ago(1), schedule_interval="*/10 * * * *") as dag:
    t1 = kubernetes_operator(dag=dag,
                             name="<task-name>",
                             repo="navikt/<repo>",
                             cmds=["/path/to/bin/", "script-name.sh", "argument1", "argument2"],
                             image="europe-north1-docker.pkg.dev/nais-management-233d/your-team/your-image:your-tag",
                             slack_channel="<#slack-alert-channel>")


Download files


Source Distribution

dataverk_airflow-0.5.1.tar.gz (7.3 kB)

Uploaded Source

Built Distribution

dataverk_airflow-0.5.1-py3-none-any.whl (13.3 kB)

Uploaded Python 3

File details

Details for the file dataverk_airflow-0.5.1.tar.gz.

File metadata

  • Download URL: dataverk_airflow-0.5.1.tar.gz
  • Upload date:
  • Size: 7.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for dataverk_airflow-0.5.1.tar.gz

  • SHA256: 2b73cc00dde59f9e4211b85355f54c56c71addece6256b0e90ace8b5b450b74c
  • MD5: a076ed4253aa7bb52e199755f86e5a0a
  • BLAKE2b-256: 6102d39f76558a8edfc97c05c096fd0a21ea72305e9ee655b2caaedd2aac756d


File details

Details for the file dataverk_airflow-0.5.1-py3-none-any.whl.

File hashes

Hashes for dataverk_airflow-0.5.1-py3-none-any.whl

  • SHA256: 151009c55c7c384dfa088a617cd03fd43e282398640dc971fb3f7bf3040e896d
  • MD5: 54fc852169bf240d489eba3290785c95
  • BLAKE2b-256: 1d040287269501b25f997500edbfe4e1fb4d2b10d6b3f81af3b1f928cd8716d6
