Provider for Apache Airflow. Implements apache-airflow-providers-sktvane package by skt
Project description
apache-airflow-providers-sktvane
- Provides access to resources served by AIDP
  - NES
  - BigQuery
  - Vault
- Other code for common, shared purposes
Environments
Local
`VAULT_TOKEN` can be found in the related documentation.

```shell
export VAULT_ADDR=https://vault-public.sktai.io
export VAULT_TOKEN={{VAULT_TOKEN}}
export AIRFLOW__CORE__DAGS_FOLDER=.
```
Deployment
Deployment runs automatically on a `push` event to the `main` branch. If you must deploy from a local environment, run the commands below:

```shell
# build
$ python setup.py sdist bdist_wheel

# upload
$ twine upload dist/*

# remove
$ rm -rf build dist apache_airflow_providers_sktvane.egg-info
```
Components
Operators
- `airflow.providers.sktvane.operators.nes.NesOperator`: Uses AIDP's `NES`

```python
from airflow.providers.sktvane.operators.nes import NesOperator

...

NesOperator(
    task_id="jupyter_daily_count",
    input_nb="https://github.com/sktaiflow/notebooks/blob/master/statistics/jupyter_daily_count.ipynb",
    parameters={"current_date": "{{ ds }}", "channel": "#aim-statistics"},
)
```
Sensors
- `airflow.providers.sktvane.sensors.gcp.BigqueryPartitionSensor`: Checks `BigQuery` partitions on AIDP

```python
from airflow.providers.sktvane.sensors.gcp import BigqueryPartitionSensor

...

BigqueryPartitionSensor(
    task_id=f"{table}_partition_sensor",
    dataset_id="wind_tmt",
    table_id=table,
    partition="dt = '{{ds}}'",
)
```
Macros
- `airflow.providers.sktvane.macros.slack.send_fail_message`: Sends `Slack` error messages in the AIDP-defined format

```python
from airflow.providers.sktvane.macros.slack import send_fail_message

...

def send_aidp_fail_message(slack_email: str) -> None:
    send_fail_message(
        slack_channel="#aidp-airflow-monitoring",
        slack_username=f"Airflow-AlarmBot-{env}",
        slack_email=slack_email,
    )
```
- `airflow.providers.sktvane.macros.gcp.bigquery_client`: Uses AIDP's `BigQuery`

```python
from airflow.providers.sktvane.macros.gcp import bigquery_client

...

def bq_query_to_bq(query, dest_table_name, **kwarg):
    bq_client = bigquery_client()
    job = bq_client.query(query)
    job.result()
```
- `airflow.providers.sktvane.macros.vault.get_secrets`: Uses AIDP's `Vault`

```python
from airflow.providers.sktvane.macros.vault import get_secrets

...

def get_hive_conn():
    from pyhive import hive

    hiveserver2 = get_secrets(path="ye/hiveserver2")
    host = hiveserver2["ip"]
    port = hiveserver2["port"]
    user = hiveserver2["user"]
    conn = hive.connect(host, port=port, username=user)
    return conn
```
- `airflow.providers.sktvane.macros.date.ds_nodash_plus_days`: `date` utility provided by AIDP

```python
from airflow.providers.sktvane.macros.date import ds_nodash_plus_days

...

def ds_nodash_tomorrow(ds):
    return ds_nodash_plus_days(ds, 1)
```
- `airflow.providers.sktvane.macros.date.ds_nodash_minus_days`: Same usage as `ds_nodash_plus_days`
- `airflow.providers.sktvane.macros.date.ym_nodash_add_month`: Same usage as `ds_nodash_plus_days`
- `airflow.providers.sktvane.macros.date.first_day_of_this_month`: Same usage as `ds_nodash_plus_days`
- `airflow.providers.sktvane.macros.date.last_day_of_this_month`: Same usage as `ds_nodash_plus_days`
- `airflow.providers.sktvane.macros.date.get_latest_loaded_dt`: Same usage as `ds_nodash_plus_days`
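Outside of Airflow these date helpers are plain Python functions. As a rough illustration of the presumed `ds_nodash_plus_days` semantics (a `YYYYMMDD` string shifted by N days; the provider's actual implementation may differ), here is a self-contained sketch:

```python
from datetime import datetime, timedelta


def ds_nodash_plus_days(ds_nodash: str, days: int) -> str:
    # Illustrative re-implementation, not the provider's code:
    # parse a YYYYMMDD string, shift it by `days`, and format it back.
    shifted = datetime.strptime(ds_nodash, "%Y%m%d") + timedelta(days=days)
    return shifted.strftime("%Y%m%d")


print(ds_nodash_plus_days("20240131", 1))   # → 20240201 (month rollover)
print(ds_nodash_plus_days("20240101", -1))  # → 20231231 (year rollover)
```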
Download files
Download the file for your platform.
Source Distribution
Hashes for apache-airflow-providers-sktvane-1.2.1.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | 360852c52fbe98498c701fdfef1424f32d57cef7f312794e1424e6a41b27af7e |
| MD5 | aa223624c3153a0f443d4723f9434799 |
| BLAKE2b-256 | f4c511f6b2c42c5efb7c4a8710a2a5c096edf18a61952b8a84ff85374eb7e01a |