Flyte Airflow Provider
Flyte Provider for Apache Airflow
This package provides an operator, a sensor, and a hook that integrate Flyte into Apache Airflow. The `FlyteOperator` triggers a task or workflow in Flyte, and the `FlyteSensor` monitors a Flyte execution for completion.
Installation
Prerequisites: an environment running `apache-airflow`.

```shell
pip install airflow-provider-flyte
```
Configuration
In the Airflow UI, configure a Connection for Flyte using the fields below. A programmatic sketch of an equivalent connection follows the list.
- Host (required): The FlyteAdmin host.
- Port (optional): The FlyteAdmin port.
- Login (optional): `client_id`
- Password (optional): `client_credentials_secret`
- Extra (optional): Specify the `extra` parameter as a JSON dictionary to provide additional parameters.
  - `project`: The default project to connect to.
  - `domain`: The default domain to connect to.
  - `insecure`: Whether to connect over an insecure (non-SSL) channel.
  - `command`: The command to execute to return a token using an external process.
  - `scopes`: List of scopes to request.
  - `auth_mode`: The OAuth mode to use. Defaults to the pkce flow.
  - `env_prefix`: Prefix used to look up injected secrets at runtime.
  - `default_dir`: Default directory used to find secrets as individual files.
  - `file_prefix`: Prefix for the file in the `default_dir`.
  - `statsd_host`: The statsd host.
  - `statsd_port`: The statsd port.
  - `statsd_disabled`: Whether to send statsd metrics or not.
  - `statsd_disabled_tags`: Turn on to reduce cardinality.
  - `local_sandbox_path`
  - S3 Config:
    - `s3_enable_debug`
    - `s3_endpoint`
    - `s3_retries`
    - `s3_backoff`
    - `s3_access_key_id`
    - `s3_secret_access_key`
  - GCS Config:
    - `gsutil_parallelism`
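The same connection can also be defined in code. The sketch below is illustrative only: the connection id, host, port, and `extra` values are placeholders, and it assumes the provider registers a `flyte` connection type.

```python
import json

from airflow.models import Connection

# Minimal sketch of a Flyte connection defined in code (placeholders throughout);
# the same values can be entered through the Airflow UI instead.
flyte_conn = Connection(
    conn_id="flyte_conn_example",   # hypothetical connection id
    conn_type="flyte",              # assumes the provider registers this connection type
    host="flyteadmin.example.org",  # FlyteAdmin host
    port=81,                        # FlyteAdmin port
    extra=json.dumps(
        {
            "project": "flytesnacks",  # default project
            "domain": "development",   # default domain
            "insecure": True,          # connect without SSL (e.g. local sandbox)
        }
    ),
)

# One way to register the connection without the UI: expose its URI as an
# environment variable named AIRFLOW_CONN_<CONN_ID>.
print(flyte_conn.get_uri())
```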
Modules
Flyte Operator
The `FlyteOperator` requires a `flyte_conn_id` to fetch all the connection-related parameters needed to instantiate `FlyteRemote`. You must also provide a `launchplan_name` to trigger a workflow, or a `task_name` to trigger a task. A handful of other values are optional, such as `project`, `domain`, `max_parallelism`, `raw_data_prefix`, `kubernetes_service_account`, `labels`, `annotations`, `secrets`, `notifications`, `disable_notifications`, `oauth2_client`, `version`, and `inputs`.
Import it into your DAG via:

```python
from flyte_provider.operators.flyte import FlyteOperator
```
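A minimal DAG using the operator might look like the following sketch. The connection id, project, domain, launch plan name, and inputs are placeholder values, not defaults shipped with the provider.

```python
from datetime import datetime

from airflow import DAG

from flyte_provider.operators.flyte import FlyteOperator

# Sketch of a DAG that triggers a Flyte launch plan; all identifiers below
# (connection id, project, domain, launch plan, inputs) are placeholders.
with DAG(
    dag_id="example_flyte_operator",
    schedule_interval=None,
    start_date=datetime(2022, 1, 1),
    catchup=False,
) as dag:
    trigger_workflow = FlyteOperator(
        task_id="trigger_flyte_workflow",
        flyte_conn_id="flyte_conn_example",
        project="flytesnacks",
        domain="development",
        launchplan_name="core.basic.hello_world.my_wf",
        inputs={"name": "airflow"},
    )
```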
Flyte Sensor
If you need to wait for an execution to complete, use the `FlyteSensor`. Monitoring with `FlyteSensor` lets you trigger downstream processes only when the Flyte execution is complete.
Import it into your DAG via:

```python
from flyte_provider.sensors.flyte import FlyteSensor
```
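The sketch below pairs the operator with a sensor that waits for the triggered execution. It is a hedged example with placeholder identifiers, and it assumes the value the operator pushes to XCom is the execution name the sensor expects.

```python
from datetime import datetime

from airflow import DAG

from flyte_provider.operators.flyte import FlyteOperator
from flyte_provider.sensors.flyte import FlyteSensor

# Sketch of an operator + sensor pair; identifiers are placeholders.
with DAG(
    dag_id="example_flyte_sensor",
    schedule_interval=None,
    start_date=datetime(2022, 1, 1),
    catchup=False,
) as dag:
    trigger = FlyteOperator(
        task_id="trigger_flyte_workflow",
        flyte_conn_id="flyte_conn_example",
        project="flytesnacks",
        domain="development",
        launchplan_name="core.basic.hello_world.my_wf",
    )

    wait = FlyteSensor(
        task_id="wait_for_completion",
        execution_name=trigger.output,  # XCom value returned by the operator
        flyte_conn_id="flyte_conn_example",
        project="flytesnacks",
        domain="development",
        poke_interval=30,               # seconds between status checks
    )

    trigger >> wait
```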
Examples
See the examples directory for an example DAG.
Issues
Please file issues and open pull requests here. If you hit a roadblock, reach out to us on Slack.