Airflow extension for communicating with Wherobots Cloud

Project description

Airflow Providers for Wherobots

Airflow providers to bring Wherobots Cloud's spatial compute to your data workflows and ETLs.

Installation

If you use Poetry in your project, add the dependency with poetry add:

$ poetry add airflow-providers-wherobots

Otherwise, just pip install it:

$ pip install airflow-providers-wherobots

Usage

Create a connection

You first need to create a Connection in Airflow. This can be done from the UI or from the command line. The default Wherobots connection name is wherobots_default; if you use another name, you must pass that name through the wherobots_conn_id parameter when initializing Wherobots operators (see the sketch after the connection example below).

The only required fields for the connection are:

  • the Wherobots API endpoint in the host field;
  • your Wherobots API key in the password field.

$ airflow connections add "wherobots_default" \
    --conn-type "generic" \
    --conn-host "api.cloud.wherobots.com" \
    --conn-password "$(< api.key)"
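
If you registered the connection under a different name, point the operator at it explicitly through wherobots_conn_id. A minimal sketch, assuming a connection saved as my_wherobots_conn (an illustrative name):

from airflow_providers_wherobots.operators.sql import WherobotsSqlOperator

# Point the operator at a non-default connection; wherobots_conn_id
# falls back to "wherobots_default" when omitted.
operator = WherobotsSqlOperator(
    task_id="execute_query",
    wherobots_conn_id="my_wherobots_conn",  # hypothetical connection name
    sql="SELECT 1",
)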

Execute a SQL query

The WherobotsSqlOperator allows you to run SQL queries on Wherobots Cloud, so you can build ETLs and data transformation workflows that query, manipulate, and produce datasets with WherobotsDB.

Refer to the Wherobots Documentation to learn how to read data, transform data, and write results in Spatial SQL with WherobotsDB.

Refer to the Wherobots Apache Airflow Provider Documentation for more detailed guidance on using the Wherobots Apache Airflow Provider.

Example

Below is an example Airflow DAG that executes a SQL query on Wherobots Cloud:

import datetime

from airflow import DAG
from airflow_providers_wherobots.operators.sql import WherobotsSqlOperator


with DAG(
    dag_id="example_wherobots_sql_dag",
    start_date=datetime.datetime(2024, 1, 1),  # a static start_date; datetime.now() shifts on every DAG parse
    schedule="@hourly",
    catchup=False
):
    # Insert 100 records from the Overture Maps (OMF) `places_place`
    # dataset into the `wherobots.test.airflow_example` table.
    operator = WherobotsSqlOperator(
        task_id="execute_query",
        sql="""
        INSERT INTO wherobots.test.airflow_example
        SELECT id, geometry, confidence, geohash
        FROM wherobots_open_data.overture.places_place
        LIMIT 100
        """,
        return_last=False,
    )
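
Because the operator mirrors Airflow's standard SQL operator parameters (note return_last above), its sql field is presumably templated like other Airflow SQL operators. A sketch under that assumption, using the built-in {{ ds }} macro:

import datetime

from airflow import DAG
from airflow_providers_wherobots.operators.sql import WherobotsSqlOperator

with DAG(
    dag_id="example_wherobots_templated_sql_dag",
    start_date=datetime.datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
):
    # If the sql field is templated, {{ ds }} renders to the run's
    # logical date before the query is submitted to Wherobots Cloud.
    WherobotsSqlOperator(
        task_id="daily_sample",
        sql="""
        -- Logical date of this run: {{ ds }}
        SELECT id, geometry, confidence, geohash
        FROM wherobots_open_data.overture.places_place
        LIMIT 100
        """,
        return_last=False,
    )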

Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide on installing packages.

Source Distribution

airflow_providers_wherobots-0.1.10.tar.gz (11.0 kB)

Built Distribution

airflow_providers_wherobots-0.1.10-py3-none-any.whl

File details

Details for the file airflow_providers_wherobots-0.1.10.tar.gz.

File metadata

File hashes

Hashes for airflow_providers_wherobots-0.1.10.tar.gz

SHA256: e9204f6c8d51f146c692da1a401fc74de46fd66caa3f19df7ca0c348fb1bae55
MD5: 472cf6161d2981fb0a59fb779cfd23f0
BLAKE2b-256: e59fb40abe9bb8ee0d002bc8e9a4ee8be228b7e9f5f842995c8595ff09e53fea

See the pip documentation on hash-checking mode for more details on using hashes.
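
To verify a downloaded file against these digests, you can recompute the hash locally with Python's standard library; a minimal sketch (the filename matches the source distribution listed above):

import hashlib

# Recompute the SHA256 digest of the downloaded source distribution
# and compare it against the published value.
with open("airflow_providers_wherobots-0.1.10.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = "e9204f6c8d51f146c692da1a401fc74de46fd66caa3f19df7ca0c348fb1bae55"
print("OK" if digest == expected else "MISMATCH")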

File details

Details for the file airflow_providers_wherobots-0.1.10-py3-none-any.whl.

File metadata

File hashes

Hashes for airflow_providers_wherobots-0.1.10-py3-none-any.whl

SHA256: 96bfad10b1c65ef90ae02ef15cd33c3d3fe9a2b00c4092e6f1dafcfd746ecb87
MD5: 4d5835802235f5b625cdc0218d194bb0
BLAKE2b-256: d3ebe8b1ea193aa4fa4486c9185d3045ac73e51f55a49eb0225e41e8acf44a4a

See the pip documentation on hash-checking mode for more details on using hashes.
