
Airflow operators to be used with Bigeye. Supports Airflow version 1.10.15.

Project description

Bigeye Airflow Operators for Airflow Version 1.10.15

Operators

Create Metric Operator (bigeye_airflow.operators.create_metric_operator)

The CreateMetricOperator creates metrics from a list of metric configurations passed to the operator, filling in reasonable defaults (such as thresholds) where they are omitted. It authenticates through an Airflow connection ID and can optionally run the metrics once they have been created. Please review the link below to understand the structure of the configurations.

Create or Update Metric Swagger

Parameters

  1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
  2. warehouse_id: int - The Bigeye source/warehouse id to which the metric configurations will be deployed.
  3. configuration: List[dict] - A list of metric configurations conforming to the following schema.
    schema_name: str
    table_name: str
    column_name: str
    metric_template_id: uuid.UUID
    metric_name: str
    notifications: List[str]
    thresholds: List[dict]
    filters: List[str]
    group_by: List[str]
    user_defined_metric_name: str
    metric_type: SimpleMetricCategory
    default_check_frequency_hours: int
    update_schedule: str
    delay_at_update: str
    timezone: str
    should_backfill: bool
    lookback_type: str
    lookback_days: int
    window_size: str
    _window_size_seconds
    
  4. run_after_upsert: bool - If true it will run the metrics after creation. Defaults to False.
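To make the schema above concrete, here is a minimal sketch of a configuration entry and how the operator might be wired into a task. All field values (names, addresses, IDs) are hypothetical, and the import path and keyword names are taken from the documentation above; check them against your installed version.

```python
# Hypothetical metric configuration matching the documented schema.
# Values are illustrative only, not Bigeye defaults.
configuration = [
    {
        "schema_name": "analytics",
        "table_name": "orders",
        "column_name": "order_total",
        "metric_name": "average_order_total",
        "notifications": ["data-team@example.com"],
        "thresholds": [],  # left empty so the operator fills in its defaults
        "filters": [],
        "group_by": [],
        "default_check_frequency_hours": 24,
        "should_backfill": False,
        "lookback_days": 7,
    }
]

# Inside a DAG, the operator would then be instantiated roughly like this
# (commented out so the sketch runs without Airflow installed):
#
# from bigeye_airflow.operators.create_metric_operator import CreateMetricOperator
#
# create_metrics = CreateMetricOperator(
#     task_id="create_bigeye_metrics",   # task_id is a standard Airflow argument
#     connection_id="bigeye_conn",       # Airflow connection holding the credential
#     warehouse_id=123,                  # hypothetical Bigeye warehouse ID
#     configuration=configuration,
#     run_after_upsert=True,             # run the metrics right after creation
# )
```

Fields not needed for a given metric (e.g. metric_template_id, window_size) can simply be omitted if the operator treats them as optional; verify against the Create or Update Metric Swagger linked above.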

Run Metrics Operator

The RunMetricsOperator will run metrics in Bigeye based on the following:

  1. All metrics for a given table, by providing warehouse ID, schema name and table name.
  2. Any and all metrics, given a list of metric IDs.

Currently, if a list of metric IDs is provided, those metrics are run and the warehouse_id, schema_name, and table_name parameters are ignored.

Parameters

  1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
  2. warehouse_id: int - The Bigeye source/warehouse id for which metrics will be run.
  3. schema_name: str - The schema name for which metrics will be run.
  4. table_name: str - The table name for which metrics will be run.
  5. metric_ids: List[int] - The metric ids to run.
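The precedence rule described above (explicit metric IDs win over table-level selection) can be sketched as plain Python. The function name and return shape here are illustrative, not the operator's actual internals:

```python
from typing import Any, Dict, List, Optional


def select_run_target(metric_ids: Optional[List[int]],
                      warehouse_id: int,
                      schema_name: str,
                      table_name: str) -> Dict[str, Any]:
    """Illustrative sketch of RunMetricsOperator's documented precedence."""
    if metric_ids:
        # A non-empty metric_ids list is run as-is; the table selector is ignored.
        return {"metric_ids": metric_ids}
    # Otherwise, run all metrics for the given warehouse/schema/table.
    return {
        "warehouse_id": warehouse_id,
        "schema_name": schema_name,
        "table_name": table_name,
    }
```

For example, passing metric_ids=[101, 102] selects only those two metrics even if a schema and table name are also supplied.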

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distribution

bigeye_airflow1-0.0.28-py3-none-any.whl (12.0 kB, uploaded for Python 3)

File details

Details for the file bigeye_airflow1-0.0.28-py3-none-any.whl.

File metadata

File hashes

Hashes for bigeye_airflow1-0.0.28-py3-none-any.whl
SHA256: 75850d3e7ccaa251d72f29e2a03104190533aeedac216c276cbf07bff152af38
MD5: 8e9da297bfef913fc45e2f36ac380790
BLAKE2b-256: a915f7178e00c19c170d1ad46a344cc7121e6a401a87d5bc7e35b431fcd95549

