
A CLI to work with DataHub metadata

Project description

Introduction to Metadata Ingestion

:::tip Find Integration Source
Please see our Integrations page to browse our ingestion sources and filter on their features.
:::

Integration Methods

DataHub offers three methods for data ingestion:

  • UI Ingestion: Easily configure and execute a metadata ingestion pipeline through the UI.
  • CLI Ingestion: Configure the ingestion pipeline in YAML and execute it through the CLI.
  • SDK-based Ingestion: Use the Python or Java emitter to programmatically control ingestion pipelines.
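As a sketch of the SDK route, the snippet below hand-builds a minimal metadata change proposal using only the standard library. In a real pipeline the Python SDK's emitter handles serialization and transport for you, so treat the exact field names and aspect shape here as illustrative rather than a complete reference; the dataset URN format, however, follows DataHub's documented convention.

```python
import json

# Dataset URNs follow the pattern:
# urn:li:dataset:(urn:li:dataPlatform:<platform>,<name>,<env>)
dataset_urn = "urn:li:dataset:(urn:li:dataPlatform:mysql,mydb.users,PROD)"

# A simplified change proposal: upsert a datasetProperties aspect.
# (Field names are illustrative of the MCP shape, not exhaustive.)
proposal = {
    "entityType": "dataset",
    "entityUrn": dataset_urn,
    "changeType": "UPSERT",
    "aspectName": "datasetProperties",
    "aspect": {
        "description": "User accounts table",
        "customProperties": {"team": "growth"},
    },
}

payload = json.dumps(proposal)
# A real push-based pipeline would hand this to the SDK emitter,
# which POSTs it to your DataHub server rather than printing it.
print(payload)
```

The Python SDK wraps this construction and the HTTP transport behind its emitter classes, so application code normally never serializes proposals by hand.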

Types of Integration

Integrations fall into two categories based on how metadata reaches DataHub:

Push-based Integration

Push-based integrations allow you to emit metadata directly from your data systems when metadata changes. Examples of push-based integrations include Airflow, Spark, Great Expectations and Protobuf Schemas. This allows you to get low-latency metadata integration from the "active" agents in your data ecosystem.

Pull-based Integration

Pull-based integrations allow you to "crawl" or "ingest" metadata from the data systems by connecting to them and extracting metadata in a batch or incremental-batch manner. Examples of pull-based integrations include BigQuery, Snowflake, Looker, Tableau and many others.

Core Concepts

The following are the core concepts related to ingestion:

  • Sources: Data systems from which metadata is extracted (e.g., BigQuery, MySQL)
  • Sinks: Destinations for metadata (e.g., File, DataHub)
  • Recipe: The main configuration for an ingestion run, in the form of a YAML file
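A recipe ties a source and a sink together. The minimal example below is a sketch: the exact config keys vary by connector, so treat the MySQL options and credentials here as placeholders rather than a complete reference.

```yaml
# recipe.yaml — illustrative ingestion recipe (config keys vary by source type)
source:
  type: mysql
  config:
    host_port: "localhost:3306"
    database: mydb
    username: datahub
    password: example  # placeholder credential

sink:
  type: datahub-rest
  config:
    server: "http://localhost:8080"  # your DataHub server endpoint
```

With the CLI method above, a recipe like this is executed with `datahub ingest -c recipe.yaml`.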

For more advanced guides, please refer to the DataHub ingestion documentation.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

File details

Details for the file acryl_datahub_hcc_patched-1.4.0.9.post2-py3-none-any.whl.

File metadata

File hashes

Hashes for acryl_datahub_hcc_patched-1.4.0.9.post2-py3-none-any.whl

Algorithm    Hash digest
SHA256       219ac3e2cc2e027e2f470768175ccf2eddd373f455b528b01eec194f26848a8c
MD5          c4520b6f0f1a7b50dbcd5f21eafad29f
BLAKE2b-256  ce6c1400fe0e6c38889082d3b1e57281c458c38003ed8477090421910519e752

See more details on using hashes here.
