PyAirbyte

PyAirbyte brings the power of Airbyte to every Python developer. PyAirbyte provides a set of utilities to use Airbyte connectors in Python. It is meant to be used in situations where setting up an Airbyte server or cloud account is not possible or desirable.

Getting Started

Watch this Getting Started Loom video or run one of our Quickstart tutorials below to see how you can use PyAirbyte in your Python code.
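
As a quick orientation, here is a minimal sketch of a typical PyAirbyte flow: it uses the demo source-faker connector, reads into the default local cache, and loads one stream as a pandas DataFrame. The connector name, config fields, and "users" stream are specific to source-faker and are used here only for illustration.

import airbyte as ab

# Install (if needed) and configure the demo source-faker connector.
source = ab.get_source(
    "source-faker",
    config={"count": 1000},
    install_if_missing=True,
)
source.check()
source.select_all_streams()

# Read into the default local cache, then load the "users" stream as a DataFrame.
result = source.read()
users_df = result["users"].to_pandas()
print(users_df.head())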

Secrets Management

PyAirbyte can auto-import secrets from the following sources:

  1. Environment variables.
  2. Variables defined in a local .env ("Dotenv") file.
  3. Google Colab secrets.
  4. Manual entry via getpass.

Note: Additional secret store options may be supported in the future. More info here.

Retrieving Secrets

from airbyte import get_secret, get_source, SecretSource

source = get_source("source-github")
source.set_config(
    {
        "credentials": {
            "personal_access_token": get_secret("GITHUB_PERSONAL_ACCESS_TOKEN"),
        }
    }
)

The get_secret() function accepts an optional source argument of enum type SecretSource. If omitted or set to SecretSource.ANY, PyAirbyte will search all available secret sources. If source is set to a specific source, only that source will be checked. If a list of SecretSource entries is passed, the sources will be checked in the provided order.

By default, PyAirbyte will prompt the user for any requested secrets that are not provided via other secret managers. You can disable this prompt by passing prompt=False to get_secret().
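
For example, the sketch below restricts the lookup to environment variables and a local .env file and disables the interactive prompt. The SecretSource member names used here (ENV, DOTENV) are assumptions based on the list of supported sources above.

from airbyte import get_secret, SecretSource

# Check environment variables first, then the local .env file, in that order,
# and do not fall back to an interactive prompt if the secret is not found.
# (SecretSource.ENV and SecretSource.DOTENV are assumed member names.)
token = get_secret(
    "GITHUB_PERSONAL_ACCESS_TOKEN",
    source=[SecretSource.ENV, SecretSource.DOTENV],
    prompt=False,
)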

Connector compatibility

To make a connector compatible with PyAirbyte, the following requirements must be met:

  • The connector must be a Python package, with a pyproject.toml or a setup.py file.
  • In the package, there must be a run.py file that contains a run method. This method should read arguments from the command line and run the connector with them, outputting messages to stdout. (A minimal sketch appears after the examples below.)
  • The pyproject.toml or setup.py file must specify a command line entry point for the run method called source-<connector name>. This is usually done by adding a console_scripts section to the pyproject.toml file, or an entry_points section to the setup.py file. For example:
In pyproject.toml:

[tool.poetry.scripts]
source-my-connector = "my_connector.run:run"

Or in setup.py:

setup(
    ...
    entry_points={
        'console_scripts': [
            'source-my-connector = my_connector.run:run',
        ],
    },
    ...
)
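
For reference, a minimal run.py might look like the sketch below. This assumes a connector built on the Airbyte Python CDK; SourceMyConnector is a hypothetical class name, and the exact import path may vary by CDK version.

# run.py -- a minimal sketch, assuming an Airbyte Python CDK-based connector.
# SourceMyConnector is a hypothetical class name used for illustration.
import sys

from airbyte_cdk.entrypoint import launch

from .source import SourceMyConnector


def run() -> None:
    # Parse command-line arguments (spec/check/discover/read) and run the
    # connector, writing Airbyte protocol messages to stdout.
    source = SourceMyConnector()
    launch(source, sys.argv[1:])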

To publish a connector to PyPI, specify the pypi section in the metadata.yaml file. For example:

data:
  # ...
  remoteRegistries:
    pypi:
      enabled: true
      packageName: "airbyte-source-my-connector"

Validating source connectors

To validate a source connector for compliance, use the airbyte-lib-validate-source script:

airbyte-lib-validate-source --connector-dir . --sample-config secrets/config.json

The script will install the Python package in the provided directory and run the connector against the provided config. The config should be a valid JSON file with the same structure as the one that would be provided to the connector in Airbyte. The script will exit with a non-zero exit code if the connector fails to run.

For a more lightweight check, the --validate-install-only flag can be used. This only checks that the connector can be installed and returns a spec; no sample config is required.
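
For example, combining the flag with the --connector-dir option shown above:

airbyte-lib-validate-source --connector-dir . --validate-install-only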

Contributing

To learn how you can contribute to PyAirbyte, please see our PyAirbyte Contributors Guide.

Frequently Asked Questions

1. Does PyAirbyte replace Airbyte? No.

2. What is the PyAirbyte cache? Is it a destination? Yes, you can think of it as a built-in destination implementation, but we avoid the word "destination" in our docs to prevent confusion with our certified destinations list here.

3. Does PyAirbyte work with data orchestration frameworks like Airflow, Dagster, and Snowpark? Yes, it should. Please give it a try and report any problems you see. Also, drop us a note if it works for you!

4. Can I use PyAirbyte to develop or test Airbyte sources? Yes, you can, but only for Python-based sources.

5. Can I develop traditional ETL pipelines with PyAirbyte? Yes. Just pick the cache type matching the destination, such as SnowflakeCache for landing data in Snowflake. (A minimal sketch appears after this list.)

6. Can PyAirbyte import a connector from a local directory that has Python project files, or does it have to be pip installed? Yes, PyAirbyte can use any local install that provides a CLI, and it will automatically find connectors by name if they are on PATH.
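
For item 5 above, a minimal sketch of landing data in Snowflake might look like the following. The SnowflakeCache constructor fields shown (account, username, password, database, warehouse, role) are assumptions about typical connection settings, and source-faker is used only as an example source.

import airbyte as ab
from airbyte.caches import SnowflakeCache

# Build a Snowflake-backed cache. The field names below are assumed; check the
# SnowflakeCache class for the exact parameters in your PyAirbyte version.
cache = SnowflakeCache(
    account=ab.get_secret("SNOWFLAKE_ACCOUNT"),
    username=ab.get_secret("SNOWFLAKE_USERNAME"),
    password=ab.get_secret("SNOWFLAKE_PASSWORD"),
    database=ab.get_secret("SNOWFLAKE_DATABASE"),
    warehouse=ab.get_secret("SNOWFLAKE_WAREHOUSE"),
    role=ab.get_secret("SNOWFLAKE_ROLE"),
)

source = ab.get_source("source-faker", config={"count": 1000}, install_if_missing=True)
source.check()
source.select_all_streams()

# Records are written to Snowflake tables managed by the cache.
result = source.read(cache=cache)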

Changelog and Release Notes

For a version history and list of all changes, please see our GitHub Releases page.
