
Viadot

A simple data ingestion library to guide data flows from some places to other places

Getting Data from a Source

viadot supports a few sources. For instance, the UK Carbon Intensity API does not require credentials.

from viadot.sources.uk_carbon_intensity import UKCarbonIntensity

# Query the /intensity endpoint and load the response into a pandas DataFrame.
ukci = UKCarbonIntensity()
ukci.query("/intensity")
df = ukci.to_df()

The above code pulls UK Carbon Intensity data from the external API into a local pandas DataFrame (df).
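
From there the data can be handled with standard pandas tooling. A minimal sketch, assuming the snippet above assigned the result to df (the column names are not guaranteed; inspect df.columns for the actual schema):

# Preview the first rows, then persist the data locally as a CSV file.
print(df.head())
df.to_csv("uk_carbon_intensity.csv", index=False)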

Loading Data to a Source

TODO

Running tests

run.sh
docker exec -it viadot_testing bash
cd tests/ && pytest .
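
The tests can also be launched from a Python session via pytest's API, which is handy for running a subset of the suite; a minimal sketch (the -k expression is a hypothetical filter, not an actual test name):

import pytest

# Run only tests whose names match the expression, with verbose output.
# Equivalent to `pytest -v -k carbon tests/` on the command line.
pytest.main(["-v", "-k", "carbon", "tests/"])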

Running flows locally

run.sh
poetry shell
FLOW_NAME=supermetrics_to_azure_sql; python -m viadot.flows.$FLOW_NAME
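
A flow module can also be launched from a Python session rather than the shell one-liner; a minimal sketch using only the standard library, with the flow name taken from the example above:

import runpy

# Roughly equivalent to `python -m viadot.flows.supermetrics_to_azure_sql`.
runpy.run_module("viadot.flows.supermetrics_to_azure_sql", run_name="__main__")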

Uploading the package to PyPI

Generate the requirements.txt file from Poetry's lock file.

poetry export -f requirements.txt --output requirements.txt --with-credentials --dev

Then update dependencies and publish with Poetry.

poetry update
poetry publish --build
