Viadot
A simple data ingestion library to guide data flows from some places to other places.
Getting Data from a Source
viadot supports several sources. For instance, the UK Carbon Intensity API requires no credentials.
from viadot.sources.uk_carbon_intensity import UKCarbonIntensity

ukci = UKCarbonIntensity()  # no credentials required for this source
ukci.query("/intensity")    # query the /intensity endpoint
df = ukci.to_df()           # materialize the results as a pandas DataFrame
The code above pulls UK Carbon Intensity data from the external API into a local pandas DataFrame (df).
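Conceptually, turning such an API response into a DataFrame means flattening nested JSON into tabular rows. The sketch below shows this with plain pandas and a canned payload shaped like the public /intensity response; it is not viadot code, and the values are illustrative.

```python
import pandas as pd

# A canned payload shaped like the UK Carbon Intensity /intensity response
# (structure per the public API docs; values are illustrative).
payload = {
    "data": [
        {
            "from": "2021-01-01T00:00Z",
            "to": "2021-01-01T00:30Z",
            "intensity": {"forecast": 200, "actual": 210, "index": "moderate"},
        }
    ]
}

# Flatten the nested JSON into one row per half-hour window; nested keys
# become dot-separated column names such as "intensity.actual".
df = pd.json_normalize(payload["data"])
print(sorted(df.columns))
```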
Loading Data to a Source
TODO
Running tests
Start the containers with run.sh, then run the suite inside the testing container:
docker exec -it viadot_testing bash
cd tests/ && pytest .
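pytest discovers functions prefixed with `test_` in files prefixed with `test_`. A minimal sketch of that convention (the stub and assertions are illustrative, not taken from the viadot test suite):

```python
# test_example.py -- pytest collects functions named test_* automatically.

def to_df_stub(records):
    """Stand-in for a source's to_df(): return the rows unchanged."""
    return list(records)

def test_to_df_stub_returns_all_rows():
    rows = [{"actual": 210}, {"actual": 195}]
    assert to_df_stub(rows) == rows
```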
Running flows locally
Start the containers with run.sh, then activate the environment and run a flow:
poetry shell
FLOW_NAME=supermetrics_to_azure_sql; python -m viadot.flows.$FLOW_NAME
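Running a flow via `python -m viadot.flows.$FLOW_NAME` relies on the module executing its flow from a `__main__` guard. A minimal sketch of that pattern (the function name and return value are illustrative, not viadot internals):

```python
# hypothetical_flow.py -- sketch of the `python -m package.module` entry pattern.

def run_flow():
    # In a real flow module this would build and execute the actual flow.
    return "flow finished"

if __name__ == "__main__":
    # Executed only when the module is run directly, e.g. via `python -m`.
    print(run_flow())
```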
Uploading the package to PyPI
Generate the requirements.txt file from poetry:
poetry export -f requirements.txt --output requirements.txt --with-credentials --dev
And then publish with poetry.
poetry update
poetry publish --build
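`poetry publish --build` reads the package metadata from pyproject.toml. A minimal sketch of the fields involved (the name and tagline come from this project; the author and exact layout are illustrative, not the project's actual file):

```toml
[tool.poetry]
name = "viadot"
version = "0.1.9"
description = "A simple data ingestion library to guide data flows from some places to other places"
authors = ["Example Author <author@example.com>"]

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```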