endurance_etl
Simple ETL pipelines configured in JSON and built on Polars and DuckDB
WARNING: PLACEHOLDER FOR LATER DEVELOPMENT
Introduction
Use endurance_etl to execute simple, JSON-configured ETL pipelines:
import json

from endurance_etl import Tars

CONFIG = "sample.json"

with open(CONFIG) as f:
    print(f.read())
# Output:
{
    "SOURCES": [
        {
            "name": "csv_file_source",
            "path": "csv_file_source.csv",
            # ...other_kwargs
        }
    ],
    "TARGETS": [
        {
            "name": "csv_file_target",
            "source": "source/csv_file_source",
            "target": "csv_file_target.csv",
            "transforms": [
                {
                    "function": "lambda df: df + 1"
                }
            ]
            # ...other_kwargs
        }
    ]
}
tars = Tars.from_json(CONFIG)
tars.do()
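To make the config semantics concrete, the sketch below approximates what executing such a pipeline amounts to with plain Polars. It is illustrative only: run_config is a hypothetical helper, not part of the endurance_etl API, and eval-ing the "function" strings is shown solely to mirror the config above (doing this with untrusted configs is unsafe).

import json

import polars as pl


def run_config(path: str) -> None:
    # Hypothetical helper, not the actual endurance_etl implementation.
    with open(path) as f:
        config = json.load(f)

    # Index sources by name so a target's "source/<name>" reference resolves.
    sources = {src["name"]: src for src in config["SOURCES"]}

    for target in config["TARGETS"]:
        # "source/csv_file_source" -> "csv_file_source"
        source_name = target["source"].split("/", 1)[1]
        df = pl.read_csv(sources[source_name]["path"])

        # Transforms are lambda strings in the config; eval mirrors the
        # "function" field above but is unsafe for untrusted input.
        for transform in target.get("transforms", []):
            df = eval(transform["function"])(df)

        df.write_csv(target["target"])


run_config("sample.json")

In the same spirit, since the package also names DuckDB, a query-backed source could plausibly hand its result to Polars via duckdb.sql(...).pl().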