CKAN integration for Dataflows.
dataflows-ckan
Dataflows processors to work with CKAN.
Features
- dump_to_ckan processor
Getting Started
Installation
The package uses semantic versioning, which means major releases can include breaking changes. It's recommended to specify a version range in your setup/requirements file, e.g. package>=1.0,<2.0.
$ pip install dataflows-ckan
Examples
These processors have to be used as part of a data flow. For example:

    from dataflows import Flow, load
    from dataflows_ckan import dump_to_ckan

    flow = Flow(
        load('data/data.csv'),
        dump_to_ckan(
            host,
            api_key,
            owner_org,
            overwrite_existing_data=True,
            push_to_datastore=False,
            push_to_datastore_method='insert',
            **options,
        ),
    )
    flow.process()
Documentation
dump_to_ckan
Saves the DataPackage to a CKAN instance.
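Publishing a dataset to CKAN goes through CKAN's standard Action API (e.g. the package_create action). The helper below is an illustrative sketch, not part of this package's API: it only builds the URL, headers, and JSON body such a call would use, with the endpoint path and field names taken from the standard CKAN API. The field mapping shown is an assumption, not the exact mapping dump_to_ckan performs.

```python
import json

# Standard CKAN Action API endpoint for creating a dataset
CKAN_ACTION_PATH = "/api/3/action/package_create"

def build_package_create_request(host, api_key, owner_org, descriptor):
    """Build the URL, headers, and JSON body for a CKAN package_create call.

    `descriptor` is a datapackage descriptor dict; only a few common fields
    are mapped here for illustration.
    """
    url = host.rstrip("/") + CKAN_ACTION_PATH
    headers = {
        "Authorization": api_key,  # CKAN reads the API key from this header
        "Content-Type": "application/json",
    }
    payload = {
        "name": descriptor["name"],
        "title": descriptor.get("title", descriptor["name"]),
        "notes": descriptor.get("description", ""),
        "owner_org": owner_org,
    }
    return url, headers, json.dumps(payload)

# Example with hypothetical host, key, and org:
url, headers, body = build_package_create_request(
    "https://demo.ckan.org/", "my-api-key", "my-org",
    {"name": "my-dataset", "title": "My Dataset"},
)
```

The returned triple can be sent with any HTTP client; the key point is that the dataset metadata travels as a JSON body and the API key as the Authorization header.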
Contributing
Create a virtual environment and install Poetry.
Then install the package in editable mode:
$ make install
Run the tests:
$ make test
Format your code:
$ make format
Changelog
0.2.0
- Full port to dataflows, and some refactoring, with a basic integration test.
0.1.0
- An initial port from https://github.com/frictionlessdata/datapackage-pipelines-ckan, based on the great work of @brew and @amercader.