CKAN integration for Dataflows.
Dataflows processors to work with CKAN.
The package uses semantic versioning, which means major versions may include breaking changes. It is therefore recommended to specify a package version range in your setup/requirements file.

Install with pip:

```
$ pip install dataflows-ckan
```
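As a sketch of such a version pin in a requirements file — the range below is purely illustrative; adjust it to the release you have actually tested against:

```
# requirements.txt — illustrative range, not a recommendation
# for a specific release
dataflows-ckan>=0.3,<0.4
```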
These processors have to be used as part of a data flow. For example:
```python
from dataflows import Flow, load
from dataflows_ckan import dump_to_ckan

flow = Flow(
    load('data/data.csv'),
    dump_to_ckan(
        host,
        api_key,
        owner_org,
        overwrite_existing_data=True,
        push_to_datastore=False,
        push_to_datastore_method='insert',
        **options,
    ),
)
flow.process()
```
`dump_to_ckan` saves the data package to a CKAN instance.
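The `host`, `api_key`, and `owner_org` arguments in the example above are plain values; one way to keep credentials out of source code is to read them from the environment. A minimal sketch — the environment-variable names and the `ckan_settings` helper are assumptions for illustration, not part of the dataflows-ckan API:

```python
import os

def ckan_settings(env=None):
    """Collect CKAN connection settings from environment variables.

    CKAN_HOST / CKAN_API_KEY / CKAN_OWNER_ORG are hypothetical
    variable names chosen for this sketch.
    """
    env = os.environ if env is None else env
    return {
        'host': env.get('CKAN_HOST', 'https://demo.ckan.org'),
        'api_key': env.get('CKAN_API_KEY', ''),
        'owner_org': env.get('CKAN_OWNER_ORG', 'my-org'),
    }
```

The resulting dict could then be unpacked into the processor call, assuming the parameter names match, e.g. `dump_to_ckan(**ckan_settings(), overwrite_existing_data=True)`.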
Create a virtual environment and install Poetry. Then install the package in editable mode:

```
$ make install
```

Run the tests:

```
$ make test
```

Format your code:

```
$ make format
```
- Full port to dataflows, and some refactoring, with a basic integration test.
- An initial port from https://github.com/frictionlessdata/datapackage-pipelines-ckan, based on the great work of @brew and @amercader.