Export Plenario data sets to S3
Project description
Usage
This is both a simple and a rather complicated application. The actions it performs are straightforward: create a job to export a data set, zip up the generated CSVs, upload them to Amazon S3, and email the requestor.
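For a sense of what that job does, here is a minimal, hypothetical sketch of the pipeline (the function, bucket name, and addresses are illustrative, not this app's actual internals), assuming boto3 and Django's mail helpers:

    import zipfile

    import boto3
    from django.core.mail import send_mail

    def export_and_notify(csv_paths, archive_name, requestor_email):
        # Zip up the generated CSVs into a single archive
        with zipfile.ZipFile(archive_name, 'w', zipfile.ZIP_DEFLATED) as archive:
            for path in csv_paths:
                archive.write(path)

        # Upload the archive to Amazon S3 (bucket name is illustrative)
        boto3.client('s3').upload_file(archive_name, 'plenario-exports', archive_name)

        # Email the requestor that the export is ready
        send_mail(
            'Your Plenario export is ready',
            'Your archive has been uploaded as {}.'.format(archive_name),
            'noreply@example.com',
            [requestor_email],
        )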
The application uses django-channels, a new asynchronous back end for Django, which relieves us of the hassle of running separate instances for celery and flower. It has its own baggage, though: in production it needs its own server environment, and it relies on Redis for message passing.
Overall, it’s pretty snappy, and I’m confident that channels, not celery, is the future of asynchronous work in Django.
Wiring this up is a relatively simple install:
    # your_site/settings.py
    import os

    INSTALLED_APPS = [
        # ... whatever Django and local stuff ...
        'channels',               # necessary to make exporter go
        'plenario_exporter_s3',   # this app
    ]

    CHANNEL_LAYERS = {
        'default': {
            'BACKEND': 'asgi_redis.RedisChannelLayer',
            'CONFIG': {
                'hosts': [os.environ.get('REDIS_URL', 'redis://localhost:6379')],
            },
            'ROUTING': 'plenario_exporter_s3.routing.channel_routing',
        },
    }
You’re also going to need an asgi.py file alongside your wsgi.py file:
    # your_site/asgi.py
    import os

    import channels.asgi

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'your_site.settings')
    channel_layer = channels.asgi.get_channel_layer()
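With channels 1.x (implied by the asgi_redis backend), the app runs as an interface server plus one or more worker processes rather than a single WSGI process. Starting them looks something like this (the port is illustrative):

    $ daphne your_site.asgi:channel_layer --port 8000
    $ python manage.py runworker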
And in your views, wire up the provided export service function as the async_handler argument to the plenario-core export view:
    from plenario_core.views.export import GenericMetaExportView
    from plenario_exporter_s3.services import create_models_and_start_job

    from .models import EtlEventMeta

    export_meta = GenericMetaExportView.as_view(
        models=[EtlEventMeta],
        async_handler=create_models_and_start_job)
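export_meta is an ordinary Django view callable, so it plugs into your URLconf like any other; a hypothetical example (the route is made up):

    # your_site/urls.py
    from django.conf.urls import url

    from .views import export_meta

    urlpatterns = [
        url(r'^export/$', export_meta),
    ]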
Development
Fire up a virtualenv and install the dev requirements:
    $ python3.6 -m venv .env
    $ source .env/bin/activate
    $ pip install -r dev-requirements.txt
To run the tests, pull in the PostGIS docker image in a separate terminal and create the database:
    $ docker pull mdillon/postgis
    $ docker run -d -p 5432:5432 mdillon/postgis
    $ docker ps
    ...
    $ docker exec -it {container hash} /bin/bash
    ...
    # su postgres -c psql
    ...
    > create database plenario;
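Your test settings will need to point Django at that database; a minimal sketch, assuming the mdillon/postgis image's default postgres user and trust authentication (your actual settings may differ):

    DATABASES = {
        'default': {
            'ENGINE': 'django.contrib.gis.db.backends.postgis',
            'NAME': 'plenario',
            'USER': 'postgres',
            'HOST': 'localhost',
            'PORT': '5432',
        },
    }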
You’re also going to need a local Redis server running:
    $ docker pull redis
    $ docker run -d -p 6379:6379 redis
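If you want to sanity-check the Redis container before running the tests, redis-cli's ping command should answer:

    $ docker exec -it {container hash} redis-cli ping
    PONG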
Then all you have to do is run the tests normally:
    $ coverage run manage.py test
    $ coverage report
    $ flake8