export plenario data sets to s3

Project description

Build status: https://travis-ci.org/UrbanCCD-UChicago/plenario-exporter-s3.svg?branch=master
Coverage: https://coveralls.io/repos/github/UrbanCCD-UChicago/plenario-exporter-s3/badge.svg?branch=master

Usage

This is both a simple and a rather complicated application. The actions it performs are straightforward: create a job to export a data set, zip up the generated CSVs, upload them to Amazon S3, and email the requestor.
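At a high level the flow looks something like the sketch below. The names are illustrative, not the app's actual internals, but they show the shape of the job:

# A simplified sketch of the export flow -- all names here are
# hypothetical, not this app's actual API.
import zipfile

import boto3
from django.core.mail import send_mail


def run_export_job(job, csv_paths):
    # Zip up the generated CSVs...
    archive_path = '/tmp/{}.zip'.format(job.pk)
    with zipfile.ZipFile(archive_path, 'w') as archive:
        for path in csv_paths:
            archive.write(path)

    # ...upload the archive to S3...
    s3 = boto3.client('s3')
    s3.upload_file(archive_path, 'my-export-bucket', '{}.zip'.format(job.pk))

    # ...and email the requestor a link to the download.
    send_mail(
        'Your export is ready',
        'https://my-export-bucket.s3.amazonaws.com/{}.zip'.format(job.pk),
        'noreply@example.com',
        [job.requestor_email],
    )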

The application uses django-channels, Django's new asynchronous back end, which relieves us of the hassle of running separate instances for celery and flower. It has its own baggage, though: in production it needs its own server environment and relies on Redis for message passing.

Overall, it's pretty snappy, and I'm confident that channels, not celery, is the future of Django.

To wire this up, it’s a relatively simple install:

# your site/settings.py
import os  # needed for the CHANNEL_LAYERS config below

INSTALLED_APPS = [
    'whatever django and local stuff',
    'channels',  # necessary to make exporter go
    'plenario_exporter_s3',  # this app
]

CHANNEL_LAYERS = {
    'default': {
        'BACKEND': 'asgi_redis.RedisChannelLayer',
        'CONFIG': {
            'hosts': [os.environ.get('REDIS_URL', 'redis://localhost:6379')],
        },
        'ROUTING': 'plenario_exporter_s3.routing.channel_routing',
    },
}

You're also going to need an asgi.py file alongside your wsgi.py file:

import os
import channels.asgi

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'yoursite.settings')
channel_layer = channels.asgi.get_channel_layer()
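In production, channels 1.x then needs an interface server and at least one worker process. For illustration, assuming your project module is named yoursite:

$ daphne yoursite.asgi:channel_layer --port 8000
$ python manage.py runworker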

And in your views, wire up the provided export service function as the async_handler argument to the plenario-core export view:

from plenario_core.views.export import GenericMetaExportView
from plenario_exporter_s3.services import create_models_and_start_job

from .models import EtlEventMeta

export_meta = GenericMetaExportView.as_view(
    models=[EtlEventMeta],
    async_handler=create_models_and_start_job)
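
Then route a URL to the view as usual. A minimal sketch, assuming the view above lives in your project's views module and a Django 1.x-era URLconf:

# your site/urls.py
from django.conf.urls import url

from .views import export_meta

urlpatterns = [
    url(r'^export/$', export_meta),
]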

Development

Fire up a virtualenv and install the dev requirements:

$ python3.6 -m venv .env
$ source .env/bin/activate
$ pip install -r dev-requirements.txt

To run the tests, pull the PostGIS docker image in a separate terminal and create the database:

$ docker pull mdillon/postgis
$ docker run -d -p 5432:5432 mdillon/postgis
$ docker ps
...
$ docker exec -it {container hash} /bin/bash
...
# su postgres -c psql
...
> create database plenario;
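
Your settings then need to point at that container. A minimal sketch, assuming the image's default postgres user with no password:

# your site/settings.py -- assumes the mdillon/postgis defaults
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'plenario',
        'USER': 'postgres',
        'HOST': 'localhost',
        'PORT': 5432,
    },
}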

You’re also going to need a local Redis server running:

$ docker pull redis
$ docker run -d -p 6379:6379 redis
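
You can sanity-check it from inside the container:

$ docker exec -it {container hash} redis-cli ping
PONG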

Then all you have to do is run the tests normally:

$ coverage run manage.py test
$ coverage report
$ flake8

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files available for this release. See tutorial on generating distribution archives.

Built Distribution

plenario_exporter_s3-0.0.2-py3-none-any.whl (15.5 kB)

Uploaded: Python 3

File details

Details for the file plenario_exporter_s3-0.0.2-py3-none-any.whl.

File hashes

Hashes for plenario_exporter_s3-0.0.2-py3-none-any.whl

Algorithm     Hash digest
SHA256        81a03dce507b863a20827ae94ff33caf430ff2c7f6252be1b27c0d8c685634e1
MD5           8cacfea7775729b21e10dfad4cfb732f
BLAKE2b-256   9ce232e552df7a83b97439b0a7beeb65a562011fe528e6ef15640884312a6d2b
