
mbq.atomiq: database-backed queues
==================================

.. image:: https://img.shields.io/pypi/v/mbq.atomiq.svg
    :target: https://pypi.python.org/pypi/mbq.atomiq

.. image:: https://img.shields.io/pypi/l/mbq.atomiq.svg
    :target: https://pypi.python.org/pypi/mbq.atomiq

.. image:: https://img.shields.io/pypi/pyversions/mbq.atomiq.svg
    :target: https://pypi.python.org/pypi/mbq.atomiq

.. image:: https://img.shields.io/travis/managedbyq/mbq.atomiq/master.svg
    :target: https://travis-ci.org/managedbyq/mbq.atomiq

Installation
------------

.. code-block:: bash

    $ pip install mbq.atomiq


Getting started
---------------

1. Add `mbq.atomiq` to `INSTALLED_APPS` in your Django application's settings
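For example, in a typical Django settings module (the other entries shown here are illustrative):

.. code-block:: python

    INSTALLED_APPS = [
        "django.contrib.contenttypes",
        "django.contrib.auth",
        # ... your other apps ...
        "mbq.atomiq",
    ]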

2. Add `ATOMIQ`-specific settings to that same settings file. These are used for metrics.

.. code-block:: python

    ATOMIQ = {
        'env': CURRENT_ENV,
        'service': YOUR_SERVICE_NAME,
    }

3. Set up consumers for each queue type that your app needs. `mbq.atomiq` provides a handy management command for that:

.. code-block:: bash

    python -m manage atomic_run_consumer --queue sns

    python -m manage atomic_run_consumer --queue sqs

    python -m manage atomic_run_consumer --queue celery

Note that atomiq uses the Celery task's `name` attribute to import and call the task. By default, Celery sets the task name to `path.to.task.module.task_function_name`. Overriding the name of a task will break atomiq, so please don't do that.
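As an illustration, Celery's default name can be reproduced from the task function itself; a minimal sketch (not using Celery):

.. code-block:: python

    def default_task_name(func):
        # Reproduce Celery's default naming scheme:
        # "<module path>.<function name>"
        return "{}.{}".format(func.__module__, func.__name__)

    def send_email():
        pass

    # If this function lived in myapp/tasks.py, the default name
    # would be "myapp.tasks.send_email" -- the dotted path atomiq
    # uses to import and call the task.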

To avoid holding on to successfully executed or deleted tasks indefinitely, there is also a cleanup management command, which by default removes all processed tasks older than 30 days. That default can be overridden.

.. code-block:: bash

    python -m manage atomic_cleanup_old_tasks

or

.. code-block:: bash

    python -m manage atomic_cleanup_old_tasks --days N

or

.. code-block:: bash

    python -m manage atomic_cleanup_old_tasks --minutes N
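The cutoff is simply "now minus the configured window"; a hedged sketch of that computation (the command's actual internals may differ):

.. code-block:: python

    from datetime import datetime, timedelta

    def cleanup_cutoff(days=None, minutes=None):
        # Mirrors the command's options: --days N or --minutes N,
        # defaulting to 30 days when neither is given. Tasks
        # processed before the returned time are eligible for cleanup.
        if minutes is not None:
            window = timedelta(minutes=minutes)
        elif days is not None:
            window = timedelta(days=days)
        else:
            window = timedelta(days=30)
        return datetime.utcnow() - window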

4. Use it!

.. code-block:: python

    import mbq.atomiq

    mbq.atomiq.sns_publish(topic_arn, message)

    mbq.atomiq.sqs_publish(queue_url, message)

    mbq.atomiq.celery_publish(celery_task, *task_args, **task_kwargs)
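Conceptually, each publish call inserts a row into a database-backed queue table inside the caller's transaction, so the message commits or rolls back together with your other writes (the consumer processes drain the table later). A minimal sqlite3 sketch of this outbox pattern -- the table name and columns here are illustrative, not atomiq's actual schema:

.. code-block:: python

    import json
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE outbox (id INTEGER PRIMARY KEY, queue TEXT, payload TEXT)"
    )

    def publish(conn, queue, message):
        # Runs inside the caller's transaction: if the surrounding
        # work rolls back, the queued message disappears with it.
        conn.execute(
            "INSERT INTO outbox (queue, payload) VALUES (?, ?)",
            (queue, json.dumps(message)),
        )

    with conn:  # commit on success, roll back on exception
        publish(conn, "sns", {"hello": "world"})

    rows = conn.execute("SELECT queue, payload FROM outbox").fetchall()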

Monitoring
----------
<https://app.datadoghq.com/dash/895710/atomiq>


Testing
-------
Tests run automatically in `Travis CI <https://travis-ci.org/managedbyq/mbq.atomiq>`_, but you can also run them locally using ``docker-compose``.
We use ``tox`` for local testing across multiple Python environments. Before running the tests, use ``pyenv`` to install the following Python interpreters: CPython 2.7, 3.5, and 3.6, and pypy3.

.. code-block:: bash

    $ docker-compose up py36|py27|py37|pypy3

Testing in Other Services
-------------------------
When using atomiq in other services, you should not mock out atomiq's publish functions. This is because atomiq includes functionality to check that all usages are wrapped in a transaction, and can account for transactions added by Django in test cases. To allow you to test that the tasks you expect have been added to the queue, we expose a `test_utils` module.


Shipping a New Release
----------------------
1. Bump the version in ``__version__.py``
2. Go to ``Releases`` in GitHub and "Draft a New Release"
3. After you create the new release, Travis CI will pick it up and ship it to PyPI
