==========================================
django-spark - Event sourcing and handling
==========================================
.. image:: https://travis-ci.org/matthiask/django-spark.png?branch=master
   :target: https://travis-ci.org/matthiask/django-spark
Version |release|
This is not supposed to be real documentation; it's more a reminder for
myself.
The idea is that there are event sources and event handlers. Event
sources produce a stream of ``spark.api.Event`` instances, where each
event must have a ``group`` and a ``key``. Additional data may be
attached to the ``Event`` as well. Keys are globally unique: events
with the same key are processed exactly once. Groups determine which
handlers handle a given event.
Event handlers are functions which are called once per
``spark.api.Event`` instance if the event's group matches the event
handler's regex.
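For illustration, a minimal (hypothetical) event and a matching handler
could look like this, mirroring the dict-based events yielded in the
examples below; the names are made up for illustration:

.. code-block:: python

    from spark import api

    # An event is identified by its globally unique key; the group decides
    # which handlers see it.
    event = {"group": "user_registered", "key": "user_registered_42"}

    def handle_user_events(event):
        print("handling", event["key"])

    # The handler is called for every event whose group matches this regex.
    api.register_group_handler(handler=handle_user_events, group=r"^user_")

    # Dispatch: only events with previously unseen keys are handled.
    api.process_events(api.only_new_events([event]))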
Some usage example code
=======================
Given a challenge, create events for the challenge (the specifics do not
matter):
.. code-block:: python

    from datetime import date

    from spark import api

    def events_from_challenge(challenge):
        if not challenge.is_active:
            return

        yield {
            "group": "challenge_created",
            "key": "challenge_created_%s" % challenge.pk,
            "context": {"challenge": challenge},
        }

        if (date.today() - challenge.start_date).days > 2:
            if challenge.donations.count() < 2:
                yield {
                    "group": "challenge_inactivity_2d",
                    "key": "challenge_inactivity_2d_%s" % challenge.pk,
                    "context": {"challenge": challenge},
                }

        if (challenge.end_date - date.today()).days <= 2:
            yield {
                "group": "challenge_ends_2d",
                "key": "challenge_ends_2d_%s" % challenge.pk,
                "context": {"challenge": challenge},
            }

        if challenge.end_date < date.today():
            yield {
                "group": "challenge_ended",
                "key": "challenge_ended_%s" % challenge.pk,
                "context": {"challenge": challenge},
            }
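Such a source is just a generator. As a quick sketch, assuming you
already have a ``challenge`` instance at hand, you can run it through
the same helpers the management command further below uses to
deduplicate and dispatch events:

.. code-block:: python

    from spark import api

    # Only events whose keys have not been processed before are handled.
    api.process_events(api.only_new_events(events_from_challenge(challenge)))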
Send mails related to challenges (uses django-authlib's
``render_to_mail``):
.. code-block:: python

    from authlib.email import render_to_mail

    def send_challenge_mails(event):
        challenge = event["context"]["challenge"]
        render_to_mail(
            # Different mail text per event group:
            "challenges/mails/%s" % event["group"],
            {"challenge": challenge},
            to=[challenge.user.email],
        ).send(fail_silently=True)
Register the handlers:
.. code-block:: python

    from django.apps import AppConfig

    class ChallengesConfig(AppConfig):
        def ready(self):
            # Prevent circular imports:
            from spark import api

            api.register_group_handler(
                handler=send_challenge_mails,
                group=r"^challenge",
            )

            Challenge = self.get_model("Challenge")
            # All this does right now is register a post_save signal
            # handler which runs the challenge instance through
            # events_from_challenge:
            api.register_model_event_source(
                sender=Challenge,
                source=events_from_challenge,
            )
Now, events are generated and handled directly in process.
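For example (a sketch, assuming a ``challenge`` instance), simply saving
a challenge triggers the registered ``post_save`` event source, and
handlers whose regex matches the generated groups run immediately:

.. code-block:: python

    # post_save runs the instance through events_from_challenge; matching
    # handlers such as send_challenge_mails are called right away.
    challenge.save()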
Alternatively, you might want to handle events outside the
request-response cycle. To do this, register the model event source
only in a management command, then send all model instances through
all registered event sources and process the resulting events directly,
for example like this:
.. code-block:: python

    from spark import api

    api.register_model_event_source(...)

    # Copied from the process_spark_sources management command inside
    # this repository:
    for model, sources in api.MODEL_SOURCES.items():
        for instance in model.objects.all():
            for source in sources:
                api.process_events(api.only_new_events(source(instance)))
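If you want to run this on a schedule (e.g. via cron), one option is to
wrap the registration and the loop above in your own management
command. A minimal sketch, assuming a hypothetical
``challenges/management/commands/process_challenge_events.py`` and a
hypothetical ``challenges.events`` module; the ``process_spark_sources``
command shipped with this repository does essentially the same thing:

.. code-block:: python

    from django.core.management.base import BaseCommand

    from challenges.events import events_from_challenge  # hypothetical location
    from challenges.models import Challenge
    from spark import api

    class Command(BaseCommand):
        help = "Run challenges through their event sources and process new events"

        def handle(self, *args, **options):
            # Register the source here (not in AppConfig.ready()) so that
            # events are only generated when this command runs.
            api.register_model_event_source(
                sender=Challenge,
                source=events_from_challenge,
            )
            for model, sources in api.MODEL_SOURCES.items():
                for instance in model.objects.all():
                    for source in sources:
                        api.process_events(api.only_new_events(source(instance)))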
- `Documentation <https://django-spark.readthedocs.io>`_
- `Github <https://github.com/matthiask/django-spark/>`_