A simplistic task queue and cron-like scheduler for Django

Project description

Django Too Simple Queue

This package provides a simplistic task queue and scheduler for Django.

If execution of your tasks is mission critical, do not use this library; turn to more robust solutions such as Celery, as this package guarantees neither task execution nor unique execution.

It is geared towards basic apps where simplicity takes precedence over reliability. The package offers a simple decorator syntax, including cron-like schedules.

Features:

  • no celery/redis/rabbitmq/whatever... just Django !
  • clean decorator syntax to register tasks and schedules
  • simple queuing syntax
  • cron-like scheduling
  • tasks.py autodiscovery
  • django admin integration

Limitations:

  • probably not extremely reliable because of race conditions
  • no multithreading yet (but running multiple workers should work)

Installation

Install the library:

$ pip install django-toosimple-q

Enable the app in settings.py:

INSTALLED_APPS = [
    ...
    'django_toosimple_q',
    ...
]

Quickstart

Tasks need to be registered using the @register_task() decorator. Once registered, they can be added to the queue by calling the .queue() function.

from django_toosimple_q.decorators import register_task

# Register a task
@register_task()
def my_task(name):
    return f"Hello {name} !"

# Enqueue tasks
my_task.queue("John")
my_task.queue("Peter")

Registered tasks can be scheduled from code using this cron-like syntax:

from django_toosimple_q.decorators import register_task, schedule

# Register and schedule tasks
@schedule(cron="30 8 * * *", args=['John'])
@register_task()
def morning_routine(name):
    return f"Good morning {name} !"

To consume the tasks, you need to run at least one worker:

$ python manage.py worker

The workers will take care of adding scheduled tasks to the queue when needed, and will execute the tasks.

The package autoloads tasks.py from all installed apps. While this is the recommended place to define your tasks, you can do so from anywhere in your code.
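
In practice, tasks are usually queued from regular Django code such as views or signal handlers. A minimal sketch (the app, view and import below are hypothetical):

from django.http import HttpResponse

from myapp.tasks import my_task  # hypothetical app containing the task above

def signup_view(request):
    # Queue the work instead of doing it in the request/response cycle;
    # a worker will pick it up from the database on its next tick.
    my_task.queue(request.GET.get("name", "John"))
    return HttpResponse("Task queued !")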

Advanced usage

Tasks

You can optionally give a custom name to your tasks. This is required when your task is defined in a local scope.

@register_task("my_favourite_task")
def my_task(name):
    return f"Good morning {name} !"

You can set task priorities.

@register_task(priority=0)
def my_favourite_task(name):
    return f"Good bye {name} !"

@register_task(priority=1)
def my_other_task(name):
    return f"Hello {name} !"

# Enqueue tasks
my_other_task.queue("John")
my_favourite_task.queue("Peter")  # will be executed before the other one

You can define retries=N and retry_delay=S to retry the task in case of failure. The delay (in seconds) doubles on each failure.

@register_task(retries=10, retry_delay=60)
def send_email():
    ...
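
With the settings above and the doubling rule, the waits between attempts would roughly be (a worked example, not output from the library):

# failure 1 -> retry after  60 s
# failure 2 -> retry after 120 s
# failure 3 -> retry after 240 s
# ... and so on, up to 10 retries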

You can mark a task as unique=True if it shouldn't be queued again while an identical task (same arguments) is already queued. This is useful for tasks such as cleaning or refreshing.

@register_task(unique=True)
def cleanup():
    ...

cleanup.queue()
cleanup.queue()  # this will be ignored as long as the first one is still queued

You can assign tasks to specific queues, and have your workers consume only certain queues using --queue myqueue or --exclude_queue myqueue. By default, workers consume tasks from all queues.

@register_task(queue='long_running')
def long_task():
    ...

@register_task()
def short_task():
    ...

# Then run these workers, so that long-running tasks
# don't prevent short-running tasks from being run:
# manage.py worker --exclude_queue long_running
# manage.py worker

Schedules

By default, last_check is set to now() on schedule creation. This means the schedule will only run at the next cron occurrence. If you need your schedules to run as soon as possible after initialisation, you can specify last_check=None.

@schedule(cron="30 8 * * *", last_check=None)
@register_task()
def my_task(name):
    return f"Good morning {name} !"

By default, if some cron occurrences were missed (e.g. after a server shutdown or if the workers can't keep up with all tasks), the missed runs are lost. If you need the tasks to catch up, set catch_up=True.

@schedule(cron="30 8 * * *", catch_up=True)
@register_task()
def my_task(name):
    ...

You may define multiple schedules for the same task. In this case, it is mandatory to specify a unique name:

@schedule(name="morning_routine", cron="30 16 * * *", args=['morning'])
@schedule(name="afternoon_routine", cron="30 8 * * *", args=['afternoon'])
@register_task()
def my_task(time):
    return f"Good {time} John !"

You may have the schedule's cron datetime passed to the task as a keyword argument using the datetime_kwarg argument:

@schedule(cron="30 8 * * *", datetime_kwarg="scheduled_on")
@register_task()
def my_task(scheduled_on):
    return f"This was scheduled for {scheduled_on.isoformat()}."

Management command

Besides the standard Django management command arguments, the worker command supports the following arguments.

usage: manage.py worker [--queue QUEUE | --exclude_queue EXCLUDE_QUEUE]
                        [--tick TICK]
                        [--once | --until_done]
                        [--no_recreate | --recreate_only]

optional arguments:
  --queue QUEUE         which queue to run (can be used several times, all
                        queues are run if not provided)
  --exclude_queue EXCLUDE_QUEUE
                        which queue not to run (can be used several times, all
                        queues are run if not provided)
  --tick TICK           frequency in seconds at which the database is checked
                        for new tasks/schedules
  --once                run once then exit (useful for debugging)
  --until_done          run until no tasks are available then exit (useful for
                        debugging)
  --no_recreate         do not (re)populate the schedule table (useful for
                        debugging)
  --recreate_only       populates the schedule table then exit (useful for
                        debugging)
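
For example, combining the options above (the queue names are hypothetical):

$ python manage.py worker --queue emails --tick 5
$ python manage.py worker --exclude_queue long_running --until_done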

Contrib apps

django_toosimple_q.contrib.mail

A queued email backend that sends emails asynchronously, preventing your website from failing completely when the upstream backend is down.

Installation

Enable and configure the app in settings.py:

INSTALLED_APPS = [
    ...
    'django_toosimple_q.contrib.mail',
    ...
]

EMAIL_BACKEND = 'django_toosimple_q.contrib.mail.backends.QueueBackend'

# Actual Django email backend used, defaults to django.core.mail.backends.smtp.EmailBackend, see https://docs.djangoproject.com/en/3.2/ref/settings/#email-backend
TOOSIMPLEQ_EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
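
Once configured, emails sent through Django's standard mail API are queued and delivered later by a worker. A minimal sketch (the addresses are placeholders):

from django.core.mail import send_mail

# This creates a queued task; the actual delivery happens when a worker
# processes it, using TOOSIMPLEQ_EMAIL_BACKEND.
send_mail("Subject", "Message body", "from@example.com", ["to@example.com"])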

Dev

Tests

To run tests locally (by default, tests run against an in-memory SQLite database):

$ pip install -r requirements-dev.txt
$ python manage.py test

To run tests against postgres, run the following commands first:

# Start a local postgres database
$ docker run -p 5432:5432 -e POSTGRES_PASSWORD=postgres -d postgres
# Set an env var
$ export TOOSIMPLEQ_TEST_DB=postgres # on Windows: `$Env:TOOSIMPLEQ_TEST_DB = "postgres"`
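
Then run the tests as above:

$ python manage.py test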

Tests are run automatically on GitHub.

Contribute

Code style is enforced with pre-commit:

$ pip install -r requirements-dev.txt
$ pre-commit install

Changelog

  • 2021-07-15 : v0.3.0

    • added contrib.mail
    • task replacement now tracked with a FK instead of a state
    • also run tests on postgres
    • added datetime_kwarg argument to schedules
  • 2021-06-11 : v0.2.0

    • added retries, retry_delay options for tasks
    • improve logging
  • 2020-11-12 : v0.1.0

    • fixed bug where updating schedule failed
    • fixed worker not doing all available tasks for each tick
    • added --tick argument
    • enforce uniqueness of schedule

