
A simplistic task queue and cron-like scheduler for Django

Project description

Django Too Simple Queue


This package provides a simplistic task queue and scheduler for Django.

If execution of your tasks is mission-critical, do not use this library and turn to more complex solutions such as Celery, as this package guarantees neither task execution nor unique execution.

It is geared towards basic apps, where simplicity takes precedence over reliability. The package offers a simple decorator syntax, including cron-like schedules.

Features:

  • no celery/redis/rabbitmq/whatever... just Django!
  • clean decorator syntax to register tasks and schedules
  • simple queuing syntax
  • cron-like scheduling
  • tasks.py autodiscovery
  • django admin integration

Limitations:

  • probably not extremely reliable because of race conditions
  • no multithreading yet (but running multiple workers should work)

Installation

Install the library:

$ pip install django-toosimple-q

Enable the app in settings.py:

INSTALLED_APPS = [
    ...
    'django_toosimple_q',
    ...
]
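
Since queued tasks and schedules are stored in the database (which is also what powers the admin integration), you will most likely need to apply the package's migrations before running a worker:

$ python manage.py migrate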

Quickstart

Tasks need to be registered using the @register_task() decorator. Once registered, they can be added to the queue by calling the .queue() function.

from django_toosimple_q.decorators import register_task

# Register a task
@register_task()
def my_task(name):
    return f"Hello {name} !"

# Enqueue tasks
my_task.queue("John")
my_task.queue("Peter")

Registered tasks can be scheduled from code using this cron-like syntax:

from django_toosimple_q.decorators import register_task, schedule

# Register and schedule tasks
@schedule(cron="30 8 * * *", args=['John'])
@register_task()
def morning_routine(name):
    return f"Good morning {name} !"
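
For reference, the five cron fields read minute, hour, day of month, month, day of week; the schedule above fires daily at 08:30:

```
30 8 * * *
│  │ │ │ └─ day of week (* = any)
│  │ │ └─── month (* = any)
│  │ └───── day of month (* = any)
│  └─────── hour (8)
└────────── minute (30)
```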

To consume the tasks, you need to run at least one worker:

$ python manage.py worker

The workers take care of adding scheduled tasks to the queue when they are due, and of executing the queued tasks.

The package autoloads tasks.py from all installed apps. While this is the recommended place to define your tasks, you can do so from anywhere in your code.
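
For instance, with a hypothetical app called myapp, a tasks.py placed as follows is picked up automatically:

```
myapp/
├── __init__.py
├── models.py
├── tasks.py      # discovered automatically
└── views.py
```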

Advanced usage

Tasks

You can optionally give a custom name to your tasks. This is required when your task is defined in a local scope.

@register_task("my_favourite_task")
def my_task(name):
    return f"Good morning {name} !"

You can set task priorities.

@register_task(priority=0)
def my_favourite_task(name):
    return f"Good bye {name} !"

@register_task(priority=1)
def my_other_task(name):
    return f"Hello {name} !"

# Enqueue tasks
my_other_task.queue("John")
my_favourite_task.queue("Peter")  # will be executed before the other one

You can mark a task with unique=True if it shouldn't be queued again while an identical task (same arguments) is already queued. This is useful for tasks such as cleanup or refresh jobs.

@register_task(unique=True)
def cleanup():
    ...

cleanup.queue()
cleanup.queue()  # this will be ignored as long as the first one is still queued
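
Conceptually, unique=True amounts to a duplicate check on the (task name, arguments) pair at queueing time. A minimal pure-Python sketch of that idea (not the library's actual implementation):

```python
# Sketch of the unique=True semantics: skip queueing when an identical
# (task, arguments) pair is already pending.
pending = []

def queue_unique(task_name, args=()):
    """Queue the task unless the same call is already pending."""
    if (task_name, args) in pending:
        return False  # duplicate: ignored
    pending.append((task_name, args))
    return True

queue_unique("cleanup")          # queued
queue_unique("cleanup")          # ignored: identical call already pending
queue_unique("cleanup", ("x",))  # queued: different arguments
print(len(pending))  # 2
```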

You can assign tasks to specific queues, and then have your worker only consume tasks from specific queues using --queue myqueue or --exclude_queue myqueue. By default, workers consume all tasks.

@register_task(queue='long_running')
def long_task():
    ...

@register_task()
def short_task():
    ...

# Then run those with these workers, so that long
# running tasks don't prevent short running tasks
# from being run :
# manage.py worker --exclude_queue long_running
# manage.py worker

Schedules

By default, last_run is set to now() on schedule creation, which means a schedule will first run on the next cron occurrence. If you need a schedule to run as soon as possible after initialisation, specify last_run=None.

@schedule(cron="30 8 * * *", last_run=None)
@register_task()
def my_task(name):
    return f"Good morning {name} !"

By default, if some cron occurrences were missed (e.g. after a server shutdown, or if the workers can't keep up with all tasks), the missed runs are lost. If you need the schedule to catch up, set catch_up=True.

@schedule(cron="30 8 * * *", catch_up=True)
@register_task()
def my_task(name):
    ...
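
To illustrate the difference, here is a hedged pure-Python sketch (not the library's code) of what "missed occurrences" means for a daily "30 8 * * *" schedule: with catch_up=True every missed occurrence is queued, without it only the most recent one is.

```python
from datetime import datetime, timedelta

def missed_occurrences(last_run, now):
    """Yield each daily 08:30 occurrence strictly after last_run, up to now."""
    occurrence = last_run.replace(hour=8, minute=30, second=0, microsecond=0)
    if occurrence <= last_run:
        occurrence += timedelta(days=1)
    while occurrence <= now:
        yield occurrence
        occurrence += timedelta(days=1)

# Server was down from Jan 1 to Jan 4:
last_run = datetime(2020, 1, 1, 9, 0)
now = datetime(2020, 1, 4, 9, 0)
missed = list(missed_occurrences(last_run, now))
print(len(missed))  # 3 missed daily runs

# catch_up=True  -> all 3 runs are queued
# catch_up=False -> only the latest (missed[-1]) is queued
```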

You may define multiple schedules for the same task. In this case, it is mandatory to give each schedule a unique name:

@schedule(name="morning_routine", cron="30 8 * * *", args=['morning'])
@schedule(name="afternoon_routine", cron="30 16 * * *", args=['afternoon'])
@register_task()
def my_task(time):
    return f"Good {time} John !"

Dev

Tests

$ python manage.py test

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

django-toosimple-q-0.0.3.tar.gz (13.3 kB view details)

Uploaded Source

Built Distribution

django_toosimple_q-0.0.3-py3-none-any.whl (16.0 kB view details)

Uploaded Python 3

File details

Details for the file django-toosimple-q-0.0.3.tar.gz.

File metadata

  • Download URL: django-toosimple-q-0.0.3.tar.gz
  • Upload date:
  • Size: 13.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for django-toosimple-q-0.0.3.tar.gz:

  • SHA256: 2f11e4b1e06d0e18e8b65ea6d97c4cff068a5a1b04d610f038eb41aa6ac55b11
  • MD5: b50787b79b317fe48693ebc6269d6db6
  • BLAKE2b-256: 23b922b86dcbe780c9d682d587eda48474100b350bc5daaa328ca1c8780073b0


File details

Details for the file django_toosimple_q-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: django_toosimple_q-0.0.3-py3-none-any.whl
  • Upload date:
  • Size: 16.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for django_toosimple_q-0.0.3-py3-none-any.whl:

  • SHA256: 16f96478ef717ccad35cc17afb2b82bcb298f455d03772cb7bdf84aef5e2c62c
  • MD5: 1142e9a631a75c922761712786a0073d
  • BLAKE2b-256: 8d5446cd9047e95d335e76441cc0df145d0e8a567ef8510e9eefe62dea74db14

