
A simple Django app to offload tasks from the main web server.


# Django Leek


The _simple_ and _slick_ way to run async tasks in Django.

* Django-friendly API
* Easy to start and stop

Based on django-queue.

## Why?
With a healthy mix of vegetables, such as Celery and Carrot, already in the mix, what does `django-leek` bring?

The most "lightweight" library so far has "install Redis" as step one. Redis is fantastic software, but sometimes you just want a simple way to offload work from the web server and run a task asynchronously, such as sending an email.

Here `django-leek` comes to the rescue. Usage and architecture could not be simpler, and with so few moving parts it should be very stable, although it is not as battle-tested as e.g. Celery.

With `django-leek` you can get up and running quickly. The more complex distributed queues can wait until the website has a lot of traffic and the scalability is really required.

## Getting started
1. Install `django-leek` with pip:

   ```
   $ pip install django-leek
   ```

2. Add `django_leek` to `INSTALLED_APPS` in your `settings.py` file.

3. Create the tables needed:

   ```
   $ python manage.py migrate
   ```

4. Make sure the django-leek server is running:

   ```
   $ python manage.py runleek
   ```

5. Go nuts:

   ```python
   from django_leek.api import Leek

   leek = Leek()

   @leek.task  # register as an offloadable task
   def send_mail(to):
       ...  # do the actual mail sending here

   send_mail.offload(to='')
   ```

   You can also use the "old" API as found in `django-queue`:

   ```python
   push_task_to_queue(send_mail, to='')
   ```

6. It's easy to unit-test code that in production offloads work to the Leek server:

   ```python
   from unittest.mock import patch

   def _invoke(a_callable, *args, **kwargs):
       a_callable(*args, **kwargs)

   @patch('django_leek.api.push_task_to_queue', _invoke)
   def test_mytest():
       send_mail.offload(to='')  # now runs synchronously, like a normal function
   ```

## Technical overview
In a nutshell, a Python SocketServer runs in the background, listening on a TCP socket. When the SocketServer receives a request to run a task on its socket, it puts the task on a queue. A worker thread picks tasks from this queue and runs them one by one.

### Components

1. A Python SocketServer that listens on a TCP socket.
2. A `Worker` thread.
3. A Python `Queue`.

### Workflow
The workflow that runs an async task:

1. When `SocketServer` starts, it initializes the `Worker` thread.
2. `SocketServer` listens to requests.
3. When `SocketServer` receives a request - a callable with args and kwargs - it puts the request on a Python `Queue`.
4. The `Worker` thread picks a task from the `Queue`.
5. The `Worker` thread runs the task.
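The workflow above can be sketched with Python's standard library. This is only an illustration of the pattern, not django-leek's actual internals; names like `worker_loop` and `TaskHandler`, and the pickled `(callable, args, kwargs)` wire format, are assumptions:

```python
import pickle
import queue
import socketserver
import threading

task_queue = queue.Queue()


def worker_loop():
    """Pick tasks from the queue and run them one by one."""
    while True:
        func, args, kwargs = task_queue.get()
        try:
            func(*args, **kwargs)
        finally:
            task_queue.task_done()


class TaskHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # A request is assumed to be a pickled (callable, args, kwargs) tuple.
        task_queue.put(pickle.loads(self.rfile.read()))


def serve(host="localhost", port=8002):
    # Starting the server also starts the single worker thread.
    threading.Thread(target=worker_loop, daemon=True).start()
    with socketserver.TCPServer((host, port), TaskHandler) as server:
        server.serve_forever()
```

Because a single worker drains the queue, tasks run strictly one at a time, which is exactly the simplicity django-leek trades against throughput.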

### Can this queue scale to production?
It depends on the traffic: SocketServer is simple but solid, and as the site gets more traffic it's possible to move the django-leek server to another machine, use a separate database, etc. At some point it's probably better to pick Celery. Until then, django-leek is a simple, solid, and no-hassle solution.

## Settings
To change the default django-leek settings, add a `LEEK` dictionary to your project's main `settings.py` file.

This is the dictionary and the defaults:

```python
LEEK = {
    'bind': "localhost:8002",
    'host': "localhost",
    'port': 8002,
}
```

* `bind` - the leek server will bind here.
* `host` - the django server will connect to this host when notifying leek of jobs.
* `port` - the django server will connect to this port when notifying leek of jobs.
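For example, to run the leek server on a separate machine you might bind on all interfaces while pointing the Django side at that machine's hostname. This is a hypothetical sketch: the hostname and port are placeholders, and it assumes all three keys are given explicitly:

```python
# settings.py - hypothetical override of the django-leek defaults
LEEK = {
    'bind': "0.0.0.0:9000",       # leek server listens on all interfaces
    'host': "tasks.example.com",  # placeholder hostname of the leek machine
    'port': 9000,                 # django connects here when pushing jobs
}
```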

## Persistence
The following models are used.

`QueuedTasks` saves every task pushed to the queue.
The task is pickled as a `tasks_queue.tasks.Task` object, which is a simple class with `callable`, `args` and `kwargs` attributes, and one method: `run()`.
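A minimal sketch of such a Task object, built only from the description above (the real `tasks_queue.tasks.Task` may differ in details):

```python
class Task:
    """A pickled unit of work: a callable plus its saved arguments."""

    def __init__(self, a_callable, *args, **kwargs):
        self.callable = a_callable
        self.args = args
        self.kwargs = kwargs

    def run(self):
        # Invoke the stored callable with the saved args and kwargs.
        return self.callable(*self.args, **self.kwargs)
```

Since the object holds only plain attributes and a callable, it pickles cleanly as long as the callable itself is importable, which is why tasks are normally module-level functions.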

`SuccessTasks`: the Worker thread saves to this model the `task_id` of every task that was carried out successfully. **task_id** is the task's `QueuedTasks` id.

`FailedTasks`: after the Worker tries to run a task and it fails by raising an exception, the Worker saves it to this model. The failed task is saved by its `task_id`, together with the exception message. Only the exception from the last run is saved.

### Purge Tasks

According to your project needs, you can purge tasks that the Worker completed successfully.

The SQL to delete these tasks:

```sql
DELETE queued, success
FROM tasks_queue_queuedtasks queued
INNER JOIN tasks_queue_successtasks success
ON success.task_id = queued.id;

In a similar way, you can delete the failed tasks.
A cron job, or any other scheduled script, can purge the tasks.

## Authors
Aviah and Samuel Carlsson

See contributors for the full list.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

| Filename | Size | File type | Python version | Upload date |
| --- | --- | --- | --- | --- |
| django_leek-0.6-py3-none-any.whl | 11.6 kB | Wheel | py3 | Jun 1, 2018 |
| django-leek-0.6.tar.gz | 7.7 kB | Source | None | Jun 1, 2018 |
