delayed
Delayed is a simple but robust task queue inspired by rq.
Features
- Robust: every enqueued task runs exactly once, even if a worker gets killed at any point.
- Clean: finished tasks (including failed ones) don't take up space in your Redis.
- Distributed: as many workers as needed can run at the same time without extra configuration.
Requirements
- Python 2.7 or later, tested on Python 2.7, 3.3 - 3.7 and PyPy 3.5. All the processes of a task queue should use the same version of Python.
- Most UNIX-like systems (with os.fork() and select.poll() implemented), tested on Ubuntu and macOS.
- Redis 2.6.0 or later.
- Keep the system clocks of all the machines of a task queue in sync.
Getting started
- Run a redis server:

  ```bash
  $ redis-server
  ```
- Install delayed:

  ```bash
  $ pip install delayed
  ```
- Create a task queue:

  ```python
  import redis
  from delayed.queue import Queue

  conn = redis.Redis()
  queue = Queue(name='default', conn=conn)
  ```
- Three ways to enqueue a task:
  - Define a task function and enqueue it:

    ```python
    from delayed.delay import delayed

    delayed = delayed(queue)

    @delayed()
    def delayed_add(a, b):
        return a + b

    delayed_add.delay(1, 2)    # enqueue delayed_add
    delayed_add.delay(1, b=2)  # same as above
    delayed_add(1, 2)          # call it immediately
    ```
  - Directly enqueue a function:

    ```python
    from delayed.delay import delay

    delay = delay(queue)

    def add(a, b):
        return a + b

    delay(add)(1, 2)
    delay(add)(1, b=2)  # same as above
    ```
  - Enqueue a predefined task function without importing it:

    ```python
    from delayed.task import Task

    task = Task(id=None, module_name='test', func_name='add', args=(1, 2))
    queue.enqueue(task)
    ```
- Run a task worker (or more) in a separate process:

  ```python
  import redis
  from delayed.queue import Queue
  from delayed.worker import ForkedWorker

  conn = redis.Redis()
  queue = Queue(name='default', conn=conn)
  worker = ForkedWorker(queue=queue)
  worker.run()
  ```
- Run a task sweeper in a separate process to recover lost tasks (mainly caused by workers getting killed):

  ```python
  import redis
  from delayed.queue import Queue
  from delayed.sweeper import Sweeper

  conn = redis.Redis()
  queue = Queue(name='default', conn=conn)
  sweeper = Sweeper(queue=queue)
  sweeper.run()
  ```
QA
- Q: What are the limitations on a task function?
  A: A task function should be defined at module level (but not in the `__main__` module), and its `args` and `kwargs` should be picklable, as in the sketch below.
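  A minimal sketch of these rules, reusing the `queue` created in "Getting started"; the module `tasks.py`, the function `send_email`, and its arguments are made up for illustration:

  ```python
  # tasks.py -- a hypothetical, importable module (not __main__)
  from delayed.delay import delayed

  delayed = delayed(queue)  # `queue` created as in "Getting started"

  @delayed()
  def send_email(address, subject):  # defined at module level: OK to enqueue
      print('sending to %s: %s' % (address, subject))

  send_email.delay('a@example.com', 'hi')  # both args are picklable strings

  # These would not work: a lambda or a function defined inside another
  # function cannot be located by the worker, and unpicklable args (such as
  # an open file object) cannot be serialized into the task.
  ```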
- Q: What's the `name` param of a queue?
  A: It's the key used to store the tasks of the queue. A queue with the name "default" uses these keys (see the inspection sketch after the list):
    - `default`: list, the enqueued tasks.
    - `default_id`: str, the next task id.
    - `default_noti`: list, task notifications, kept at the same length as the enqueued task list.
    - `default_enqueued`: sorted set, enqueued tasks with their timeouts.
    - `default_dequeued`: sorted set, dequeued tasks with their dequeued timestamps.
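  For illustration only, here is one way to peek at those keys with redis-py; this is just an inspection sketch, not part of delayed's API:

  ```python
  import redis

  conn = redis.Redis()
  print(conn.llen('default'))            # number of enqueued tasks
  print(conn.get('default_id'))          # the next task id
  print(conn.llen('default_noti'))       # should match the queue length
  print(conn.zcard('default_enqueued'))  # enqueued tasks with their timeouts
  print(conn.zcard('default_dequeued'))  # dequeued tasks with their timestamps
  ```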
- Q: Why is the worker slow?
  A: The `ForkedWorker` forks a new process for each task, so the tasks are isolated from each other and cannot leak memory into the worker process.
  To reduce the overhead of forking processes and importing modules, you can switch to the `PreforkedWorker` if your task function code won't change during the worker's lifetime:

  ```python
  import redis
  from delayed.queue import Queue
  from delayed.worker import PreforkedWorker

  conn = redis.Redis()
  queue = Queue(name='default', conn=conn)
  worker = PreforkedWorker(queue=queue)
  worker.run()
  ```
- Q: How does a `ForkedWorker` run?
  A: It runs a loop like this (a conceptual sketch follows the list):
    - It dequeues a task from the queue periodically.
    - It forks a child process to run the task.
    - It kills the child process if the child runs out of time.
    - When the child process exits, it releases the task.
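  The following is a rough conceptual sketch of that loop, not delayed's actual implementation; `queue.dequeue()`, `queue.release()`, `run_task()`, and the timing values are stand-ins for the library's internals:

  ```python
  import os
  import signal
  import time

  def run_task(task):
      pass  # placeholder: execute the task function

  def forked_worker_loop(queue, poll_interval=1.0, timeout=60):
      while True:
          task = queue.dequeue()                # stand-in: poll the queue periodically
          if task is None:
              time.sleep(poll_interval)
              continue
          pid = os.fork()                       # one child process per task
          if pid == 0:                          # child: run the task, then exit
              run_task(task)
              os._exit(0)
          deadline = time.time() + timeout
          while True:                           # parent: wait for the child, enforce the timeout
              done_pid, _ = os.waitpid(pid, os.WNOHANG)
              if done_pid:
                  break
              if time.time() > deadline:
                  os.kill(pid, signal.SIGKILL)  # kill the child if it runs out of time
                  os.waitpid(pid, 0)
                  break
              time.sleep(0.1)
          queue.release(task)                   # stand-in: mark the task as done
  ```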
- Q: How does a `PreforkedWorker` run?
  A: It runs a loop like this:
    - It dequeues a task from the queue periodically.
    - If it has no child process, it forks a new one.
    - It sends the task to the child process through a pipe.
    - It kills the child process if the child runs out of time.
    - When the child process exits, or a result is received from the pipe, it releases the task.
- Q: How does the child process of a worker run?
  A: The child of a `ForkedWorker` just runs the task, unmarks the task as dequeued, then exits. The child of a `PreforkedWorker` runs a loop like this (a conceptual sketch follows the list):
    - It tries to receive a task from the pipe.
    - If the pipe has been closed, it exits.
    - It runs the task.
    - It sends the task result back through the pipe.
    - It releases the task.
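  A rough conceptual sketch of that child loop, not delayed's actual implementation; it assumes a `multiprocessing.Connection`-like pipe object, and `run_task()`/`release()` are stand-ins for the library's internals:

  ```python
  def run_task(task):
      pass  # placeholder: execute the task function

  def release(task):
      pass  # placeholder: unmark the task as dequeued

  def child_loop(pipe):
      while True:
          try:
              task = pipe.recv()   # wait for the next task from the parent
          except EOFError:         # the parent closed the pipe: exit
              break
          result = run_task(task)  # run the task
          pipe.send(result)        # send the result back through the pipe
          release(task)            # release the task
  ```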
- Q: What are lost tasks?
  A: There are two situations in which a task might get lost:
    - A worker popped a task notification, then got killed before dequeueing the task.
    - A worker dequeued a task, then both the monitor and its child process got killed before releasing the task.
- Q: How to recover lost tasks?
  A: Run a sweeper. It does two things:
    - It keeps the task notification list the same length as the task queue.
    - It moves timed-out dequeued tasks back to the task queue.
- Q: How to set the timeout of tasks?
  A: You can set the `default_timeout` of a queue or the `timeout` of a task:

  ```python
  from delayed.delay import delay_in_time

  queue = Queue('default', conn, default_timeout=60)
  delayed_add.timeout(10)(1, 2)

  delay_in_time = delay_in_time(queue)
  delay_in_time(add, timeout=10)(1, 2)
  ```
- Q: How to handle finished tasks?
  A: Set the `success_handler` and `error_handler` of the worker. The handlers are called in a forked process, except when the forked process got killed or the monitor process raised an exception.

  ```python
  import logging

  def success_handler(task):
      logging.info('task %d finished', task.id)

  def error_handler(task, kill_signal, exc_info):
      if kill_signal:
          logging.error('task %d got killed by signal %d', task.id, kill_signal)
      else:
          logging.error('task %d failed', task.id, exc_info=exc_info)

  worker = PreforkedWorker(queue=queue, success_handler=success_handler, error_handler=error_handler)
  ```
- Q: Why are both `success_handler` and `error_handler` sometimes called for a single task?
  A: When the child process got killed after the `success_handler` was called, or when the monitor process got killed but the child process still finished the task, both handlers will be called. You can consider the task successful in these cases.
- Q: How to turn on the debug logs?
  A: Add a `logging.DEBUG` level handler to `delayed.logger.logger`. The simplest way is to call `delayed.logger.setup_logger()`:

  ```python
  from delayed.logger import setup_logger

  setup_logger()
  ```