A lightweight asynchronous Python job executor backed by Redis.
Project description
just-jobs
A lightweight asynchronous Python job executor. Using Redis by default (but not exclusively, via custom adapters), it is a smaller, production-ready alternative to Celery for applications where distributed microservices are overkill.
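just-jobs is published on PyPI, so in a standard environment it can be installed with `pip install just-jobs`.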
Usage
Documentation: https://thearchitector.github.io/just-jobs/just_jobs/.
The entire execution structure consists of 3 things:
- The `Manager`, which is responsible for managing the broker and all job queues.
- The `Broker`, which is responsible for integrating into a storage interface and executing jobs.
- A `job`, which is any non-dynamic function or coroutine that performs some task.
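For illustration, a job needs nothing special; any plain, module-level function or coroutine qualifies. The name and body below are hypothetical stand-ins for real work:

```python
# A hypothetical job: an ordinary coroutine that a worker can execute.
async def send_welcome_email(address: str, subject: str = "Welcome!") -> None:
    """Pretend to send an email; stands in for real work."""
    print(f"Sending '{subject}' to {address}")
```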
In general, the process for enqueuing jobs for execution is always the same (a minimal standalone sketch follows this list):

- Create a Manager and tell it to start listening for jobs via `await manager.startup()`.
- Anywhere in your application, enqueue a job via `manager.enqueue(job, *args, **kwargs)`.
- Make sure to properly shut down your manager with `await manager.shutdown()`.
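Putting those steps together, here is a minimal standalone sketch. The Redis URL and the `add` job are placeholders; swap in your own:

```python
import asyncio

from just_jobs import Manager


async def add(a: int, b: int) -> int:
    # A trivial placeholder job.
    return a + b


async def main():
    # Assumes a Redis instance is reachable at this (placeholder) URL.
    manager = Manager(url="redis://localhost:6379/0")
    await manager.startup()

    # Enqueue the job with its arguments; a listening worker picks it up.
    await manager.enqueue(add, 1, 2)

    # Always shut the manager down so workers and the broker can clean up.
    await manager.shutdown()


asyncio.run(main())
```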
Example
A common use case for delayed jobs is a web application, where every millisecond counts and long-running work shouldn't block the request. Here is an example using FastAPI, whose startup and shutdown hooks make it easy to manage the state of our Manager.
```python
from fastapi import FastAPI

from just_jobs import Manager

app = FastAPI()


async def _essential_task(a, b):
    """render a movie, or email a user, or both"""


@app.on_event("startup")
async def startup():
    # the default broker is backed by Redis via aioredis. Managers
    # will always pass any args and kwargs they don't recognize to
    # their brokers during startup.
    manager = Manager(url="redis://important-redis-server/0")
    app.state.manager = manager
    await manager.startup()


@app.on_event("shutdown")
async def shutdown():
    # this is absolutely essential to allow the manager to shut down
    # all the listening workers, as well as for the broker to do any
    # cleanup or disconnects it should from its backing storage interface.
    await app.state.manager.shutdown()


@app.get("/do_thing")
async def root():
    # enqueue the task so it gets run in a worker's process queue
    await app.state.manager.enqueue(_essential_task, 2, 2)
    return {"message": "The thing is being done!"}
```
License
This software is licensed under the BSD 2-Clause “Simplified” License.
This package is Treeware. If you use it in production, then we ask that you buy the world a tree to thank us for our work. By contributing to my forest you’ll be creating employment for local families and restoring wildlife habitats.
Project details
Download files
Source Distribution: just-jobs-1.0.0.tar.gz (8.1 kB)
Built Distribution: just_jobs-1.0.0-py3-none-any.whl (9.5 kB)
File details
Details for the file just-jobs-1.0.0.tar.gz.
File metadata
- Download URL: just-jobs-1.0.0.tar.gz
- Upload date:
- Size: 8.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.1.6 CPython/3.9.2 Windows/10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7ff11dcbfd19c73b7d4b896e758badd9857d8d562abe7d3d4ddacb594bb20bb9 |
| MD5 | 1df614e4e31742d0b918999eaa4f8798 |
| BLAKE2b-256 | effe0360b1ab32bf552264ad7de967ad41983780afc464320cfbbc5a8d0b53e8 |
File details
Details for the file just_jobs-1.0.0-py3-none-any.whl.
File metadata
- Download URL: just_jobs-1.0.0-py3-none-any.whl
- Upload date:
- Size: 9.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.1.6 CPython/3.9.2 Windows/10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | bb51a26a90854a75d1eeb0e01ccc657b09066013ccc7753eae8c074983ba6c13 |
| MD5 | 271365e6a13871818a65fdefe29eddf2 |
| BLAKE2b-256 | c6dd6745d97ecd134f35779148d4269489d7e825d10b7fddd34d4b09e8269af9 |