A lightweight asynchronous Python job executor backed by Redis.
Project description
just-jobs
A lightweight asynchronous Python job executor. Using Redis by default (but not exclusively, via custom adapters), it is a smaller, production-ready alternative to Celery for applications where distributed microservices are overkill.
Usage
Documentation: https://justjobs.thearchitector.dev.
The entire execution structure consists of 3 things:
- The `Manager`, which is responsible for managing the broker and all job queues.
- The `Broker`, which is responsible for integrating into a storage interface and executing jobs.
- A `job`, which is any non-dynamic function or coroutine that performs some task.
In general, the process for enqueuing jobs for execution is always the same:
- Create a Manager and tell it to start listening for jobs via `await manager.startup()`.
- Anywhere in your application, enqueue a job via `manager.enqueue(job, *args, **kwargs)`.
- Ensure you properly shut down your manager with `await manager.shutdown()`.
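Concretely, those three steps can run outside of any web framework. Here is a minimal sketch that assumes only the API described above; the job coroutine and the Redis URL are illustrative placeholders:

```python
import asyncio

from just_jobs import Manager


async def send_welcome_email(address):
    """An illustrative job: any non-dynamic function or coroutine works."""
    print(f"emailing {address}")


async def main():
    # point the default Redis-backed broker at your own instance
    manager = Manager(url="redis://localhost:6379/0")
    await manager.startup()
    try:
        # enqueue the job along with its positional/keyword arguments
        await manager.enqueue(send_welcome_email, "someone@example.com")
    finally:
        # always shut the manager down so workers and the broker can clean up
        await manager.shutdown()


asyncio.run(main())
```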
Example
A common use case for delayed jobs is a web application, where every millisecond of response time matters. Here is an example using FastAPI, whose startup and shutdown hooks make it easy to manage the Manager's lifecycle.
```python
from fastapi import FastAPI
from just_jobs import Manager

app = FastAPI()


async def _essential_task(a, b):
    """render a movie, or email a user, or both"""


@app.on_event("startup")
async def startup():
    # the default broker is backed by Redis via aioredis. Managers
    # will always pass any args and kwargs they don't recognize to
    # their brokers during startup.
    manager = Manager(url="redis://important-redis-server/0")
    app.state.manager = manager
    await manager.startup()


@app.on_event("shutdown")
async def shutdown():
    # this is absolutely essential to allow the manager to shut down
    # all the listening workers, as well as for the broker to do any
    # cleanup or disconnects it needs from its backing storage interface.
    await app.state.manager.shutdown()


@app.get("/do_thing")
async def root():
    # enqueue the task so it gets run in a worker's process queue
    await app.state.manager.enqueue(_essential_task, 2, 2)
    return {"message": "The thing is being done!"}
```
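The request to /do_thing returns immediately; the enqueued _essential_task runs afterwards in a worker's process queue.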
License
This software is licensed under the BSD 2-Clause “Simplified” License.
This package is Treeware. If you use it in production, consider buying the world a tree to thank me for my work. By contributing to my forest, you’ll be creating employment for local families and restoring wildlife habitats.
File details
Details for the file just-jobs-1.1.0.tar.gz.
File metadata
- Download URL: just-jobs-1.1.0.tar.gz
- Upload date:
- Size: 8.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.1.13 CPython/3.9.2 Windows/10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 655ae4f5277cc8b3c35d34ca2ecb3254e740fbc2e20dbfbdb83b034105b12fed
MD5 | 63cc187299124657c5cc8011c34282b7
BLAKE2b-256 | c4fa819d9209e91080e733c18a27a4b5dd0a7fdaa42ba917f6ab0a8545bb1191
File details
Details for the file just_jobs-1.1.0-py3-none-any.whl.
File metadata
- Download URL: just_jobs-1.1.0-py3-none-any.whl
- Upload date:
- Size: 10.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.1.13 CPython/3.9.2 Windows/10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 77e8559b139cd41f82038e90c8f200217c91b0c46148ec8a7edca56c1078953f
MD5 | 0c9d198fac0d9f35a5a375d1d2728fb8
BLAKE2b-256 | 8801d54a378db3237ae417324490bcc0457d0746c6b9e278e7eba65debd50273