# DBJobQ

A very simple async job queue with a database.

A database-backed job queue for Python that supports multiple storage backends: SQLAlchemy, MongoDB, Redis, and DynamoDB.
## Installation

```shell
# Basic installation (includes scheduling support)
pip install dbjobq

# Or install with specific storage backend(s)
pip install "dbjobq[sqlalchemy]"  # SQLAlchemy with async drivers
pip install "dbjobq[mongo]"       # MongoDB
pip install "dbjobq[redis]"       # Redis
pip install "dbjobq[dynamo]"      # DynamoDB

# Or install all storage backends
pip install "dbjobq[all]"
```
### Development Installation

```shell
# Install uv if not already installed
curl -LsSf https://astral.sh/uv/install.sh | sh

# Clone the project
git clone <repo-url>
cd dbjobq

# Sync dependencies with all extras and dev tools
uv sync --all-extras
```
## Usage

### Define Jobs

```python
from dbjobq import job

@job
def my_background_task(data):
    print(f"Processing {data}")
    # Do work
```
### Create Job Queue

```python
from dbjobq import JobQueue, Worker
from dbjobq.storage import SQLAlchemyStorage

# For SQLAlchemy
storage = SQLAlchemyStorage('sqlite:///jobs.db')
job_queue = JobQueue(storage)

# Enqueue a job
job_id = job_queue.enqueue(my_background_task, {'key': 'value'})

# Inspect the queue
pending_jobs = job_queue.get_pending_jobs()
running_jobs = job_queue.get_running_jobs()
completed_jobs = job_queue.get_completed_jobs()
failed_jobs = job_queue.get_failed_jobs()

# Get a specific job
job = job_queue.get_job(job_id)
if job:
    print(f"Job {job.id}: {job.type} - {job.status}")

# List all jobs or filter by status
all_jobs = job_queue.list_jobs()
pending_only = job_queue.list_jobs(status="pending", limit=10)

# Start a worker
worker = Worker(job_queue)
worker.start()

# Later, stop the worker
worker.stop()
```
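To make the enqueue/worker lifecycle above concrete, here is a stdlib-only sketch of how a queue and worker can interact in-process. This is a simplified, hypothetical model that mirrors dbjobq's `JobQueue`/`Worker` names, not its actual database-backed implementation:

```python
# Hypothetical in-process sketch of the enqueue/worker loop.
# Mirrors dbjobq's surface API, but is NOT its implementation.
import queue
import threading

class JobQueue:
    def __init__(self):
        self._q = queue.Queue()

    def enqueue(self, func, data):
        # A real backend would persist the job; here we just queue it.
        self._q.put((func, data))

class Worker:
    def __init__(self, job_queue):
        self._q = job_queue._q
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def _run(self):
        # Poll for jobs until stop() is called.
        while not self._stop.is_set():
            try:
                func, data = self._q.get(timeout=0.1)
            except queue.Empty:
                continue
            func(data)  # execute the job
            self._q.task_done()

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()

results = []
jq = JobQueue()
jq.enqueue(results.append, "hello")
w = Worker(jq)
w.start()
jq._q.join()  # block until the enqueued job has been processed
w.stop()
print(results)  # -> ['hello']
```

The real library persists jobs in a database rather than an in-memory queue, which is what lets workers in other processes pick them up.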
## Storage Backends

- SQLAlchemy: supports any SQL database, e.g. `SQLAlchemyStorage('sqlite:///jobs.db')`
- MongoDB: `MongoStorage(mongo_url, db_name)`
- Redis: `RedisStorage(redis_url)`
- DynamoDB: `DynamoStorage(table_name, region_name)`
## Features
- Cross-process job locking
- Multiple storage backends
- Simple API similar to Celery
- Suitable for web apps like FastAPI with Gunicorn
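The cross-process job locking listed above is typically achieved with an atomic compare-and-set in the database, so two workers polling the same table can never claim the same job. A stdlib-only sketch of that general pattern, using an illustrative schema and SQL (not dbjobq's actual implementation):

```python
# Sketch of cross-process job locking via an atomic claim:
# the UPDATE only succeeds if the row is still 'pending'.
# Schema and SQL are illustrative, not dbjobq's.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO jobs (status) VALUES ('pending')")
conn.commit()

def claim_next_job(conn):
    """Pick a pending job, then claim it with a guarded UPDATE.
    If another worker claimed it first, rowcount is 0 and we back off."""
    row = conn.execute(
        "SELECT id FROM jobs WHERE status = 'pending' LIMIT 1"
    ).fetchone()
    if row is None:
        return None  # nothing to do
    cur = conn.execute(
        "UPDATE jobs SET status = 'running' "
        "WHERE id = ? AND status = 'pending'",
        (row[0],),
    )
    conn.commit()
    return row[0] if cur.rowcount == 1 else None

first = claim_next_job(conn)   # claims the only pending job
second = claim_next_job(conn)  # nothing left to claim
print(first, second)  # -> 1 None
```

Because the claim happens in the database itself, the same guarantee holds across separate processes, e.g. multiple Gunicorn workers, which is what makes this approach suitable for the FastAPI deployments mentioned above.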
## Development

```shell
# Run tests
uv run pytest

# Run the example
uv run python hello.py
```
## File details

### Source distribution: dbjobq-0.1.0.tar.gz

- Size: 197.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.2.0 CPython/3.12.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `4133187ba4b50b0e3413acdacb1b317996834b05e7a6012c57a1b801419ae9aa` |
| MD5 | `b8ef99c2bbe3196248ea4432b28d285c` |
| BLAKE2b-256 | `f9da189751c0573f8cc5c823770e1eb3f8de75f5b370fb253ca4205f25ed1491` |
### Built distribution: dbjobq-0.1.0-py3-none-any.whl

- Size: 22.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.2.0 CPython/3.12.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `d85ff223fb7302500ed45dd6f613f4ef4c72c168104d1b803a6cfbcf15f832e5` |
| MD5 | `24ced5baa75e0ca5942801a3e874062d` |
| BLAKE2b-256 | `4b1e6f874d74b23cbe8dc817a72560867f180c429e30ca684615232527dbc7aa` |