Pgqueuer is a Python library leveraging PostgreSQL for efficient job queuing.
🚀 PGQueuer - Building Smoother Workflows One Queue at a Time 🚀
📚 Documentation: Explore the Docs
🔍 Source Code: View on GitHub
💬 Join the Discussion: Discord Community
PGQueuer is a minimalist, high-performance job queue library for Python, leveraging PostgreSQL's robustness. Designed with simplicity and efficiency in mind, PGQueuer offers real-time, high-throughput processing for background jobs using PostgreSQL's `LISTEN`/`NOTIFY` and `FOR UPDATE SKIP LOCKED` mechanisms.
Features
- 💡 Simple Integration: PGQueuer seamlessly integrates with any Python application using PostgreSQL, providing a clean and lightweight interface.
- ⚛️ Efficient Concurrency Handling: Supports `FOR UPDATE SKIP LOCKED` to ensure reliable concurrency control and smooth job processing without contention.
- 🛠️ Real-time Notifications: Uses PostgreSQL's `LISTEN` and `NOTIFY` commands to trigger real-time job status updates.
- 👨‍💼 Batch Processing: Supports large job batches, optimizing enqueueing and dequeuing with minimal overhead.
- ⏳ Graceful Shutdowns: Built-in signal handling ensures safe job processing shutdown without data loss.
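The `FOR UPDATE SKIP LOCKED` pattern is what lets many workers dequeue from one table without blocking each other: each worker locks the rows it claims, and competing workers skip locked rows instead of waiting on them. Below is a toy in-memory sketch of that semantics; PGQueuer does this in SQL, and the names `claim_jobs`, `locked`, and `batch_size` here are purely illustrative.

```python
# Toy illustration of SKIP LOCKED semantics: each worker claims the first
# unlocked jobs and skips any job already locked by another worker,
# rather than blocking on it.
def claim_jobs(jobs: list[int], locked: set[int], batch_size: int) -> list[int]:
    """Claim up to batch_size unlocked job ids, locking each one claimed."""
    claimed = []
    for job_id in jobs:
        if job_id in locked:
            continue  # SKIP LOCKED: don't wait, move on to the next row
        locked.add(job_id)  # FOR UPDATE: lock the row we claimed
        claimed.append(job_id)
        if len(claimed) == batch_size:
            break
    return claimed

jobs = [1, 2, 3, 4, 5]
locked = set()
print(claim_jobs(jobs, locked, 2))  # worker A claims [1, 2]
print(claim_jobs(jobs, locked, 2))  # worker B skips 1 and 2, claims [3, 4]
```

In real PostgreSQL the "lock" is a row lock held for the claiming transaction, so a crashed worker releases its jobs automatically on rollback.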
Installation
Install PGQueuer via pip:
```shell
pip install pgqueuer
```
Quick Start
Below is a minimal example of how to use PGQueuer to process data.
Step 1: Consumer - Run the Worker
Start a consumer to process incoming jobs as soon as they are enqueued. PGQueuer ensures graceful shutdowns using pre-configured signal handlers.
```python
import asyncpg

from pgqueuer.db import AsyncpgDriver, dsn
from pgqueuer.models import Job
from pgqueuer.qm import QueueManager


async def main() -> QueueManager:
    connection = await asyncpg.connect(dsn())
    driver = AsyncpgDriver(connection)
    qm = QueueManager(driver)

    @qm.entrypoint("fetch")
    async def process_message(job: Job) -> None:
        print(f"Processed message: {job}")

    return qm
```
Run the consumer:
```shell
pgq run examples.consumer.main
```
Step 2: Producer - Add Jobs to Queue
Now, produce jobs that will be processed by the consumer. Below is a simple script to enqueue 10,000 jobs.
```python
import asyncio
import sys

import asyncpg

from pgqueuer.db import AsyncpgDriver
from pgqueuer.queries import Queries


async def main(N: int) -> None:
    connection = await asyncpg.connect()
    driver = AsyncpgDriver(connection)
    queries = Queries(driver)
    await queries.enqueue(
        ["fetch"] * N,  # entrypoint for each job
        [f"this is from me: {n}".encode() for n in range(1, N + 1)],  # payloads
        [0] * N,  # priorities
    )


if __name__ == "__main__":
    # Number of jobs is taken from the command line, e.g. 10000.
    asyncio.run(main(int(sys.argv[1])))
```
Run the producer:
```shell
python3 examples/producer.py 10000
```
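Note that payloads travel through the queue as raw bytes: the producer calls `.encode()` on each string, and the consumer is responsible for decoding them inside its entrypoint. A minimal round-trip, independent of any queue:

```python
# Payloads are raw bytes end to end: the producer encodes,
# the consumer decodes.
payloads = [f"this is from me: {n}".encode() for n in range(1, 4)]

decoded = [p.decode() for p in payloads]
print(decoded)  # ['this is from me: 1', 'this is from me: 2', 'this is from me: 3']
```

Using bytes keeps PGQueuer agnostic about serialization, so you can use JSON, msgpack, or any format you like.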
Dashboard
Monitor job processing statistics in real-time using the built-in dashboard:
```shell
pgq dashboard --interval 10 --tail 25 --table-format grid
```
This provides a real-time, refreshing view of job queues and their status.
Example output:
```
+---------------------------+-------+------------+--------------------------+------------+----------+
| Created                   | Count | Entrypoint | Time in Queue (HH:MM:SS) | Status     | Priority |
+---------------------------+-------+------------+--------------------------+------------+----------+
| 2024-05-05 16:44:26+00:00 | 49    | sync       | 0:00:01                  | successful | 0        |
...
+---------------------------+-------+------------+--------------------------+------------+----------+
```
Why Choose PGQueuer?
- Built for Scale: Handles thousands of jobs per second, making it ideal for high-throughput applications.
- PostgreSQL Native: Utilizes advanced PostgreSQL features for robust job handling.
- Flexible Concurrency: Offers rate and concurrency limiting to cater to different use-cases, from bursty workloads to critical resource-bound tasks.
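The rate limiting mentioned above is configured on PGQueuer's entrypoints (see its documentation for the exact parameter names). Conceptually, it amounts to an admission check per time window; the sketch below is a minimal fixed-window limiter in plain Python, not PGQueuer's implementation, with the clock passed in explicitly so the logic is easy to follow.

```python
# Toy fixed-window rate limiter: allow at most `limit` jobs per
# one-second window. Only the admission logic is shown.
class WindowRateLimiter:
    def __init__(self, limit: int) -> None:
        self.limit = limit
        self.window: int | None = None
        self.count = 0

    def allow(self, now: float) -> bool:
        window = int(now)  # one-second windows
        if window != self.window:
            self.window, self.count = window, 0  # new window, reset counter
        if self.count < self.limit:
            self.count += 1
            return True
        return False  # over the limit for this window

limiter = WindowRateLimiter(limit=2)
print([limiter.allow(t) for t in (0.0, 0.1, 0.2, 1.0)])
# [True, True, False, True]
```

A production limiter would typically smooth bursts (e.g. a token bucket), but the window counter captures the core idea behind a requests-per-second cap.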
Community & Support
Join our Discord Community to discuss PGQueuer, ask questions, and get support from other developers.
License
PGQueuer is MIT licensed. See LICENSE for more information.
Ready to supercharge your workflows? Install PGQueuer today and take your job management to the next level!