Soniq
Background jobs for Python. Powered by the Postgres you already have. Nothing else to maintain.
Quickstart
```shell
pip install soniq
```

```python
# jobs.py
import asyncio

from soniq import Soniq

app = Soniq(database_url="postgresql://localhost/myapp")

@app.job()
async def send_welcome(to: str):
    print(f"Sending welcome email to {to}")

if __name__ == "__main__":
    asyncio.run(app.enqueue(send_welcome, to="dev@example.com"))
```

```shell
soniq setup                                            # one-time: create tables
SONIQ_JOBS_MODULES=jobs soniq worker --concurrency 4   # run a worker
python jobs.py                                         # enqueue
```
Four steps: define a job, set up the database, run a worker, enqueue. SONIQ_JOBS_MODULES tells the worker which modules to import so it can find your @app.job definitions.
Transactional enqueue
The reason most teams choose a Postgres-backed queue. Enqueue a job inside the same transaction as your business writes - if the transaction rolls back, the job never existed:
```python
# Borrow a connection from Soniq's asyncpg pool. Any active asyncpg
# connection works here; it does not have to be Soniq's pool. If your
# app already has its own pool (or a SQLAlchemy session), pass that
# connection instead - see docs/guides/transactional-enqueue.md.
async with app.backend.acquire() as conn:
    async with conn.transaction():
        # Your business write. The order row only becomes visible once
        # this transaction commits.
        await conn.execute(
            "INSERT INTO orders (id, total) VALUES ($1, $2)",
            order_id, total,
        )

        # Same connection -> same transaction. The job row goes into
        # soniq_jobs as part of *this* COMMIT, not a separate one.
        # connection=conn is the only thing that differs from a normal
        # enqueue() call.
        await app.enqueue(
            send_invoice,
            connection=conn,
            order_id=order_id,
        )

# If anything inside the transaction block raises, both writes roll back
# together. The order is never created without the follow-up job, and the
# job is never created for an order that does not exist.
```
No Redis-backed queue can do this - their writes happen on a different system, so you need an outbox table and a drain process to keep them in sync. Soniq's job table lives in your Postgres, so a single transaction covers both.
Soniq is at-least-once, not exactly-once: a worker can crash after running your handler but before marking the row done, and the heartbeat sweep will requeue it. Handlers should be idempotent. See docs/guides/cross-service-jobs.md for the full delivery-semantics details.
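To make the idempotency requirement concrete, here is a minimal sketch of a handler that tolerates a duplicate run. The in-memory `processed` set is a hypothetical stand-in for a durable dedup store (for example, a Postgres table with a UNIQUE constraint on the business key); none of this is Soniq API.

```python
import asyncio

# Hypothetical stand-in for a durable dedup store (e.g. a Postgres table
# with a UNIQUE constraint on order_id). Illustrative only, not Soniq API.
processed: set[str] = set()
sent: list[str] = []

async def send_invoice(order_id: str) -> None:
    # At-least-once delivery: this handler may run twice for the same job
    # if a worker crashes after the side effect but before the row is
    # marked done. Guard on a stable business key so the retry is a no-op.
    if order_id in processed:
        return
    processed.add(order_id)
    sent.append(order_id)  # the real side effect (send email, charge card, ...)

asyncio.run(send_invoice("order-42"))
asyncio.run(send_invoice("order-42"))  # requeued duplicate: harmless
```

In a real handler the guard would be the database itself (an `INSERT ... ON CONFLICT DO NOTHING` on the business key), so the check survives worker restarts.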
Why Soniq
Most Python job queues force you to run Redis or RabbitMQ alongside your database. That is another service to deploy, monitor, back up, and debug when things go wrong at 3am.
Soniq uses your existing PostgreSQL. One dependency. One place your data lives. One thing to back up.
| Feature | Soniq | Celery | RQ |
|---|---|---|---|
| No Redis / broker dependency | Yes | No | No |
| Async native | Yes | Partial | No |
| Transactional enqueue | Yes | No | No |
| Setup complexity | Low | High | Medium |
| Built-in dashboard | Yes | No (Flower) | No |
| Dead-letter queue | Yes | No | No |
When NOT to use Soniq
- You need 10k+ jobs/sec sustained throughput. PostgreSQL row locking has limits. Redis-backed queues like Celery or Arq are built for this.
- You need cross-language workers. Soniq is Python-only. If your workers are in Go or Node, use RabbitMQ or similar.
- You are not using PostgreSQL. The production backend requires PostgreSQL.
- You need DAG-based workflow orchestration. Soniq runs individual jobs, not pipelines. Look at Prefect or Airflow.
Features
- Retries with backoff - configurable delays, exponential backoff, per-attempt delay lists
- Dead-letter queue - failed jobs preserved for inspection and manual replay
- Job priorities - lower number = higher priority, processed first
- Scheduled jobs - run at a specific time or after a delay
- Recurring jobs - cron-based recurring schedules with
@app.periodic(cron="0 * * * *") - Transactional enqueue - atomic with your database writes
- Multiple queues - route jobs by type, run dedicated workers per queue
- Middleware hooks -
before_job,after_job,on_errorfor logging, metrics, tracing - Worker heartbeat - auto-detect crashed workers, requeue their jobs
- Deduplication - prevent duplicate jobs with
dedup_keyorunique=True - CLI + dashboard -
setup,worker,scheduler,status,inspect, dead-letter management; web UI
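To illustrate the retry options above, here is a generic sketch of how an exponential-backoff schedule with a cap can be computed. This is plain Python showing the concept, not Soniq's internals; the function and parameter names are illustrative.

```python
def backoff_delays(base: float = 1.0, factor: float = 2.0,
                   attempts: int = 5, cap: float = 60.0) -> list[float]:
    """Per-attempt retry delays in seconds: base * factor**n, capped at `cap`."""
    return [min(base * factor ** n, cap) for n in range(attempts)]

# First retry after 1s, then 2s, 4s, 8s, 16s.
print(backoff_delays())            # [1.0, 2.0, 4.0, 8.0, 16.0]
# With more attempts the cap kicks in and the tail flattens at 60s.
print(backoff_delays(attempts=8))  # [1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 60.0, 60.0]
```

A precomputed list like this is also what a "per-attempt delay list" amounts to: instead of deriving each delay, you hand the queue the explicit schedule.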
Dashboard
A built-in web dashboard for inspecting jobs, queues, and recent failures. Read-only by default; opt in to retry/cancel/delete actions with SONIQ_DASHBOARD_WRITE_ENABLED=true (which also requires SONIQ_DASHBOARD_API_KEY as a safety interlock).
```shell
pip install "soniq[dashboard]"
soniq dashboard   # binds 127.0.0.1:6161
```
Install extras
```shell
pip install soniq              # core + scheduler + Prometheus sink (Postgres backend)
pip install "soniq[full]"      # everything below
pip install "soniq[dashboard]" # web dashboard (FastAPI + uvicorn)
pip install "soniq[webhooks]"  # webhook delivery + signing
pip install "soniq[logging]"   # structlog integration
```
The default install is batteries-included: croniter (so @periodic and the recurring scheduler work out of the box) and prometheus_client (so PrometheusMetricsSink is importable) ship with core. They stay dormant unless wired - the scheduler only runs if you start it, and the default MetricsSink is NoopMetricsSink.
Documentation
- Quickstart
- Tutorial: defining jobs
- FastAPI integration
- Going to production
- Deployment
- CLI reference
- API reference
For AI coding agents
- AGENTS.md - canonical patterns, anti-patterns, and the four mistakes agents most often make.
- docs/llms.txt - curated index of the canonical pages, following the llms.txt convention.
- docs/llms-full.txt - the six canonical pages concatenated for one-shot context loading.
License
MIT
Project details
Download files
File details
Details for the file soniq-0.0.2.tar.gz.
File metadata
- Download URL: soniq-0.0.2.tar.gz
- Upload date:
- Size: 129.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e5b04c718e93ccbcaa44d8cfaa766801f184f793b07fd59b77dddeef22345adc |
| MD5 | 21c31b259355e88f14be83e30f2bb6d0 |
| BLAKE2b-256 | 746d99f15bcd6dc58f6c622a9e50c5026b16d59caaf61834fff18e4f8a519f46 |
Provenance
The following attestation bundles were made for soniq-0.0.2.tar.gz:
Publisher: publish.yml on abhinavs/soniq
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: soniq-0.0.2.tar.gz
- Subject digest: e5b04c718e93ccbcaa44d8cfaa766801f184f793b07fd59b77dddeef22345adc
- Sigstore transparency entry: 1433772764
- Sigstore integration time:
- Permalink: abhinavs/soniq@5f463159f803585c65ac1c8d41b0d9c5ea28cbb5
- Branch / Tag: refs/tags/v0.0.2
- Owner: https://github.com/abhinavs
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5f463159f803585c65ac1c8d41b0d9c5ea28cbb5
- Trigger Event: push
File details
Details for the file soniq-0.0.2-py3-none-any.whl.
File metadata
- Download URL: soniq-0.0.2-py3-none-any.whl
- Upload date:
- Size: 146.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 759632a3c07872c5994d96912447d386ea464eb77cf35d201e174e541e9430f9 |
| MD5 | d2b1b8b94ed0eff289e6015b7a3edb42 |
| BLAKE2b-256 | 1688b4755b06ab416273c5dd98a0c2cefa45e6f7b6481ce735c8fd5bab64a8d8 |
Provenance
The following attestation bundles were made for soniq-0.0.2-py3-none-any.whl:
Publisher: publish.yml on abhinavs/soniq
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: soniq-0.0.2-py3-none-any.whl
- Subject digest: 759632a3c07872c5994d96912447d386ea464eb77cf35d201e174e541e9430f9
- Sigstore transparency entry: 1433772829
- Sigstore integration time:
- Permalink: abhinavs/soniq@5f463159f803585c65ac1c8d41b0d9c5ea28cbb5
- Branch / Tag: refs/tags/v0.0.2
- Owner: https://github.com/abhinavs
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5f463159f803585c65ac1c8d41b0d9c5ea28cbb5
- Trigger Event: push