ElephantQ - PostgreSQL-only async job queue - built for developer happiness.

ElephantQ

Background jobs for Python. Powered by the Postgres you already have.

Quickstart

pip install elephantq

# jobs.py
from elephantq import ElephantQ

app = ElephantQ(database_url="postgresql://localhost/myapp")

@app.job(max_retries=3)
async def send_welcome(to: str):
    print(f"Sending welcome email to {to}")

# enqueue from an async context anywhere in your app
await app.enqueue(send_welcome, to="dev@example.com")

# set up tables and start processing
elephantq setup
elephantq start --concurrency 4

Four steps. Define a job, enqueue it, set up the database, start a worker.

Local dev without Postgres? Use SQLite: ElephantQ(database_url="local.db"). For production, always use PostgreSQL.

Transactional enqueue

Enqueue a job inside your database transaction. If the transaction rolls back, the job never existed.

async with pool.acquire() as conn:
    async with conn.transaction():
        await conn.execute("INSERT INTO orders ...")
        await app.enqueue(send_invoice, connection=conn, order_id=order_id)
        # Both commit together, or neither does

No Redis queue can do this. Your job and your data land in the same commit. If something fails halfway through, both roll back. No stale jobs, no ghost tasks, no cleanup scripts.
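The guarantee above falls out of a simple fact: the enqueued job is just a row written on the same connection, inside the same transaction, as your data. A minimal sqlite3 sketch of the underlying idea (illustrative schema, not ElephantQ's actual tables):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER)")
conn.execute("CREATE TABLE jobs (name TEXT)")

try:
    with conn:  # one transaction: commit on success, rollback on error
        conn.execute("INSERT INTO orders VALUES (1)")
        conn.execute("INSERT INTO jobs VALUES ('send_invoice')")
        raise RuntimeError("something failed halfway")
except RuntimeError:
    pass

# Both inserts rolled back together: no order row, no ghost job.
print(conn.execute("SELECT COUNT(*) FROM jobs").fetchone()[0])  # 0
```

A broker-backed queue cannot join your database transaction, which is why the Redis equivalent needs outbox tables or cleanup jobs to approximate this.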

Why ElephantQ

Most Python job queues force you to run Redis or RabbitMQ alongside your database. That's another service to deploy, monitor, back up, and debug when things go wrong at 3am.

ElephantQ uses your existing PostgreSQL. One dependency. One place your data lives. One thing to back up.

Feature                  ElephantQ   Celery        RQ
No Redis dependency      Yes         No            No
Async native             Yes         Partial      No
Transactional enqueue    Yes         No            No
Setup complexity         Low         High          Medium
Built-in dashboard       Yes         No (Flower)   No
Dead-letter queue        Yes         No            No

Features

  • Retries with backoff -- configurable delays, exponential backoff, per-attempt delay lists
  • Dead-letter queue -- failed jobs preserved for inspection and manual retry
  • Job priorities -- lower number = higher priority, processed first
  • Scheduled jobs -- run at a specific time or after a delay
  • Recurring jobs -- cron-based periodic tasks with @app.periodic(cron="0 * * * *")
  • Transactional enqueue -- atomic with your database writes
  • Multiple queues -- route jobs by type, run dedicated workers per queue
  • Middleware hooks -- before_job, after_job, on_error for logging, metrics, tracing
  • Worker heartbeat -- auto-detect crashed workers, requeue their jobs
  • Job results -- store and retrieve return values from completed jobs
  • Deduplication -- prevent duplicate jobs with dedup_key or unique=True
  • CLI -- setup, start, status, workers, dead-letter management
  • Dashboard -- web UI for monitoring queues, workers, and job state
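As a rough illustration of the retry scheduling above, a per-attempt delay list can be produced with capped exponential backoff. This is a sketch of the idea only; the function name, formula, and parameters here are assumptions, since ElephantQ's actual delays are configurable:

```python
def backoff_delays(max_retries: int, base: float = 1.0, cap: float = 300.0) -> list[float]:
    """Delay before each retry attempt: base * 2^(attempt - 1), capped at `cap` seconds."""
    return [min(base * 2 ** (attempt - 1), cap) for attempt in range(1, max_retries + 1)]

# With max_retries=3, as in the quickstart: retry after 1s, then 2s, then 4s.
print(backoff_delays(3))  # [1.0, 2.0, 4.0]
```

The cap matters in practice: without it, a job retried ten times would wait over eight minutes on its final attempt.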

Dashboard

Monitor queues, workers, retries, and system health from a built-in web UI.

pip install elephantq[dashboard]
elephantq dashboard

(Screenshot: the ElephantQ dashboard)

Install

pip install elephantq              # core (Postgres backend)
pip install elephantq[full]        # everything below
pip install elephantq[sqlite]      # SQLite backend for local dev
pip install elephantq[scheduling]  # cron-based recurring jobs
pip install elephantq[dashboard]   # web dashboard
pip install elephantq[monitoring]  # Prometheus metrics
pip install elephantq[webhooks]    # webhook delivery + signing

When NOT to use ElephantQ

  • You need 10k+ jobs/sec sustained throughput. PostgreSQL row locking has limits. Redis-backed queues like Celery or Arq are built for this.
  • You need cross-language consumers. ElephantQ is Python-only. If your workers are in Go or Node, use RabbitMQ or a similar broker.
  • You're not using PostgreSQL. The production backend requires PostgreSQL. If your stack is MySQL or MongoDB, this isn't for you.
  • You need DAG-based workflow orchestration. ElephantQ handles individual jobs, not pipelines. Look at Prefect or Airflow.

Documentation

License

MIT
