
qler

Background jobs without Redis, with first-class debugging.

qler is an async-first background job queue for Python, built on SQLite via sqler.

Install

uv add qler

Quick Start

Define a task, enqueue it, and run a worker:

import asyncio
from qler import Queue, task, Worker

queue = Queue("jobs.db")

@task(queue, max_retries=3)
async def send_email(to: str, subject: str, body: str):
    # your email sending logic here
    print(f"Sending to {to}: {subject}")
    return {"sent": True}

async def main():
    # Enqueue a job
    job = await send_email.enqueue(
        to="user@example.com",
        subject="Hello",
        body="Welcome!",
    )
    print(f"Enqueued job {job.ulid}")

    # Start a worker to process jobs
    worker = Worker(queue, queues=["default"], concurrency=4)
    await worker.run()

asyncio.run(main())
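Because the queue lives in SQLite, workers can claim jobs with a plain compare-and-swap UPDATE instead of a Redis lock. The sketch below shows that pattern conceptually; the schema and column names are hypothetical, not qler's actual tables (those live in sqler):

```python
import sqlite3

# Hypothetical minimal schema; qler's real tables differ.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE jobs (id INTEGER PRIMARY KEY, status TEXT, payload TEXT)"
)
conn.execute("INSERT INTO jobs (status, payload) VALUES ('pending', '{}')")
conn.commit()

def claim_one(conn):
    """Claim the oldest pending job via compare-and-swap on status."""
    row = conn.execute(
        "SELECT id FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1"
    ).fetchone()
    if row is None:
        return None
    cur = conn.execute(
        # The status check makes the update a no-op if another
        # worker claimed the job between SELECT and UPDATE.
        "UPDATE jobs SET status = 'running' WHERE id = ? AND status = 'pending'",
        (row[0],),
    )
    conn.commit()
    return row[0] if cur.rowcount == 1 else None

first = claim_one(conn)   # claims job 1
second = claim_one(conn)  # nothing left to claim
```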

CLI

qler ships with a CLI for managing queues and jobs:

# Initialize a database
qler init --db jobs.db

# Start a worker (--app points to your Queue instance)
qler worker --app myapp.queue --queues default --concurrency 4

# Check queue status
qler status --db jobs.db

# List jobs (with optional filters)
qler jobs --db jobs.db --status failed --limit 10

# Inspect a specific job
qler job <ULID> --db jobs.db

# View attempt history
qler attempts <ULID> --db jobs.db

# Retry failed jobs
qler retry --db jobs.db --all

# Cancel pending jobs
qler cancel --db jobs.db --all

# Purge old completed jobs
qler purge --db jobs.db --older-than 7d

# Health check
qler doctor --db jobs.db

# All commands support --json for machine-readable output
qler status --db jobs.db --json

Testing

qler provides two test-friendly execution modes:

Immediate Mode

Queue(immediate=True) executes jobs inline during enqueue() — no worker needed:

import asyncio
from qler import Queue, task, JobStatus

async def test_email_task():
    queue = Queue(":memory:", immediate=True)

    @task(queue)
    async def send_email(to: str):
        return {"sent_to": to}

    job = await send_email.enqueue(to="test@example.com")

    assert job.status == JobStatus.COMPLETED.value
    assert job.result == {"sent_to": "test@example.com"}

Direct Execution

task.run_now() calls the function directly without touching the database:

result = await send_email.run_now(to="test@example.com")
assert result == {"sent_to": "test@example.com"}

Configuration

Queue Options

queue = Queue(
    "jobs.db",
    immediate=False,              # Execute inline on enqueue (for testing)
    default_lease_duration=300,   # Worker lease timeout in seconds
    default_max_retries=0,        # Default retry count for tasks
    default_retry_delay=60,       # Base retry delay in seconds (exponential backoff)
    max_payload_size=1_000_000,   # Max payload size in bytes
)
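With exponential backoff, the retry delay doubles with each failed attempt. A common formulation is `base * 2 ** attempt`; this is an assumption for illustration, and qler's exact schedule may differ:

```python
def retry_delay(base: int, attempt: int) -> int:
    """Exponential backoff: base delay doubled per prior attempt."""
    return base * 2 ** attempt

# With default_retry_delay=60, the first three retries would wait:
delays = [retry_delay(60, n) for n in range(3)]
print(delays)  # [60, 120, 240]
```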

Task Options

@task(
    queue,
    queue_name="emails",     # Route to a specific queue
    max_retries=3,           # Override default retry count
    retry_delay=30,          # Override default retry delay
    priority=10,             # Higher priority = claimed first
    lease_duration=600,      # Override default lease timeout
    sync=True,               # For sync functions (runs via asyncio.to_thread)
)
def cpu_bound_task(data):
    return process(data)
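The `sync=True` path hands the blocking function to `asyncio.to_thread` so it does not stall the worker's event loop. A stdlib sketch of that mechanism (not qler's internals):

```python
import asyncio
import time

def blocking_work(n: int) -> int:
    """A blocking function, as you would decorate with sync=True."""
    time.sleep(0.01)  # stands in for real blocking work
    return n * 2

async def run_sync_task(n: int) -> int:
    # to_thread runs the function in a worker thread,
    # keeping the event loop free for other jobs.
    return await asyncio.to_thread(blocking_work, n)

result = asyncio.run(run_sync_task(21))
print(result)  # 42
```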

Enqueue Options

job = await my_task.enqueue(
    arg1, arg2,
    _delay=60,                         # Delay execution by N seconds
    _eta=1700000000,                   # Execute at specific epoch timestamp
    _priority=5,                       # Override task default priority
    _idempotency_key="order:123",      # Deduplicate by key
    _correlation_id="req-abc-123",     # Link related jobs for debugging
)
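An idempotency key typically maps to a unique index on the jobs table, so enqueueing the same key twice is a no-op. A conceptual sqlite3 sketch with a hypothetical schema (not qler's actual dedup logic):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE jobs ("
    " id INTEGER PRIMARY KEY,"
    " idempotency_key TEXT UNIQUE,"  # duplicate keys are rejected
    " payload TEXT)"
)

def enqueue(conn, key, payload):
    """Insert a job unless one with the same idempotency key exists."""
    cur = conn.execute(
        "INSERT OR IGNORE INTO jobs (idempotency_key, payload) VALUES (?, ?)",
        (key, payload),
    )
    conn.commit()
    return cur.rowcount == 1  # True only if a new row was inserted

first = enqueue(conn, "order:123", '{"total": 10}')
second = enqueue(conn, "order:123", '{"total": 10}')
print(first, second)  # True False
```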

The -ler Ecosystem

| Package | Purpose |
| ------- | ------- |
| sqler | SQLite ORM (qler's storage layer) |
| qler | Background job queue |
| logler | Log aggregation with correlation IDs |

License

MIT
