dbbasic-queue

TSV-based job queue for async tasks. Simple, reliable background job processing.

Philosophy

"Store work, not workers. Queue jobs, not processes."

Background jobs are actual work to be done, not temporary state. Unlike sessions (which are ephemeral authentication state), jobs need persistent storage, retry logic, and failure handling.

Features

  • Simple: ~50 lines of core code
  • Reliable: Retry logic with exponential backoff
  • Debuggable: Plain text TSV, inspect with cat/grep
  • Unix-Compatible: Cron-based workers, no daemon required
  • Foundation-First: Built on dbbasic-tsv
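
The retry feature above mentions exponential backoff, which typically doubles the wait between attempts. A minimal sketch of such a schedule — the base delay and formula here are illustrative assumptions, not the library's actual values:

```python
def backoff_delay(attempt, base=60):
    """Seconds to wait before retry number `attempt` (1-based): base * 2^(attempt-1)."""
    return base * 2 ** (attempt - 1)

# First three retries: 60s, 120s, 240s.
schedule = [backoff_delay(a) for a in range(1, 4)]
```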

Installation

pip install dbbasic-queue

Quick Start

1. Queue a job

from dbbasic_queue import enqueue

# Queue an email to be sent
job_id = enqueue('send_email', {
    'to': 'user@example.com',
    'subject': 'Welcome',
    'body': 'Thanks for signing up!'
})

2. Create a worker

# workers/queue_worker.py
import time

from dbbasic_queue import process_jobs

def send_email_handler(payload):
    """Send email via SMTP"""
    # ... send email logic
    return {'sent_at': time.time()}

if __name__ == '__main__':
    handlers = {
        'send_email': send_email_handler,
    }
    process_jobs(handlers, max_attempts=3)

3. Set up cron

# Run worker every minute
* * * * * cd /app && python3 workers/queue_worker.py >> /var/log/queue.log 2>&1
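
Conceptually, each cron run scans for pending jobs whose run_at has passed, dispatches each to its handler, and records success or a retry. A self-contained sketch over an in-memory job list — field names follow the Storage Format section below, but the control flow is an assumption, not the library's actual code:

```python
import time

def dispatch_jobs(jobs, handlers, max_attempts=3):
    """Illustrative dispatch loop over job dicts (columns from the TSV schema)."""
    now = time.time()
    for job in jobs:
        if job['status'] != 'pending' or job['run_at'] > now:
            continue
        handler = handlers[job['type']]
        try:
            job['result'] = handler(job['payload'])
            job['status'] = 'done'
        except Exception as exc:
            job['attempts'] += 1
            job['error'] = str(exc)
            job['status'] = 'failed' if job['attempts'] >= max_attempts else 'pending'

jobs = [{'id': '1', 'type': 'send_email', 'payload': {'to': 'a@b.c'},
         'status': 'pending', 'run_at': 0, 'attempts': 0, 'error': '', 'result': None}]
dispatch_jobs(jobs, {'send_email': lambda payload: {'sent': True}})
# jobs[0]['status'] is now 'done'
```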

API Reference

enqueue(job_type, payload, run_at=None)

Add a job to the queue.

  • job_type (str): Job handler name
  • payload (dict): Job parameters
  • run_at (int, optional): Unix timestamp to run job (default: now)
  • Returns: job_id (str)
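
Since run_at is a plain Unix timestamp, scheduling a job for later is just timestamp arithmetic; the enqueue call below is shown as a comment because its signature is taken from this reference:

```python
import time

# "One hour from now" as a Unix timestamp.
run_at = int(time.time()) + 3600

# Hypothetical delayed enqueue (signature from the reference above):
# job_id = enqueue('send_email', {'to': 'user@example.com'}, run_at=run_at)
```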

process_jobs(handlers, max_attempts=3)

Process pending jobs (run by worker).

  • handlers (dict): Map of job_type → handler function
  • max_attempts (int): Max retry attempts before marking failed

get_job(job_id)

Get job status and details.

  • job_id (str): Job identifier
  • Returns: job (dict) or None

cancel_job(job_id)

Cancel a pending job.

  • job_id (str): Job identifier
  • Returns: bool (True if cancelled)
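
Together, get_job and cancel_job support a check-then-cancel flow. The sketch below runs against a tiny in-memory stub that mirrors only the documented return types (job dict or None; bool); the real functions operate on data/queue.tsv:

```python
JOBS = {'a1b2': {'id': 'a1b2', 'type': 'send_email', 'status': 'pending'}}

def get_job(job_id):
    # Stub: documented to return the job dict, or None if unknown.
    return JOBS.get(job_id)

def cancel_job(job_id):
    # Stub: documented to return True only when a pending job was cancelled.
    job = JOBS.get(job_id)
    if job is not None and job['status'] == 'pending':
        job['status'] = 'cancelled'
        return True
    return False

job = get_job('a1b2')
cancelled = False
if job is not None and job['status'] == 'pending':
    cancelled = cancel_job('a1b2')  # job is now 'cancelled'
```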

Storage Format

Jobs are stored in a single TSV file: data/queue.tsv

id  type    payload status  created_at  run_at  attempts    error   result
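
Because the file is plain TSV, rows can be parsed with the standard csv module. A sketch assuming the column order shown above and a JSON-encoded payload column — the sample row and the JSON encoding are illustrative assumptions:

```python
import csv
import io
import json

COLUMNS = ['id', 'type', 'payload', 'status', 'created_at',
           'run_at', 'attempts', 'error', 'result']

# Invented sample row matching the header above (error and result empty).
sample = 'a1b2\tsend_email\t{"to": "user@example.com"}\tpending\t1700000000\t1700000000\t0\t\t'

row = next(csv.reader(io.StringIO(sample), delimiter='\t'))
job = dict(zip(COLUMNS, row))
job['payload'] = json.loads(job['payload'])
# job['type'] == 'send_email', job['status'] == 'pending'
```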

Performance

  • Enqueue job: ~0.1 ms
  • Process job: ~0.5 ms
  • Well suited to single-server apps with < 10K queued jobs

When to Use

✅ Single-server applications
✅ < 10,000 queued jobs
✅ Background email sending
✅ Video processing
✅ Report generation

When to Graduate to Redis/Celery

  • 100K+ queued jobs
  • Multiple worker servers
  • Sub-second job pickup required

License

MIT

Full Specification

See http://dbbasic.com/queue-spec for the complete specification.

Download files

Download the file for your platform.

Source Distribution

dbbasic_queue-1.0.0.tar.gz (10.5 kB)

Built Distribution

dbbasic_queue-1.0.0-py3-none-any.whl (6.1 kB)

File details

Details for the file dbbasic_queue-1.0.0.tar.gz.

File metadata

  • Download URL: dbbasic_queue-1.0.0.tar.gz
  • Upload date:
  • Size: 10.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.1

File hashes

Hashes for dbbasic_queue-1.0.0.tar.gz
  • SHA256: 19a481d8a6cd21afdca1149c17b81dd180ffc9402fb767c6b12df1e6abc2e698
  • MD5: 3218da5511e94956d20fb1dc5f32a560
  • BLAKE2b-256: 0239f673315a37c3fa18dad5cb28108372b73437052ea40231ec30b3733220a1


File details

Details for the file dbbasic_queue-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: dbbasic_queue-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 6.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.1

File hashes

Hashes for dbbasic_queue-1.0.0-py3-none-any.whl
  • SHA256: 49757f07cc7e20953253c0c12e87bfdc2bf2735942b766568747b772f6de9df6
  • MD5: 67ad0ba478453264b49d91b802051298
  • BLAKE2b-256: 1e9640624820ed007c4d879f5e5e53ed0b29dc6fd4d6e5689951aa00973007ca

