brq

This project is inspired by arq. It is not meant to divide the community; I needed a Redis queue built on Redis Streams for work and decided to open source it.

You may also want to consider arq, which positions itself more as a library: https://github.com/samuelcolvin/arq/issues/437

Architecture diagram: Architecture.png

Prerequisites

Redis >= 6.2, tested against the latest Redis 6 and 7 Docker images. Redis >= 7 is recommended, as it provides more inspection features.

Install

pip install brq

Features

See the examples directory for runnable examples.

  • Deferred jobs and automatic retry of failed jobs
  • Dead-letter queue for unprocessable jobs, so they can be reprocessed later
  • Multiple consumers in one consumer group
  • No scheduler needed; consumers handle scheduling themselves

Echo job overview

Producer

import os

from brq.producer import Producer
from brq.tools import get_redis_client, get_redis_url


async def main():
    # Build the Redis connection URL from environment variables.
    redis_url = get_redis_url(
        host=os.getenv("REDIS_HOST", "localhost"),
        port=int(os.getenv("REDIS_PORT", 6379)),
        db=int(os.getenv("REDIS_DB", 0)),
        cluster=bool(os.getenv("REDIS_CLUSTER", "false") in ["True", "true", "1"]),
        tls=bool(os.getenv("REDIS_TLS", "false") in ["True", "true", "1"]),
        username=os.getenv("REDIS_USERNAME", ""),
        password=os.getenv("REDIS_PASSWORD", ""),
    )
    async with get_redis_client(redis_url) as async_redis_client:
        # Enqueue an "echo" job with a single positional argument.
        await Producer(async_redis_client).run_job("echo", ["hello"])


if __name__ == "__main__":
    import asyncio

    asyncio.run(main())

Consumer

import os

from brq.consumer import Consumer
from brq.daemon import Daemon
from brq.tools import get_redis_client, get_redis_url


async def echo(message):
    # The job function; brq dispatches "echo" jobs to it by function name.
    print(message)


async def main():
    # Build the Redis connection URL from environment variables.
    redis_url = get_redis_url(
        host=os.getenv("REDIS_HOST", "localhost"),
        port=int(os.getenv("REDIS_PORT", 6379)),
        db=int(os.getenv("REDIS_DB", 0)),
        cluster=bool(os.getenv("REDIS_CLUSTER", "false") in ["True", "true", "1"]),
        tls=bool(os.getenv("REDIS_TLS", "false") in ["True", "true", "1"]),
        username=os.getenv("REDIS_USERNAME", ""),
        password=os.getenv("REDIS_PASSWORD", ""),
    )
    async with get_redis_client(redis_url) as async_redis_client:
        # Run the consumer inside a daemon that processes jobs until interrupted.
        daemon = Daemon(Consumer(async_redis_client, echo))
        await daemon.run_forever()


if __name__ == "__main__":
    import asyncio

    asyncio.run(main())
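
Run the consumer in one process and the producer in another; the consumer picks up the "echo" job and prints hello.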

Technical details: deferred jobs

You can pass defer_until as a datetime, or defer_hours + defer_minutes + defer_seconds to compute a timestamp relative to the current Redis timestamp. The unique flag controls whether the job is treated as unique.

By default, unique=True: every submitted Job is treated as distinct even if it has exactly the same function_name, args and kwargs, so the same logical Job can be added to the deferred queue more than once. Jobs are differentiated by the current Redis timestamp (Job.create_at) plus an additional uuid (Job.uid), much like Redis Stream entry IDs.

If unique=False, the same Job will only be added to the deferred queue once; adding a duplicate simply updates the job's defer time. In this case, you can put your own uuid in args (or kwargs) to differentiate Jobs.
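
For illustration, a minimal sketch of deferring jobs from a producer, assuming Producer.run_job accepts the defer_until, defer_minutes-style and unique keyword arguments described above (check the examples directory for the exact signature):

import os
from datetime import datetime, timedelta, timezone

from brq.producer import Producer
from brq.tools import get_redis_client, get_redis_url


async def main():
    # Build the Redis connection URL the same way as in the producer example above.
    redis_url = get_redis_url(
        host=os.getenv("REDIS_HOST", "localhost"),
        port=int(os.getenv("REDIS_PORT", 6379)),
        db=int(os.getenv("REDIS_DB", 0)),
        cluster=bool(os.getenv("REDIS_CLUSTER", "false") in ["True", "true", "1"]),
        tls=bool(os.getenv("REDIS_TLS", "false") in ["True", "true", "1"]),
        username=os.getenv("REDIS_USERNAME", ""),
        password=os.getenv("REDIS_PASSWORD", ""),
    )
    async with get_redis_client(redis_url) as async_redis_client:
        producer = Producer(async_redis_client)

        # Defer by a relative offset, computed against the current Redis timestamp.
        # (Keyword names follow the parameters described above; verify them against
        # the examples directory.)
        await producer.run_job("echo", ["hello in five minutes"], defer_minutes=5)

        # Defer until an absolute datetime.
        await producer.run_job(
            "echo",
            ["hello at a fixed time"],
            defer_until=datetime.now(timezone.utc) + timedelta(hours=1),
        )

        # With unique=False, re-adding a job with the same function_name, args and
        # kwargs only updates its defer time instead of creating a new entry.
        await producer.run_job("echo", ["deduplicated"], defer_minutes=5, unique=False)


if __name__ == "__main__":
    import asyncio

    asyncio.run(main())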

Develop

Install pre-commit before committing

pip install pre-commit
pre-commit install

Install package locally

pip install -e .[test]

Run unit tests before opening a PR

pytest -v
