
brq

Project description

This project is inspired by arq. It is not meant to divide the community; I needed a Redis queue built on Redis Streams for work and decided to open source it.

You may also want to consider arq, which is positioned more as a library: https://github.com/samuelcolvin/arq/issues/437

Architecture diagram: Architecture.png

Prerequisites

Redis >= 6.2, tested with the latest Redis 6 and Redis 7 Docker images.

Install

pip install brq

Features

See the examples directory for runnable examples.

  • Deferred jobs
  • Automatic job retries
  • Dead queue
  • Multiple consumers

Echo job overview

Producer

import os

from brq.producer import Producer
from brq.tools import get_redis_client, get_redis_url


async def main():
    redis_url = get_redis_url(
        host=os.getenv("REDIS_HOST", "localhost"),
        port=int(os.getenv("REDIS_PORT", 6379)),
        db=int(os.getenv("REDIS_DB", 0)),
        cluster=bool(os.getenv("REDIS_CLUSTER", "false") in ["True", "true", "1"]),
        tls=bool(os.getenv("REDIS_TLS", "false") in ["True", "true", "1"]),
        username=os.getenv("REDIS_USERNAME", ""),
        password=os.getenv("REDIS_PASSWORD", ""),
    )
    async with get_redis_client(redis_url) as async_redis_client:
        await Producer(async_redis_client).run_job("echo", ["hello"])


if __name__ == "__main__":
    import asyncio

    asyncio.run(main())
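
The feature list above also mentions deferring jobs. As a minimal sketch, assuming run_job accepts a defer_until keyword for scheduling (the exact signature for deferred jobs lives in the bundled examples, so verify it there), a delayed echo job might be enqueued like this:

import os
from datetime import datetime, timedelta, timezone

from brq.producer import Producer
from brq.tools import get_redis_client, get_redis_url


async def main():
    # Connection options besides host/port/db are omitted here; see the
    # producer example above for the full set.
    redis_url = get_redis_url(
        host=os.getenv("REDIS_HOST", "localhost"),
        port=int(os.getenv("REDIS_PORT", 6379)),
        db=int(os.getenv("REDIS_DB", 0)),
    )
    async with get_redis_client(redis_url) as async_redis_client:
        # NOTE: defer_until is an assumed keyword argument; check the
        # deferred-job example shipped with the project before relying on it.
        await Producer(async_redis_client).run_job(
            "echo",
            ["hello, later"],
            defer_until=datetime.now(timezone.utc) + timedelta(minutes=5),
        )


if __name__ == "__main__":
    import asyncio

    asyncio.run(main())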

Consumer

import os

from brq.consumer import Consumer
from brq.daemon import Daemon
from brq.tools import get_redis_client, get_redis_url


async def echo(message):
    print(message)


async def main():
    redis_url = get_redis_url(
        host=os.getenv("REDIS_HOST", "localhost"),
        port=int(os.getenv("REDIS_PORT", 6379)),
        db=int(os.getenv("REDIS_DB", 0)),
        cluster=bool(os.getenv("REDIS_CLUSTER", "false") in ["True", "true", "1"]),
        tls=bool(os.getenv("REDIS_TLS", "false") in ["True", "true", "1"]),
        username=os.getenv("REDIS_USERNAME", ""),
        password=os.getenv("REDIS_PASSWORD", ""),
    )
    async with get_redis_client(redis_url) as async_redis_client:
        daemon = Daemon(Consumer(async_redis_client, echo))
        await daemon.run_forever()


if __name__ == "__main__":
    import asyncio

    asyncio.run(main())
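
Because jobs are read from a Redis stream, the "Multiple consumers" feature can be used by simply starting the consumer script above in several processes, so that pending jobs are shared between them. The sketch below runs two consumers in a single process instead; it assumes Daemon.run_forever can be awaited concurrently, which the example above does not guarantee:

import asyncio
import os

from brq.consumer import Consumer
from brq.daemon import Daemon
from brq.tools import get_redis_client, get_redis_url


async def echo(message):
    print(message)


async def main():
    redis_url = get_redis_url(
        host=os.getenv("REDIS_HOST", "localhost"),
        port=int(os.getenv("REDIS_PORT", 6379)),
        db=int(os.getenv("REDIS_DB", 0)),
    )
    async with get_redis_client(redis_url) as async_redis_client:
        # Two independent Daemon/Consumer pairs sharing one client; running
        # the consumer script in separate processes achieves the same effect
        # and is the more typical deployment.
        daemons = [
            Daemon(Consumer(async_redis_client, echo)),
            Daemon(Consumer(async_redis_client, echo)),
        ]
        await asyncio.gather(*(daemon.run_forever() for daemon in daemons))


if __name__ == "__main__":
    asyncio.run(main())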

Develop

Install pre-commit before committing

pip install pre-commit
pre-commit install

Install the package locally

pip install -e .[test]

Run the unit tests before opening a PR

pytest -v



Download files

Download the file for your platform.

Source Distribution

brq-0.1.6.tar.gz (70.9 kB)


Built Distribution

brq-0.1.6-py3-none-any.whl (11.8 kB)


File details

Details for the file brq-0.1.6.tar.gz.

File metadata

  • Download URL: brq-0.1.6.tar.gz
  • Upload date:
  • Size: 70.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.4

File hashes

Hashes for brq-0.1.6.tar.gz
  • SHA256: 0e320ba749e8404bd9b8aefbd68f680396f4d9a939e82fd3efc03c60fc13598b
  • MD5: 34c65c2cc0846672e8e797fbf64ec712
  • BLAKE2b-256: 7501fa9455c6a2fa45c9aadf17bf170b5ce99cc0222ad5949797300425097a08


File details

Details for the file brq-0.1.6-py3-none-any.whl.

File metadata

  • Download URL: brq-0.1.6-py3-none-any.whl
  • Upload date:
  • Size: 11.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.4

File hashes

Hashes for brq-0.1.6-py3-none-any.whl
  • SHA256: c482f43b3712f696b9f64a28bcfc788427d9a11c48f8052d95fb60a55959817c
  • MD5: 3444687c851ba7d07b51c63d5075cd9e
  • BLAKE2b-256: 0ab73bcc87db26844792843f1faba0b52a7c1f5ac4cd2fc88725859388c61373

