
brq

Project description

This project is inspired by arq. It is not meant to divide the community; I needed a Redis queue built on Redis Streams for work and decided to open source it.

You may also want to consider arq, which positions itself more as a library: https://github.com/samuelcolvin/arq/issues/437

Architecture diagram: Architecture.png

Prerequisites

Redis >= 6.2, tested with the latest Redis 6 and 7 Docker images
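
If no Redis instance is handy, a throwaway Docker container is enough for local testing (a sketch, assuming Docker is installed):

docker run --rm -p 6379:6379 redis:7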

Install

pip install brq

Features

See the examples directory for runnable examples.

  • Defer job (see the sketch after this list)
  • Automatic job retry
  • Dead queue
  • Multiple consumers
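
A minimal sketch of the "Defer job" feature, deferring the echo job shown below by about a minute. The defer_seconds keyword here is hypothetical and used only for illustration, and the snippet assumes get_redis_url's remaining parameters default to a local Redis; check the examples directory for the actual deferral API.

import os

from brq.producer import Producer
from brq.tools import get_redis_client, get_redis_url


async def main():
    redis_url = get_redis_url(host=os.getenv("REDIS_HOST", "localhost"))
    async with get_redis_client(redis_url) as async_redis_client:
        # NOTE: defer_seconds is a hypothetical keyword used for illustration only;
        # see the examples directory for the actual deferral parameters.
        await Producer(async_redis_client).run_job("echo", ["hello, later"], defer_seconds=60)


if __name__ == "__main__":
    import asyncio

    asyncio.run(main())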

Echo job overview

Producer

import os

from brq.producer import Producer
from brq.tools import get_redis_client, get_redis_url


async def main():
    redis_url = get_redis_url(
        host=os.getenv("REDIS_HOST", "localhost"),
        port=int(os.getenv("REDIS_PORT", 6379)),
        db=int(os.getenv("REDIS_DB", 0)),
        cluster=bool(os.getenv("REDIS_CLUSTER", "false") in ["True", "true", "1"]),
        tls=bool(os.getenv("REDIS_TLS", "false") in ["True", "true", "1"]),
        username=os.getenv("REDIS_USERNAME", ""),
        password=os.getenv("REDIS_PASSWORD", ""),
    )
    async with get_redis_client(redis_url) as async_redis_client:
        await Producer(async_redis_client).run_job("echo", ["hello"])


if __name__ == "__main__":
    import asyncio

    asyncio.run(main())

Consumer

import os

from brq.consumer import Consumer
from brq.daemon import Daemon
from brq.tools import get_redis_client, get_redis_url


async def echo(message):
    print(message)


async def main():
    redis_url = get_redis_url(
        host=os.getenv("REDIS_HOST", "localhost"),
        port=int(os.getenv("REDIS_PORT", 6379)),
        db=int(os.getenv("REDIS_DB", 0)),
        cluster=bool(os.getenv("REDIS_CLUSTER", "false") in ["True", "true", "1"]),
        tls=bool(os.getenv("REDIS_TLS", "false") in ["True", "true", "1"]),
        username=os.getenv("REDIS_USERNAME", ""),
        password=os.getenv("REDIS_PASSWORD", ""),
    )
    async with get_redis_client(redis_url) as async_redis_client:
        daemon = Daemon(Consumer(async_redis_client, echo))
        await daemon.run_forever()


if __name__ == "__main__":
    import asyncio

    asyncio.run(main())
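
To smoke-test the producer and consumer together without two terminals, the daemon can run as a background task in the same process. This is only a local sketch: it assumes a Redis instance reachable with the defaults and that run_forever can be cancelled like an ordinary coroutine.

import asyncio
import contextlib
import os

from brq.consumer import Consumer
from brq.daemon import Daemon
from brq.producer import Producer
from brq.tools import get_redis_client, get_redis_url


async def echo(message):
    print(message)


async def main():
    redis_url = get_redis_url(host=os.getenv("REDIS_HOST", "localhost"))
    async with get_redis_client(redis_url) as async_redis_client:
        # Start the consumer daemon as a background task in this process.
        daemon_task = asyncio.create_task(
            Daemon(Consumer(async_redis_client, echo)).run_forever()
        )
        # Enqueue one job and give the consumer a moment to pick it up.
        await Producer(async_redis_client).run_job("echo", ["hello"])
        await asyncio.sleep(5)
        daemon_task.cancel()
        with contextlib.suppress(asyncio.CancelledError):
            await daemon_task


if __name__ == "__main__":
    asyncio.run(main())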

Develop

Install pre-commit before committing

pip install pre-commit
pre-commit install

Install the package locally

pip install -e .[test]

Run the unit tests before opening a PR

pytest -v

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

brq-0.1.4.tar.gz (70.8 kB)

Uploaded Source

Built Distribution

brq-0.1.4-py3-none-any.whl (11.8 kB)

Uploaded Python 3

File details

Details for the file brq-0.1.4.tar.gz.

File metadata

  • Download URL: brq-0.1.4.tar.gz
  • Upload date:
  • Size: 70.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for brq-0.1.4.tar.gz
  • SHA256: 01d4a51b8507c09ae2c9f75d6498e8da0ad9bf09db52f329598a64a4d19651e8
  • MD5: 2c6dde485737c5a86d0dbef8f4ab1c6c
  • BLAKE2b-256: bb334fd8d88ca9bfa9d7c4b51819d1564ad822855c69eb02c9a86cb118a428cf

See more details on using hashes here.

File details

Details for the file brq-0.1.4-py3-none-any.whl.

File metadata

  • Download URL: brq-0.1.4-py3-none-any.whl
  • Upload date:
  • Size: 11.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for brq-0.1.4-py3-none-any.whl
  • SHA256: 3ed65fe7f176340fc290f3a81e6e79d6a1e9a25c583c43eacf92ef5f08787538
  • MD5: d438a4840b54c8ad9ed5aa1f4733b458
  • BLAKE2b-256: 9fef019b1891f42034a5366a74ffeed338b50cabf93e6e9dab3abce946734ba6

See more details on using hashes here.
