Redis integration for taskiq

Project description

TaskIQ-Redis

Taskiq-redis is a plugin for taskiq that adds a new broker and result backend based on Redis.

Installation

To use this project, you must have the core taskiq library installed:

pip install taskiq

This project can be installed using pip:

pip install taskiq-redis

Usage

Let's look at an example with the Redis broker and the Redis async result backend:

# broker.py
import asyncio

from taskiq_redis import ListQueueBroker, RedisAsyncResultBackend

redis_async_result = RedisAsyncResultBackend(
    redis_url="redis://localhost:6379",
)

# Or you can use PubSubBroker if you need broadcasting
broker = ListQueueBroker(
    url="redis://localhost:6379",
    result_backend=redis_async_result,
)


@broker.task
async def best_task_ever() -> None:
    """Solve all problems in the world."""
    await asyncio.sleep(5.5)
    print("All problems are solved!")


async def main():
    task = await best_task_ever.kiq()
    print(await task.wait_result())


if __name__ == "__main__":
    asyncio.run(main())

Launch the workers:

taskiq worker broker:broker

Then run the main code:

python3 broker.py

PubSubBroker and ListQueueBroker configuration

We have two brokers with similar interfaces but different logic. The PubSubBroker uses Redis' pub/sub mechanism and is very powerful, but it executes every task on all workers, because PUBSUB broadcasts messages to all subscribers.

If you want each message to be processed only once, use ListQueueBroker. It uses Redis' LPUSH and BRPOP commands to deal with messages.
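
If broadcasting is what you need, the broker can be swapped in place. Below is a minimal sketch, assuming PubSubBroker accepts the same url and result_backend arguments as ListQueueBroker in the example above:

# pubsub_broker.py
from taskiq_redis import PubSubBroker, RedisAsyncResultBackend

redis_async_result = RedisAsyncResultBackend(
    redis_url="redis://localhost:6379",
)

# Every worker subscribed to the channel receives and executes each task.
broker = PubSubBroker(
    url="redis://localhost:6379",
    result_backend=redis_async_result,
)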

Broker parameters:

  • url - URL to Redis.
  • task_id_generator - custom task_id generator.
  • result_backend - custom result backend.
  • queue_name - name of the pub/sub channel in Redis.
  • max_connection_pool_size - maximum number of connections in the pool.
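
As a sketch of how these parameters fit together (the keyword names mirror the list above and my_task_id_generator is a hypothetical helper; check the signature of your installed version):

import uuid

from taskiq_redis import ListQueueBroker, RedisAsyncResultBackend


def my_task_id_generator() -> str:
    # Hypothetical generator: prefix task ids so they are easy to spot in Redis.
    return f"myapp-{uuid.uuid4().hex}"


broker = ListQueueBroker(
    url="redis://localhost:6379",
    task_id_generator=my_task_id_generator,
    result_backend=RedisAsyncResultBackend(redis_url="redis://localhost:6379"),
    queue_name="my_task_queue",
    max_connection_pool_size=10,
)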

RedisAsyncResultBackend configuration

RedisAsyncResultBackend parameters:

  • redis_url - URL to Redis.
  • keep_results - flag to keep results in Redis after reading (instead of removing them).
  • result_ex_time - expiration time in seconds (not set by default).
  • result_px_time - expiration time in milliseconds (not set by default).

IMPORTANT: It is highly recommended to set an expiration time in RedisAsyncResultBackend. To enable expiration, set either result_ex_time or result_px_time.

# First variant: results expire 1000 seconds after they are stored.
redis_async_result = RedisAsyncResultBackend(
    redis_url="redis://localhost:6379",
    result_ex_time=1000,
)

# Second variant: results expire 1000000 milliseconds (1000 seconds) after they are stored.
redis_async_result = RedisAsyncResultBackend(
    redis_url="redis://localhost:6379",
    result_px_time=1000000,
)
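
The keep_results flag can be combined with an expiration time. A sketch, assuming that without it results are removed from Redis once they are read:

# Keep results readable after the first fetch,
# but still expire them after one hour (3600 seconds).
redis_async_result = RedisAsyncResultBackend(
    redis_url="redis://localhost:6379",
    keep_results=True,
    result_ex_time=3600,
)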

Download files

Download the file for your platform.

Source Distribution

taskiq_redis-0.5.4.tar.gz (7.7 kB)

Uploaded Source

Built Distribution

taskiq_redis-0.5.4-py3-none-any.whl (8.5 kB)

Uploaded Python 3

File details

Details for the file taskiq_redis-0.5.4.tar.gz.

File metadata

  • Download URL: taskiq_redis-0.5.4.tar.gz
  • Upload date:
  • Size: 7.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.11.6 Linux/6.2.0-1016-azure

File hashes

Hashes for taskiq_redis-0.5.4.tar.gz:

  • SHA256: 26d9f661b35d0fe623bc1420cd645ce0ef4a64fbee5a5d252d38a2a4d340a36d
  • MD5: 0d3bd7846d43fdf4fa3310788dddce29
  • BLAKE2b-256: 4b7f17defd9fc1b245d54fdab5e2533872511cb7d809cc507cd566573feabaa1

File details

Details for the file taskiq_redis-0.5.4-py3-none-any.whl.

File metadata

  • Download URL: taskiq_redis-0.5.4-py3-none-any.whl
  • Upload date:
  • Size: 8.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.11.6 Linux/6.2.0-1016-azure

File hashes

Hashes for taskiq_redis-0.5.4-py3-none-any.whl:

  • SHA256: ad0cce210d1906955e1142fd539b061d7fe884fccb481a5196713aba60f5332f
  • MD5: 668fba8da7f6df0a8775148279fa61d9
  • BLAKE2b-256: 51b677cae006f01541fd9ba682376479a85dae9c0ccb7ffe8bef6e071696526a