
TaskIQ-Redis

Taskiq-redis is a plugin for taskiq that adds a new broker and result backend based on Redis.

Installation

To use this project you must have the core taskiq library installed:

pip install taskiq

Then taskiq-redis itself can be installed using pip:

pip install taskiq-redis

Usage

Let's look at an example with the Redis broker and the Redis async result backend:

# broker.py
import asyncio

from taskiq_redis import ListQueueBroker, RedisAsyncResultBackend

redis_async_result = RedisAsyncResultBackend(
    redis_url="redis://localhost:6379",
)

# Or you can use PubSubBroker if you need broadcasting
broker = ListQueueBroker(
    url="redis://localhost:6379",
    result_backend=redis_async_result,
)


@broker.task
async def best_task_ever() -> None:
    """Solve all problems in the world."""
    await asyncio.sleep(5.5)
    print("All problems are solved!")


async def main():
    task = await best_task_ever.kiq()
    print(await task.wait_result())


if __name__ == "__main__":
    asyncio.run(main())

Launch the workers:

taskiq worker broker:broker

Then run the main code:

python3 broker.py

PubSubBroker and ListQueueBroker configuration

We have two brokers with similar interfaces but different logic. The PubSubBroker uses Redis' Pub/Sub mechanism and is very powerful, but it executes every task on all workers, because PUBLISH broadcasts each message to all subscribers.

If you want each message to be processed only once, use ListQueueBroker instead. It uses Redis' LPUSH and BRPOP commands to pass messages, so each message is consumed by a single worker.
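To make the delivery difference concrete, here is a toy model (plain Python, no Redis involved) of the list-queue semantics: LPUSH inserts at the head of a list and BRPOP pops from the tail, so messages come out first-in-first-out and each one is handed to exactly one consumer.

```python
# Toy model of ListQueueBroker's delivery semantics.
# LPUSH adds to the head of a Redis list; BRPOP pops from the tail,
# so delivery is FIFO and each message reaches exactly one worker.
from collections import deque

queue: deque[str] = deque()


def lpush(msg: str) -> None:
    """Insert at the head of the list, like Redis LPUSH."""
    queue.appendleft(msg)


def brpop() -> str:
    """Pop from the tail of the list, like Redis BRPOP (minus the blocking)."""
    return queue.pop()


lpush("task-1")
lpush("task-2")
assert brpop() == "task-1"  # first pushed, first popped
assert brpop() == "task-2"
```

This is only an illustration of the ordering guarantee; the real broker, of course, talks to a Redis server and blocks on BRPOP until a message arrives.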

Brokers parameters:

  • url - URL of the Redis server.
  • task_id_generator - custom task_id generator.
  • result_backend - custom result backend.
  • queue_name - name of the pub/sub channel (for PubSubBroker) or list (for ListQueueBroker) in Redis.
  • max_connection_pool_size - maximum number of connections in the pool.
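Putting those parameters together, a ListQueueBroker could be configured like this (the queue name and pool size values here are illustrative, not defaults):

```python
from taskiq_redis import ListQueueBroker, RedisAsyncResultBackend

broker = ListQueueBroker(
    url="redis://localhost:6379",
    result_backend=RedisAsyncResultBackend(
        redis_url="redis://localhost:6379",
    ),
    queue_name="my_task_queue",    # illustrative list name
    max_connection_pool_size=10,   # illustrative pool size
)
```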

RedisAsyncResultBackend configuration

RedisAsyncResultBackend parameters:

  • redis_url - URL of the Redis server.
  • keep_results - flag to keep results in Redis after reading instead of removing them.
  • result_ex_time - result expiration time in seconds (not set by default).
  • result_px_time - result expiration time in milliseconds (not set by default).

IMPORTANT: It is highly recommended to set an expiration time in RedisAsyncResultBackend. To add expiration, set either result_ex_time or result_px_time.

# First variant: results expire after 1000 seconds
redis_async_result = RedisAsyncResultBackend(
    redis_url="redis://localhost:6379",
    result_ex_time=1000,
)

# Second variant: results expire after 1000000 milliseconds
redis_async_result = RedisAsyncResultBackend(
    redis_url="redis://localhost:6379",
    result_px_time=1000000,
)
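Note that the two variants above set the same effective lifetime, since result_px_time is expressed in milliseconds:

```python
result_ex_time = 1000      # seconds
result_px_time = 1000000   # milliseconds

# 1 second == 1000 milliseconds, so both keys expire after 1000 seconds.
assert result_px_time == result_ex_time * 1000
```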

