Klein Queue

Module to abstract message queues. Currently implements RabbitMQ.

Documentation

Generate API docs for a particular version with pdoc:

pip install pdoc3
pdoc --http :8080 src

Environment Variables

Env Variable                          Description
RABBITMQ_USERNAME                     Username for the RabbitMQ connection
RABBITMQ_PASSWORD                     Password for the RabbitMQ connection
RABBITMQ_HOST                         RabbitMQ host
RABBITMQ_PORT                         RabbitMQ port
RABBITMQ_VHOST                        Use a vhost instead of the default of /
RABBITMQ_SOCKET_TIMEOUT               Socket timeout for the connection
RABBITMQ_HEARTBEAT                    Heartbeat interval
RABBITMQ_BLOCKED_CONNECTION_TIMEOUT   Timeout for blocked connections
RABBITMQ_RETRY_DELAY                  Delay before retrying a connection
RABBITMQ_PUBLISHER                    Publisher configuration
RABBITMQ_CONSUMER                     Consumer configuration
RABBITMQ_ERROR                        Error queue configuration
RABBITMQ_CREATE_QUEUE_ON_CONNECT      Whether to create the queue at connection
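Since klein_config is environment-aware, these variables can be exported before launching a consumer or publisher instead of (or alongside) a config file. A minimal sketch, with illustrative values; the exact precedence between environment variables and config.yaml depends on klein_config:

```shell
# Sketch: configure the RabbitMQ connection via environment variables.
# Values below are examples only; names come from the table above.
export RABBITMQ_HOST=localhost
export RABBITMQ_PORT=5672
export RABBITMQ_USERNAME=guest
export RABBITMQ_PASSWORD=guest
export RABBITMQ_VHOST=/
export RABBITMQ_CREATE_QUEUE_ON_CONNECT=true
```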

Example usage

Consumer

Define the following in a config.yaml file.

rabbitmq:
  host: [localhost]
  port: 5672
  username: guest
  password: guest
  heartbeat: 2
  exchange: 'test_exchange' # You can also define an exchange here if it is used by multiple consumers.
consumer:
  name: test.consumer
  queue: test
  auto_acknowledge: false
  concurrency: 2
  create_on_connect: true
  exchange: test_events_exchange
  exchange_type: headers
  exchange_bind_arguments:
    db: test_db
    coll: test_coll
    x-match: any # default in RabbitMQ is all
publisher:
  queue: test
  create_on_connect: true

Add the following to your main.py file.

from klein_config.config import EnvironmentAwareConfig
from klein_queue.rabbitmq.consumer import Consumer

config = EnvironmentAwareConfig()       # Read from file specified with `--config`
def handler_fn(message, **kwargs):      # handler_fn receives messages from the queue.
    print(message)
consumer = Consumer(config, "consumer", handler_fn)
consumer.start()

Run the following command to start the consumer.

$ python main.py --config config.yaml
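The handler is a plain callable, so it can be developed and tested without a running broker. A minimal sketch of a handler that tolerates both decoded and raw JSON message bodies (the extra kwargs the consumer passes vary by klein_queue version, so they are simply ignored here):

```python
import json

def handler_fn(message, **kwargs):
    # The body may arrive as an already-decoded dict or as a raw JSON string.
    doc = message if isinstance(message, dict) else json.loads(message)
    # Do the actual work here; returning the id makes the handler easy to test.
    print(f"handled message {doc.get('id')}")
    return doc.get('id')
```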

Publisher

Using the same config as the consumer, you can create a publisher. Add the following to a Python file and run it to publish a message to the queue. If the consumer is running, it will print the message to the console.

from klein_config.config import EnvironmentAwareConfig
from klein_queue.rabbitmq.publisher import Publisher

config = EnvironmentAwareConfig()       # Read from file specified with `--config`

publisher = Publisher(config, "publisher")
if __name__ == "__main__":
    publisher.start()                   # spawns the publisher thread
    publisher.add({'id': 'abc123'})     # sends a message

See the tests directory for more examples.

Python

Utilises Python 3.11.

Virtualenv

virtualenv -p python3.11 venv
source venv/bin/activate
pip install -r requirements.txt

Testing

docker-compose up
python -m pytest

License

This project is licensed under the terms of the Apache 2 license, which can be found in the repository as LICENSE.txt.

