
Klein Queue

Module to abstract queues. Currently implements RabbitMQ.

Documentation

Generate API docs for a particular version with pdoc:

pip install pdoc3
pdoc --http :8080 src

Environment Variables

Env Variable                          Description
RABBITMQ_USERNAME
RABBITMQ_PASSWORD
RABBITMQ_HOST
RABBITMQ_PORT
RABBITMQ_VHOST                        Use a vhost instead of the default of /
RABBITMQ_SOCKET_TIMEOUT
RABBITMQ_HEARTBEAT
RABBITMQ_BLOCKED_CONNECTION_TIMEOUT
RABBITMQ_RETRY_DELAY
RABBITMQ_PUBLISHER
RABBITMQ_CONSUMER
RABBITMQ_ERROR
RABBITMQ_CREATE_QUEUE_ON_CONNECT      Whether to create the queue on connection
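As a sketch, the variables above can be exported before starting your application; the specific values here are illustrative, and this assumes the usual klein_config behaviour of environment variables taking precedence over file-based settings.

```shell
# Illustrative overrides for the connection settings (values are examples).
export RABBITMQ_HOST=rabbitmq.internal
export RABBITMQ_PORT=5672
export RABBITMQ_USERNAME=guest
export RABBITMQ_PASSWORD=guest
export RABBITMQ_VHOST=/staging
export RABBITMQ_CREATE_QUEUE_ON_CONNECT=true
```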

Example usage

Consumer

Define the following in a config.yaml file.

rabbitmq:
  host: [localhost]
  port: 5672
  username: guest
  password: guest
  heartbeat: 2
  exchange: 'test_exchange' # You can also define an exchange here if it is used by multiple consumers.
consumer:
  name: test.consumer
  queue: test
  auto_acknowledge: false
  concurrency: 2
  create_on_connect: true
  exchange: test_events_exchange
  exchange_type: headers
  exchange_bind_arguments:
    db: test_db
    coll: test_coll
    x-match: any # the default in RabbitMQ is all
publisher:
  queue: test
  create_on_connect: true
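The `exchange_bind_arguments` above bind the queue to a headers exchange, where `x-match` decides whether a message's headers must match any or all of the bound pairs. The sketch below is a simplified Python model of RabbitMQ's matching rule, not part of the klein_queue API:

```python
def headers_match(binding_args, message_headers):
    """Simplified model of RabbitMQ headers-exchange matching.

    binding_args: binding arguments such as the exchange_bind_arguments
    in the config above, including the optional x-match key.
    """
    x_match = binding_args.get("x-match", "all")  # RabbitMQ defaults to "all"
    pairs = {k: v for k, v in binding_args.items() if k != "x-match"}
    matches = [message_headers.get(k) == v for k, v in pairs.items()]
    return any(matches) if x_match == "any" else all(matches)


binding = {"db": "test_db", "coll": "test_coll", "x-match": "any"}
print(headers_match(binding, {"db": "test_db"}))   # True: "any" needs one matching pair
print(headers_match(binding, {"db": "other_db"}))  # False: no pair matches
```

With `x-match: all`, the same message would need both `db` and `coll` headers to match before it was routed to the queue.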

Add the following to your main.py file.

from klein_config.config import EnvironmentAwareConfig
from klein_queue.rabbitmq.consumer import Consumer

config = EnvironmentAwareConfig()       # Read from file specified with `--config`

def handler_fn(message, **kwargs):      # handler_fn receives messages from the queue.
    print(message)

consumer = Consumer(config, "consumer", handler_fn)
consumer.start()

Run the following command to start the consumer.

$ python main.py --config config.yaml

Publisher

Using the same config as the consumer, you can create a publisher. Add the following to a Python file and run it; it publishes a message to the queue. If the consumer is running, it will print the message to the console.

from klein_config.config import EnvironmentAwareConfig
from klein_queue.rabbitmq.publisher import Publisher

config = EnvironmentAwareConfig()       # Read from file specified with `--config`

publisher = Publisher(config, "publisher")
if __name__ == "__main__":
    publisher.start()                   # spawns the publisher thread
    publisher.add({'id': 'abc123'})     # sends a message

See the tests directory for more examples.

Python

Utilises Python 3.11.

Virtualenv

virtualenv -p python3.11 venv
source venv/bin/activate
pip install -r requirements.txt

Testing

docker-compose up
python -m pytest

License

This project is licensed under the terms of the Apache 2 license, which can be found in the repository as LICENSE.txt.

