
python3-cyberfusion-rabbitmq-consumer-log-server

Log server for RabbitMQ consumer.

Use the RabbitMQ consumer? The log server gives you an overview of RPC requests/responses - from all your RabbitMQ consumers - in one place.

Screenshots: RPC requests overview, RPC request detail

Install

PyPI

Run the following command to install the package from PyPI:

pip3 install python3-cyberfusion-rabbitmq-consumer-log-server

Debian

Run the following commands to build a Debian package:

mk-build-deps -i -t 'apt -o Debug::pkgProblemResolver=yes --no-install-recommends -y'
dpkg-buildpackage -us -uc

Configure

The log server consists of two parts: a web-based GUI, and an API.

API

The RabbitMQ consumer writes RPC requests/responses to the API. To let the consumer ship logs to the log server, see the consumer's README.

The consumer authenticates to the API with a token. Set it on the log server in /etc/rabbitmq-consumer-log-server/api_token (a regular text file), and configure the same token in the consumer.

You can generate a random API token using openssl: openssl rand -hex 32

You don't need to keep the API token confidential: it's only used by the RabbitMQ consumer to write logs. Abuse would therefore be a nuisance at worst, not a data breach.
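The token setup above can be sketched as a small shell snippet. It writes to a temporary file so it can run unprivileged; on a real installation, write to /etc/rabbitmq-consumer-log-server/api_token instead.

```shell
# Generate a random 64-character hex API token.
TOKEN_FILE=$(mktemp)  # on a real install: /etc/rabbitmq-consumer-log-server/api_token
openssl rand -hex 32 > "$TOKEN_FILE"

# The file is plain text; the consumer is configured with the same value.
cat "$TOKEN_FILE"
```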

GUI

Use the web-based GUI to view RPC requests/responses.

The web GUI uses basic authentication. Set a password in /etc/rabbitmq-consumer-log-server/gui_password (regular text file).

You can generate a random password using openssl: openssl rand -hex 32

You can use any basic authentication username - it's ignored.

Retention

By default, logs are kept for 45 days. To override this, set the environment variable KEEP_DAYS to the number of days.
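For example, when running under systemd (assuming the Debian package's service name), a drop-in file can override the retention; run `systemctl daemon-reload` and restart the service afterwards:

```ini
# /etc/systemd/system/rabbitmq-consumer-log-server.service.d/retention.conf
[Service]
Environment=KEEP_DAYS=90
```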

Periodic tasks

Using the Debian package? You don't have to do anything - periodic tasks are automatically executed using cron.

Otherwise, run the following commands periodically:

  • rabbitmq-consumer-log-server-purge-logs (every 24 hours)
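A minimal cron entry for this could look as follows (the path and the 03:00 schedule are illustrative; any time within the 24-hour window works):

```
# /etc/cron.d/rabbitmq-consumer-log-server
0 3 * * * root rabbitmq-consumer-log-server-purge-logs
```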

Usage

Run

Manually

  • Run migrations: alembic upgrade head
  • Run the app using an ASGI server such as Uvicorn.
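The two steps above can be sketched as shell commands. The module path passed to Uvicorn below is an assumption for illustration, not the package's documented entry point; check the installed package for the actual ASGI app location.

```shell
# Apply database migrations first.
alembic upgrade head

# Serve the app on the port the systemd unit uses (4194).
# "rabbitmq_consumer_log_server.app:app" is a hypothetical module path.
uvicorn rabbitmq_consumer_log_server.app:app --host :: --port 4194
```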

systemd

The server listens on port 4194, on all interfaces (:::4194).

systemctl start rabbitmq-consumer-log-server.service

Web GUI

Once the server is started, access the web GUI on /rpc-requests.
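For example, with curl (any username passes basic authentication; only the password is checked):

```shell
# The username ("anyuser") is ignored; only the password file's contents matter.
curl -u "anyuser:$(cat /etc/rabbitmq-consumer-log-server/gui_password)" \
  http://localhost:4194/rpc-requests
```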

SSL

Use a reverse proxy that terminates SSL/TLS, e.g. HAProxy.

When using a reverse proxy, 127.0.0.1 is trusted by default. To override this, set the FORWARDED_ALLOW_IPS environment variable. For more information, see the Uvicorn documentation (look for --proxy-headers and --forwarded-allow-ips).
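For example, to trust a proxy at 10.0.0.5 (the address is illustrative), set the variable in the server's environment before starting it:

```shell
# Trust forwarded headers from this proxy address instead of the 127.0.0.1 default.
export FORWARDED_ALLOW_IPS=10.0.0.5
```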

⚠️ Development

Developing the RabbitMQ consumer? Access the API documentation on /redoc (Redoc) and /docs (Swagger).

Developing the log server? After installing it and running migrations, seed your database with example data:

python3 seed_example_data.py

You can run the seeder multiple times to fill up your database with more data.
