A log handler for the Python logging module, emitting logs to Kafka topics.
Project description
Kafka Python Log Handler
A handler for the standard logging module which forwards log records to Kafka.
The current implementation is intentionally minimal to accommodate our needs, but additional functionality may follow as the parent project grows.
Install
This was developed with Python 3.6.7, using kafka-python 1.4.3 and Kafka 2.12.
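To use the handler itself, the package can be installed from PyPI; the distribution name below matches the files listed under File details:
$ pip install kafka-python-log-handler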
Development
Tests will try to connect to a Kafka instance via the default host and port. Optionally, setting KAFKA_HOST and KAFKA_PORT to the appropriate location on your machine will change this.
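For example, assuming the tests are run with pytest (the test runner is not specified here) and a broker is listening locally, that could look like:
$ KAFKA_HOST=localhost KAFKA_PORT=9092 pytest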
Install the dev requirements
$ pip install -r requirements-dev.txt
Install the pre-commit hook
$ pre-commit install
How to use
The only required argument when creating this handler is a topic to push the log messages to. Optionally, a key and/or partition may be provided. Any additional keyword arguments provided to the handler will be used to initialize the kafka.KafkaProducer.
Adding the handler to the Python logger is straightforward:
import logging
from kafka_handler import KafkaLogHandler
handler = KafkaLogHandler(topic="example_topic", key="example_key")  # Default parameters for the Kafka connection are used
logger = logging.getLogger()  # No name gives you the root logger
logger.setLevel("WARNING")
logger.addHandler(handler)
logger.warning("This will push this message to the 'example_topic' in Kafka.")
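A partition can also be pinned when creating the handler, as mentioned above; a minimal sketch, with the partition index purely illustrative:
pinned_handler = KafkaLogHandler(topic="example_topic", key="example_key", partition=0)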
Configuring Kafka Connection
By default each handler will create a kafka.KafkaProducer instance, passing every argument from its __init__(**kwargs) on to the KafkaProducer's instantiation. This means you can configure the connection as specifically as you'd like, but every argument should be provided with its keyword: Handler(host="localhost") instead of Handler("localhost"). All available configuration options are listed in the kafka-python documentation.
handler = KafkaLogHandler(topic="topic", bootstrap_servers="other_host:9092", retries=0)
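Since every extra keyword is forwarded, other standard kafka-python KafkaProducer options (not shown in this project's own examples) can be passed the same way; a hedged sketch with illustrative values:
secure_handler = KafkaLogHandler(
    topic="topic",
    bootstrap_servers="other_host:9092",
    client_id="log-handler",       # forwarded to KafkaProducer
    security_protocol="SSL",       # forwarded to KafkaProducer
)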
Configuring message logging
Every handler has an optional raw_logging option. Omitting it from the initialisation defaults it to False, meaning the message being logged is sent as-is. If you set it to True, the message content is emitted first, with the line number and finally the pathname appended.
raw_handler = KafkaLogHandler(topic="topic", raw_logging=True)
...
logging.info("Test message")
A plain handler (raw_logging=False) would emit the message as: Test message. The raw_handler, however, will emit a message like: Test message. - 2: /.../file.py
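Putting the elided setup above together, a minimal sketch (the logger name and level are illustrative, and the appended line number and path depend on the call site):
import logging
from kafka_handler import KafkaLogHandler
raw_handler = KafkaLogHandler(topic="topic", raw_logging=True)
logger = logging.getLogger("example")  # illustrative logger name
logger.setLevel("INFO")
logger.addHandler(raw_handler)
logger.info("Test message")  # emitted with the line number and pathname appended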
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file kafka-python-log-handler-0.0.1.dev12.tar.gz.
File metadata
- Download URL: kafka-python-log-handler-0.0.1.dev12.tar.gz
- Upload date:
- Size: 6.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0b748dffcea2a4ce4d1aaba1dded602925ade0a7168d3a096e67ffee7a9589a0
MD5 | 96b1e988696d21e9b6f1262ebcfa0b41
BLAKE2b-256 | 4c564c94f82255b834b2f4ee9eadd90c1c3d87f48c1505a11608849288c5699c
File details
Details for the file kafka_python_log_handler-0.0.1.dev12-py2.py3-none-any.whl.
File metadata
- Download URL: kafka_python_log_handler-0.0.1.dev12-py2.py3-none-any.whl
- Upload date:
- Size: 4.7 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | b9379b7ae47ba72ac4f46ff6a5d14266a4a8fa5c0ff244223e479996948ed9aa
MD5 | 61e9be0572b934887b018f90e1dde5d6
BLAKE2b-256 | a7ba96c58e14b9695e2e6e82ca926008085fb27a8dfc184267e855cb1c59453f