Custom logging handler for AskLora projects
# LORA Logger

This package contains both a customised handler for saving logs into an Elasticsearch database and a factory for creating customised loggers that use that handler.

The customised handler forwards logs to an existing logging service through our own Celery service. That logging service processes the logs and writes them into the database, so that logs from every project are unified in a single database.
## Diagram

```mermaid
flowchart LR
    services["Services\n<small>All backend projects\nthat need the logging\nsystem</small>"]-->producer[[Producer]]
    subgraph "LoraLogger package"
        producer-->queue[Queue]
    end
    queue-->consumer[[Consumer]]
    subgraph "AskLora logger service"
        consumer-->database[(<small>ElasticSearch\nDatabase</small>)]
    end
```
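The producer/consumer flow in the diagram can be sketched with an in-memory queue standing in for RabbitMQ; the names below are illustrative, not the package's actual API:

```python
import queue

# In-memory stand-in for the RabbitMQ queue between producer and consumer.
log_queue = queue.Queue()

def produce(message):
    """Service side (LoraLogger package): push a log record onto the queue."""
    log_queue.put({"index": "backend-logs", "message": message})

def consume():
    """Logger-service side: pop a record and hand it to Elasticsearch."""
    record = log_queue.get()
    return f"indexed into {record['index']}: {record['message']}"

produce("service started")
print(consume())  # → indexed into backend-logs: service started
```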
## How to use

Currently, this package exports a logging handler. Loggers using this handler will automatically send their records to the Elasticsearch server configured through environment variables.
## Package installation

There are two ways to install this package:

- Install the package locally. First, build the project:

  ```shell
  poetry build
  ```

  then install the built wheel using pip:

  ```shell
  pip install /path/to/logger/dist/loralogger-0.2.0-py3-none-any.whl
  ```

- Install the package from PyPI:

  ```shell
  pip install loralogger
  ```
## Using this package

First, set these environment variables:

```shell
BROKER_URL=<your-rabbitmq-endpoint>
RESULT_BACKEND=<your-redis-endpoint>
```
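The same variables can also be set from Python before the package is imported; a minimal sketch, with illustrative endpoint values:

```python
import os

# Illustrative endpoints; replace with your actual RabbitMQ and Redis URLs.
os.environ["BROKER_URL"] = "amqp://guest:guest@localhost:5672//"
os.environ["RESULT_BACKEND"] = "redis://localhost:6379/0"
```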
Then you can use the logger in two different ways:
- Use the logger factory:

  - Import the logger factory:

    ```python
    from loralogger import LoraLogger
    ```

  - Create a logger instance. The logger name should point to the Elasticsearch index you want to send the logs to, with the word "-logs" appended to it (this example, for instance, sends the logs to the `backend-logs` index):

    ```python
    test_logger = LoraLogger.get_logger('backend')
    ```

  - Use the logger:

    ```python
    test_logger.warning("Careful!")
    ```

- Use the handler directly with your own logger instance:

  - Import the handler:

    ```python
    from loralogger import LogToESHandler
    ```

  - Initialise a logging instance:

    ```python
    import logging

    backend_logger = logging.getLogger("backend")
    ```

  - Create the handler instance. As above, the label should point to an existing Elasticsearch index:

    ```python
    handler = LogToESHandler(label="backend")
    ```

  - Add the handler instance to the logger:

    ```python
    backend_logger.addHandler(handler)
    ```

  - Use the logger:

    ```python
    backend_logger.info("This is an info")
    ```
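For a sense of what a handler like `LogToESHandler` might do internally, here is a hypothetical sketch (not the package's actual implementation): a `logging.Handler` subclass that turns each record into a plain dict and hands it to a transport callable, where the real package would enqueue a Celery task instead.

```python
import logging

class ESForwardingHandler(logging.Handler):
    """Hypothetical sketch: forward each record to a transport callable."""

    def __init__(self, label, transport):
        super().__init__()
        self.index = f"{label}-logs"  # mirrors the "-logs" index naming above
        self.transport = transport    # stand-in for a Celery task send

    def emit(self, record):
        # Serialise the record into a plain dict the consumer can index.
        self.transport({
            "index": self.index,
            "level": record.levelname,
            "message": record.getMessage(),
        })

# Demo with an in-memory transport instead of a real queue.
sent = []
logger = logging.getLogger("demo-backend")
logger.addHandler(ESForwardingHandler("backend", sent.append))
logger.warning("Careful!")
print(sent[0]["index"], sent[0]["level"])  # → backend-logs WARNING
```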
## Notes

- If the pip installation fails, see this workaround: https://github.com/celery/librabbitmq/issues/131#issuecomment-661884151
## File details

Details for the file `loralogger-0.2.1.tar.gz`.

### File metadata

- Download URL: loralogger-0.2.1.tar.gz
- Upload date:
- Size: 4.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.2.0b2 CPython/3.8.10 Linux/5.15.0-41-generic

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 8db916e2c752f632e0d78308f78b262f28c5969f566bbea903df8506efd2def8 |
| MD5 | d670af9207e4b53c32b0e75910699a51 |
| BLAKE2b-256 | f24cdad98bec1ada835b2e3ae5e6836803a11b182fcf0df7d727c7007fd6e8df |
## File details

Details for the file `loralogger-0.2.1-py3-none-any.whl`.

### File metadata

- Download URL: loralogger-0.2.1-py3-none-any.whl
- Upload date:
- Size: 5.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.2.0b2 CPython/3.8.10 Linux/5.15.0-41-generic

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | eb7c74b262248d3c765005fcc28cf0e756c48844c314f04e10f0b437dfe1dd34 |
| MD5 | b2735ce89258351b71df49e0d6136f1e |
| BLAKE2b-256 | 56c983405d9ba6b20fc530f0a48ed23c1448810e1b0154497a69e1039a7e9ec3 |