Custom logging handler for AskLora projects

Project description

LORA Logger

This package contains both a customised handler for saving logs into an Elasticsearch database and a factory for creating customised loggers that use that handler.

The customised handler forwards the logs to an existing logging service through our own Celery service. The logging service processes the logs and inserts them into the database, consolidating the logs from all projects into a single database.

Diagram

flowchart LR

    services["Services\n<small>All backend projects\nthat need the logging\nsystem</small>"]-->producer[[Producer]]
    subgraph "LoraLogger package"
    producer-->queue[Queue]
    end
    queue-->consumer[[Consumer]]
    subgraph "AskLora logger service"
    consumer-->database[(<small>ElasticSearch\nDatabase</small>)]
    end

How to use

Currently, this package exports a logging handler. Loggers using this handler will automatically send their records to the Elasticsearch server configured via the environment variables.

Package installation

There are two ways to install this package:

  • Install the package locally. First, build the project:
    poetry build
    
    Then you can install it using pip:
    pip install /path/to/logger/dist/loralogger-0.2.0-py3-none-any.whl
    
  • Install the package from PyPI:
    pip install loralogger
    

Using this package

First, set these environment variables:

BROKER_URL=<your-rabbitmq-endpoint>
RESULT_BACKEND=<your-redis-endpoint>
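For illustration, the variables might be exported in a shell before starting your service. The endpoint values below are placeholders, not real defaults; substitute your own RabbitMQ and Redis endpoints:

```shell
# Example values only -- replace with your actual endpoints.
export BROKER_URL="amqp://guest:guest@localhost:5672//"
export RESULT_BACKEND="redis://localhost:6379/0"

# Confirm the variables are visible to child processes
echo "$BROKER_URL"
echo "$RESULT_BACKEND"
```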

Then you can use the logger in two different ways:

  1. Use the logger factory

    • Import the logger factory:

      from loralogger import LoraLogger
      
    • Create a logger instance. The logger name should point to the Elasticsearch index you want to send the logs to, with "-logs" appended to it (this example, for instance, sends the logs to the backend-logs index):

      test_logger = LoraLogger.get_logger('backend')
      
    • Use the logger:

      test_logger.warning("Careful!")
      
  2. Attach the handler directly to your own logger instance:

    • Import the handler:

      from loralogger import LogToESHandler
      
    • Initialise a logging instance:

      import logging

      backend_logger = logging.getLogger("backend")
      
    • Create the handler instance. As above, the label should point to an existing Elasticsearch index:

      handler = LogToESHandler(label="backend")
      
    • Add the handler instance to the logger:

      backend_logger.addHandler(handler)
      
    • Use the logger

      backend_logger.info("This is an info")
      
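To show how a handler such as LogToESHandler plugs into Python's standard logging machinery, here is a runnable sketch using a stand-in handler. The CollectingHandler class is hypothetical and only stores records in memory; the real LogToESHandler would instead serialise each record and enqueue it for the Celery consumer:

```python
import logging

# Hypothetical stand-in for LogToESHandler: a custom logging.Handler
# whose emit() would normally enqueue the record for the Celery consumer.
# This sketch just collects the formatted records in memory.
class CollectingHandler(logging.Handler):
    def __init__(self, label):
        super().__init__()
        self.label = label   # would map to the "<label>-logs" Elasticsearch index
        self.records = []

    def emit(self, record):
        # A real handler would serialise `record` and send it to the queue;
        # here we only keep the formatted message.
        self.records.append(self.format(record))

backend_logger = logging.getLogger("backend")
backend_logger.setLevel(logging.INFO)

handler = CollectingHandler(label="backend")
handler.setFormatter(logging.Formatter("%(levelname)s:%(name)s:%(message)s"))
backend_logger.addHandler(handler)

backend_logger.info("This is an info")
print(handler.records[0])  # INFO:backend:This is an info
```

The key point is that any logging.Handler subclass only needs to implement emit(); once added with addHandler, it receives every record the logger accepts at or above its level.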

Project details


Download files

Download the file for your platform.

Source Distribution

loralogger-0.2.1.tar.gz (4.6 kB)

Uploaded Source

Built Distribution

loralogger-0.2.1-py3-none-any.whl (5.6 kB)

Uploaded Python 3

File details

Details for the file loralogger-0.2.1.tar.gz.

File metadata

  • Download URL: loralogger-0.2.1.tar.gz
  • Upload date:
  • Size: 4.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.0b2 CPython/3.8.10 Linux/5.15.0-41-generic

File hashes

Hashes for loralogger-0.2.1.tar.gz:

  • SHA256: 8db916e2c752f632e0d78308f78b262f28c5969f566bbea903df8506efd2def8
  • MD5: d670af9207e4b53c32b0e75910699a51
  • BLAKE2b-256: f24cdad98bec1ada835b2e3ae5e6836803a11b182fcf0df7d727c7007fd6e8df


File details

Details for the file loralogger-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: loralogger-0.2.1-py3-none-any.whl
  • Upload date:
  • Size: 5.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.0b2 CPython/3.8.10 Linux/5.15.0-41-generic

File hashes

Hashes for loralogger-0.2.1-py3-none-any.whl:

  • SHA256: eb7c74b262248d3c765005fcc28cf0e756c48844c314f04e10f0b437dfe1dd34
  • MD5: b2735ce89258351b71df49e0d6136f1e
  • BLAKE2b-256: 56c983405d9ba6b20fc530f0a48ed23c1448810e1b0154497a69e1039a7e9ec3

