
Custom logging handler for AskLora projects


LORA Logger

This package contains both a customised handler that saves logs into an Elasticsearch database, and a factory for creating customised loggers that can use that handler.

The customised handler forwards the logs to an existing logging service through our own Celery service. That logging service processes the logs and writes them into the database, consolidating the logs from all projects into a single database.
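
The forwarding pattern above can be sketched with the standard library alone. This is a hypothetical illustration, not loralogger's actual implementation: an in-process `queue.Queue` stands in for the Celery/RabbitMQ transport, and the consumer side (which would write to Elasticsearch) is omitted.

```python
import logging
import queue

# A handler that enqueues records instead of writing them directly.
# In loralogger the real handler publishes a task for the logging
# service to consume; here a plain queue plays that role.
class QueueForwardingHandler(logging.Handler):
    def __init__(self, q):
        super().__init__()
        self.queue = q

    def emit(self, record):
        # Format the record and hand it off; the caller is not blocked
        # by any slow downstream work.
        self.queue.put(self.format(record))

log_queue = queue.Queue()
logger = logging.getLogger("demo")
logger.setLevel(logging.INFO)
logger.addHandler(QueueForwardingHandler(log_queue))

logger.info("hello")
print(log_queue.get_nowait())  # prints "hello"
```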

Diagram

flowchart LR

    services["Services\n<small>All backend projects\nthat need the logging\nsystem</small>"]-->producer[[Producer]]
    subgraph "LoraLogger package"
    producer-->queue[Queue]
    end
    queue-->consumer[[Consumer]]
    subgraph "AskLora logger service"
    consumer-->database[(<small>ElasticSearch\nDatabase</small>)]
    end

How to use

Currently, this package exports a logging handler. Loggers with this handler attached will automatically send their records to the Elasticsearch server configured through the environment variables.

Package installation

There are two ways to install this package:

  • Install the package locally. First, build the project:
    poetry build
    
    Then install it using pip:
    pip install /path/to/logger/dist/loralogger-0.4.1-py3-none-any.whl
    
    Or, if you're using Poetry:
    poetry add /path/to/logger/dist/loralogger-0.4.1-py3-none-any.tar.gz
    
  • Install the package from PyPI:
    pip install loralogger
    

Using this package

First, set these environment variables:

# Set amqp backend
AMQP_BROKER=localhost
AMQP_PORT=5672
AMQP_USER=rabbitmq
AMQP_PASSWORD=rabbitmq

# set results backend
REDIS_HOST=localhost
REDIS_PORT=6379

# set sentinel mode
REDIS_SENTINEL=False  # or True
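
As a rough illustration, AMQP variables like the ones above conventionally combine into a standard broker URL. How loralogger assembles them internally is an assumption; this sketch only shows the usual convention:

```python
# Hypothetical assembly of a Celery broker URL from the variables above.
# A dict stands in for os.environ so the example is deterministic.
env = {
    "AMQP_BROKER": "localhost",
    "AMQP_PORT": "5672",
    "AMQP_USER": "rabbitmq",
    "AMQP_PASSWORD": "rabbitmq",
}

# Conventional amqp:// URL: amqp://user:password@host:port//
broker_url = "amqp://{AMQP_USER}:{AMQP_PASSWORD}@{AMQP_BROKER}:{AMQP_PORT}//".format(**env)
print(broker_url)  # prints "amqp://rabbitmq:rabbitmq@localhost:5672//"
```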

Then you can use the logger in two ways:

  1. Use dedicated logger instances for specific projects. These will automatically log to Elasticsearch (i.e. using the ESHandler):

    • Import the logger factory from loralogger:
    from loralogger import LoggerInstances, LoraLogger
    
    • Get the logger instance, using the LoggerInstances enum as the label (preferred); you can also use another label by passing a string:

      askloraxalpaca_logger = LoraLogger.get_logger(
         LoggerInstances.ASKLORAXALPACA,
         log_to_es=True,  # Send logs to Elasticsearch, defaults to True
         log_to_console=True,
      )
      
    • Use the logger instance

      askloraxalpaca_logger.info("This works!")
      
  2. Attach the handler directly to your own logger instance:

    • Import the handler:

      from loralogger import LogToESHandler
      
    • Initialise a logging instance:

      import logging
      backend_logger = logging.getLogger("backend")
      
    • Create the handler instance, supplying a label. The logs will be written to logs-loralogger-<your-label>; make sure you have the privileges to write to or create that index:

      handler = LogToESHandler('ledger')
      
    • Add the handler instance to the logger:

      backend_logger.addHandler(handler)
      
    • Finally, use the logger:

      backend_logger.info("This is an info")
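
The steps above can be sketched end to end with a stand-in handler. `FakeESHandler` below is hypothetical (it emulates `LogToESHandler` with a plain dict in place of Elasticsearch), but it shows the label-to-index mapping the real handler uses:

```python
import logging

# Stand-in for LogToESHandler: a handler labelled "ledger" collects
# records under the index name "logs-loralogger-ledger".
class FakeESHandler(logging.Handler):
    def __init__(self, label):
        super().__init__()
        self.index = "logs-loralogger-" + label
        self.store = {}  # index name -> list of formatted records

    def emit(self, record):
        self.store.setdefault(self.index, []).append(self.format(record))

backend_logger = logging.getLogger("backend-demo")
backend_logger.setLevel(logging.INFO)
handler = FakeESHandler("ledger")
backend_logger.addHandler(handler)

backend_logger.info("This is an info")
print(handler.store)  # prints {'logs-loralogger-ledger': ['This is an info']}
```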
      

Features

  • Send your logs to Elasticsearch by setting the log_to_es argument to True when initialising your logger, e.g.
    logger = LoraLogger.get_logger("backend", log_to_es=True)
    
  • Normally, sending logs to Elasticsearch goes through RabbitMQ so that it does not block the running operation, but you can skip the queue and send logs directly through the Elasticsearch API by setting the skip_queue argument to True:
    logger = LoraLogger.get_logger(
      'ledger',
      log_to_es=True,
      skip_queue=True,
    )
    
  • You can use event and id to categorise your logs further in Elasticsearch:
    logger.info('Job started', event='calling-api', id='job-1')
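
A stdlib analogue of the event/id fields, for intuition: standard logging carries custom attributes on a record via `extra`, and a formatter can surface them. How loralogger maps event and id into Elasticsearch documents is an assumption here.

```python
import logging

logger = logging.getLogger("features-demo")

# Build a record carrying the custom "event" and "id" attributes.
record = logger.makeRecord(
    logger.name, logging.INFO, "demo.py", 0,
    "Job started", (), None,
    extra={"event": "calling-api", "id": "job-1"},
)

# A formatter can reference the custom attributes by name.
formatted = logging.Formatter("%(event)s %(id)s %(message)s").format(record)
print(formatted)  # prints "calling-api job-1 Job started"
```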
    
