Custom logging handler for AskLora projects

LORA Logger

This package contains both a customised handler for saving logs into an Elasticsearch database, and a factory for creating customised loggers that can use that handler.

The customised handler forwards logs to an existing logging service through our own Celery service. That logging service handles the logs and writes them into the database. This is done to consolidate the logs from all projects into a single database.

Diagram

flowchart LR

    services["Services\n<small>All backend projects\nthat need the logging\nsystem</small>"]-->producer[[Producer]]
    subgraph "LoraLogger package"
    producer-->queue[Queue]
    end
    queue-->consumer[[Consumer]]
    subgraph "AskLora logger service"
    consumer-->database[(<small>ElasticSearch\nDatabase</small>)]
    end
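The producer/queue/consumer flow in the diagram can be sketched with Python's standard library. This is an illustrative stand-in only: a `queue.Queue` replaces RabbitMQ and a plain list replaces Elasticsearch; it is not the package's actual Celery code.

```python
import logging
import queue
import threading

log_queue = queue.Queue()  # stands in for the RabbitMQ queue
database = []              # stands in for the Elasticsearch database

class ProducerHandler(logging.Handler):
    """Producer side: push each formatted record onto the queue."""
    def emit(self, record):
        log_queue.put(self.format(record))

def consumer():
    """Consumer side: drain the queue and 'index' each record."""
    while True:
        message = log_queue.get()
        if message is None:  # sentinel to stop the worker
            break
        database.append(message)

worker = threading.Thread(target=consumer)
worker.start()

logger = logging.getLogger("demo-service")
logger.setLevel(logging.INFO)
logger.addHandler(ProducerHandler())
logger.info("This works!")

log_queue.put(None)  # stop the consumer
worker.join()
print(database)      # ['This works!']
```

The point of the indirection is that the service only pays the cost of a queue put; the slow database write happens on the consumer side.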

How to use

Currently, this package exports a logging handler. Loggers with this handler attached will automatically send their records to the Elasticsearch server configured through the environment variables.

Package installation

There are two ways to install this package:

  • Install the package locally. First, build the project:
    poetry build
    
    Then you can install it using pip:
    pip install /path/to/logger/dist/loralogger-0.3.0-py3-none-any.whl
    
    or, if you're using Poetry:
    poetry add /path/to/logger/dist/loralogger-0.3.0-py3-none-any.tar.gz
    
  • Install the package from PyPI:
    pip install loralogger
    

Using this package

First, set these environment variables:

# Set amqp backend
AMQP_BROKER=localhost
AMQP_PORT=5672
AMQP_USER=rabbitmq
AMQP_PASSWORD=rabbitmq

# set results backend
REDIS_HOST=localhost
REDIS_PORT=6379

# set sentinel mode
REDIS_SENTINEL=False  # or True
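In Python these variables can be read at startup with `os.getenv`. A minimal sketch, using the example values above as fallbacks (the variable names come from this README; using them as defaults in code is an assumption for illustration):

```python
import os

# Fall back to the local-development values from the example above.
amqp_broker = os.getenv("AMQP_BROKER", "localhost")
amqp_port = int(os.getenv("AMQP_PORT", "5672"))
amqp_user = os.getenv("AMQP_USER", "rabbitmq")
amqp_password = os.getenv("AMQP_PASSWORD", "rabbitmq")

redis_host = os.getenv("REDIS_HOST", "localhost")
redis_port = int(os.getenv("REDIS_PORT", "6379"))

# Environment variables are strings, so parse the boolean explicitly.
redis_sentinel = os.getenv("REDIS_SENTINEL", "False").lower() == "true"

broker_url = f"amqp://{amqp_user}:{amqp_password}@{amqp_broker}:{amqp_port}"
print(broker_url)  # with no overrides set: amqp://rabbitmq:rabbitmq@localhost:5672
```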

Then you can use the logger in two ways:

  1. Use dedicated logger instances for specific projects. These will automatically log to Elasticsearch (i.e. using the ESHandler)

    • Import the logger factory from loralogger:
    from loralogger import LoggerInstances, LoraLogger
    
    • Get the logger instance, using a LoggerInstances enum member as the label (preferred); you can also use any other label by passing a string:

      askloraxalpaca_logger = LoraLogger.get_logger(
         LoggerInstances.ASKLORAXALPACA,
         log_to_es=True,  # Send logs to Elasticsearch, defaults to True
         log_to_console=True,
      )
      
    • Use the logger instance

      askloraxalpaca_logger.info("This works!")
      
  2. Attach the handler directly to your own logger instance:

    • Import the handler:

      from loralogger import LogToESHandler
      
    • Initialise a logging instance:

      import logging
      backend_logger = logging.getLogger("backend")
      
    • Create the handler instance, supplying a label to it. The logs will be written to the logs-loralogger-<your-label> index, so make sure you have the privilege to write to or create that index.

      handler = LogToESHandler('ledger')
      
    • Add the handler instance to the logger:

      backend_logger.addHandler(handler)
      
    • And finally, use the logger

      backend_logger.info("This is an info")
      

Features

  • Send your logs to Elasticsearch by setting the log_to_es argument to True when initialising your logger, i.e.
    logger = LoraLogger.get_logger("backend", log_to_es=True)
    
  • Normally, sending logs to Elasticsearch goes through RabbitMQ so that it does not block the running operation, but you can skip the queue and send the logs directly through the Elasticsearch API by setting the skip_queue argument to True:
    logger = LoraLogger.get_logger(
      'ledger',
      log_to_es=True,
      skip_queue=True,
    )
    
  • You can use event and id to categorise your logs further in Elasticsearch:
    logger.info('Job started', event='calling-api', id='job-1')
    
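In plain stdlib logging, the closest analogue to the event and id keywords is the extra dict, which attaches the same fields to each log record. This is shown only as an analogy for what such per-record fields look like, not as loralogger's implementation:

```python
import logging

class CapturingHandler(logging.Handler):
    """Capture raw records so the extra fields can be inspected."""
    def __init__(self):
        super().__init__()
        self.records = []

    def emit(self, record):
        self.records.append(record)

logger = logging.getLogger("features-demo")
logger.setLevel(logging.INFO)
capture = CapturingHandler()
logger.addHandler(capture)

# Stdlib analogue of: logger.info('Job started', event='calling-api', id='job-1')
logger.info("Job started", extra={"event": "calling-api", "id": "job-1"})

record = capture.records[0]
print(record.event, record.id)  # calling-api job-1
```

Fields passed via extra become attributes on the LogRecord, which is how a handler can forward them as separate document fields for filtering in Elasticsearch.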
