Custom logging handler for AskLora projects

Project description

LORA Logger

This package contains both a customised handler that saves logs into an Elasticsearch database, and a factory for creating customised loggers that use that handler.

The customised handler forwards logs through our own Celery service to an existing logging service, which processes the logs and writes them into the database. This consolidates the logs from all services into a single database.

Diagram

flowchart LR

    services["Services\n<small>All backend projects\nthat need the logging\nsystem</small>"]-->producer[[Producer]]
    subgraph "LoraLogger package"
    producer-->queue[Queue]
    end
    queue-->consumer[[Consumer]]
    subgraph "AskLora logger service"
    consumer-->database[(<small>ElasticSearch\nDatabase</small>)]
    end
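The producer side of the diagram can be sketched as a logging handler that enqueues records instead of writing them directly. This is an illustrative sketch only: `QueueForwardingHandler` is a hypothetical name, not part of the loralogger API, and a `queue.Queue` stands in for the Celery/AMQP broker.

```python
import logging
import queue

# Hypothetical sketch (QueueForwardingHandler is not part of loralogger):
# a handler that serialises each record and enqueues it for a consumer,
# with queue.Queue standing in for the Celery/AMQP broker.
class QueueForwardingHandler(logging.Handler):
    def __init__(self, log_queue: "queue.Queue[dict]") -> None:
        super().__init__()
        self.log_queue = log_queue

    def emit(self, record: logging.LogRecord) -> None:
        # Reduce the record to a plain dict, roughly the shape a broker
        # message might take before the consumer indexes it.
        self.log_queue.put({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

log_queue: "queue.Queue[dict]" = queue.Queue()
logger = logging.getLogger("sketch")
logger.setLevel(logging.INFO)
logger.propagate = False  # keep the sketch's records off the root logger
logger.addHandler(QueueForwardingHandler(log_queue))

logger.warning("Careful!")
item = log_queue.get_nowait()
```

In the real package the consumer end of the queue, not the handler, talks to Elasticsearch, which is what keeps the services decoupled from the database.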

How to use

Currently, this package exports a logging handler. Loggers with this handler attached will automatically send their records to the Elasticsearch server configured through the environment variables.

Package installation

There are two ways to install this package:

  • Install the package locally. First, build the project:
    poetry build
    
    then install the built wheel using pip:
    pip install /path/to/logger/dist/loralogger-0.2.2-py3-none-any.whl
    
  • Install the package from PyPI:
    pip install loralogger
    

Using this package

First, set these environment variables:

# Set amqp backend
AMQP_BROKER=localhost
AMQP_PORT=5672
AMQP_USER=rabbitmq
AMQP_PASSWORD=rabbitmq

# set results backend
REDIS_HOST=localhost
REDIS_PORT=6379
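For a quick local experiment, the same variables can also be set from Python before the logger is created. These values are local stand-ins; real deployments would provide actual broker and Redis credentials through the environment itself (e.g. a .env file or container config):

```python
import os

# Local-experiment stand-ins for the variables above; replace with real
# broker/redis credentials in an actual deployment.
os.environ["AMQP_BROKER"] = "localhost"
os.environ["AMQP_PORT"] = "5672"
os.environ["AMQP_USER"] = "rabbitmq"
os.environ["AMQP_PASSWORD"] = "rabbitmq"
os.environ["REDIS_HOST"] = "localhost"
os.environ["REDIS_PORT"] = "6379"
```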

Then you can use the logger in two different ways:

  1. Use the logger factory

    • Import the logger factory:

      from loralogger import LoraLogger
      
    • Create a logger instance. The logger name should point to the Elasticsearch index you want the logs sent to, with "-logs" appended; the example below, for instance, sends logs to the backend-logs index:

      test_logger = LoraLogger.get_logger('backend', log_to_es=True)  # log_to_es must be True or the logs won't be sent to Elasticsearch
      
    • Use the logger:

      test_logger.warning("Careful!")
      
  2. Use the handler directly with your own logger instance:

    • Import the handler:

      from loralogger import LogToESHandler
      
    • Initialise a logging instance (this requires the standard library logging module):

      import logging

      backend_logger = logging.getLogger("backend")
      
    • Create the handler instance. As above, the label should point to an existing Elasticsearch index:

      handler = LogToESHandler(label="backend")
      
    • Add the handler instance to the logger:

      backend_logger.addHandler(handler)
      
    • Use the logger

      backend_logger.info("This is an info")
      

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

loralogger-0.2.3.tar.gz (4.8 kB)

Uploaded Source

Built Distribution

loralogger-0.2.3-py3-none-any.whl (5.9 kB)

Uploaded Python 3

File details

Details for the file loralogger-0.2.3.tar.gz.

File metadata

  • Download URL: loralogger-0.2.3.tar.gz
  • Upload date:
  • Size: 4.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.0b3 CPython/3.9.13 Linux/5.15.0-41-generic

File hashes

Hashes for loralogger-0.2.3.tar.gz
Algorithm Hash digest
SHA256 594881c989dcc2afd8f5e7a027ed00d7be706c22a230a437f449a71809bef804
MD5 21d6faabbd6521816cbdfe3fe3e51d49
BLAKE2b-256 f87b30de5a45919467328f95eb4a0704a4d6ac54080d18a1319bff8b30d342f5

See more details on using hashes here.

File details

Details for the file loralogger-0.2.3-py3-none-any.whl.

File metadata

  • Download URL: loralogger-0.2.3-py3-none-any.whl
  • Upload date:
  • Size: 5.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.0b3 CPython/3.9.13 Linux/5.15.0-41-generic

File hashes

Hashes for loralogger-0.2.3-py3-none-any.whl
Algorithm Hash digest
SHA256 f16d22af457f2edca98234caa02691efce4fac779704f62f95fc89cb81749b55
MD5 e81b3fdd396db359a0197a3280d22c8f
BLAKE2b-256 42c6e2ecff820900926aac25abfa2a3bea3df07c8355ebe95ff8ca4444f7d4d2

See more details on using hashes here.
