Package for logging to Azure Logs Ingestion API using decorators and regular logging statements.

Project description

Introduction

This logs_ingestion package provides several methods for logging data into Azure Monitor using the Logs Ingestion API.

The following attributes are part of the logging, with the corresponding Python attribute names in parentheses:

  • TimeGenerated (time_generated): the datetime the logging entry was created (required)
  • Message (message): the message of the log entry (optional)
  • Duration (duration): the duration of the function in case the decorator is used (optional)
  • Status (status): the status of the system (optional)
  • RunId (run_id): the run ID of the flow that is being processed (required)
  • Tag (tag): the tag for grouping log entries (required)

The attribute TimeGenerated is automatically set for you. The RunId and Tag are part of the configuration of the logger you need to create and only need to be set once per logger.
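Taken together, the fields above can be pictured as a simple record type. The following is only an illustrative sketch of that shape, not the package's actual LogsRecord definition:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class SketchRecord:
    """Illustrative shape of a log entry; fields mirror the list above."""
    run_id: str                       # required, set once per logger
    tag: str                          # required, set once per logger
    time_generated: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))  # set automatically
    message: Optional[str] = None     # optional free-text message
    duration: Optional[float] = None  # optional, filled in by the decorator
    status: Optional[str] = None      # optional system status

record = SketchRecord(run_id="42", tag="logger1", message="hello")
```

Only message, duration, and status vary per entry; the rest is either automatic or fixed per logger.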

Usage

You'll need a logger to perform the actual logging:

from logs_ingestion import get_logger  # assumed import path

logger: Logger = get_logger(__name__, run_id="42", tag="logger1")

With the logger instantiation you'll also set the RunId and Tag to be used in all logging entries as generated through this logger.

The first method for logging information is by using a decorator in your Python code:

@time_and_log(logger=logger, message="bla", status="timed")
def my_function():
    pass

Whenever my_function() is called, a log entry is created when the function ends, and it automatically includes the duration of the function call. This is a convenient way to monitor the performance of functions and possible drift in processing times. With the message and status arguments you can add additional details to the logged entry.
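Conceptually, the decorator measures the elapsed time around the call and logs it afterwards. Here is a minimal stdlib-only sketch of that idea, not the package's actual implementation:

```python
import functools
import logging
import time

def time_and_log_sketch(logger, message="", status=""):
    """Sketch of a timing decorator: logs the duration when the
    wrapped function returns (or raises)."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                duration = time.perf_counter() - start
                logger.info("%s status=%s duration=%.3fs",
                            message, status, duration)
        return wrapper
    return decorator

logger = logging.getLogger("demo")

@time_and_log_sketch(logger, message="bla", status="timed")
def my_function():
    return 42
```

The real decorator additionally packages the duration, message, and status into a LogsRecord and sends it to the Logs Ingestion API.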

The second method is by calling the usual logging lines, for example:

from logs_ingestion.logs_record import LogsRecord
logger.warning(message='testing azure logging', record=LogsRecord(
               status="OK",
               duration=1.23))

The arguments are:

  • message: the text of the log entry
  • record: the record(s) to be logged

The record argument must be either a LogsRecord or a list of LogsRecords. By using a list, you can log a whole batch of log records in one call. The run_id and tag from the logger are pushed down to the individual log messages.
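The push-down behaviour can be sketched as stamping the logger's fixed fields onto every record in the batch. This uses plain dicts for illustration (hypothetical helper, not the package's code):

```python
def push_down(records, run_id, tag):
    """Return copies of the record dicts with run_id and tag filled in,
    mimicking how the logger stamps its fixed fields onto each record."""
    return [{**rec, "run_id": run_id, "tag": tag} for rec in records]

batch = [
    {"message": "step 1 done", "status": "OK", "duration": 1.23},
    {"message": "step 2 done", "status": "OK", "duration": 0.87},
]
stamped = push_down(batch, run_id="42", tag="logger1")
```

This is why run_id and tag only need to be set once, at logger creation, regardless of how many records a single call sends.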

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

logs_ingestion-0.2.5.tar.gz (4.2 kB)

Uploaded Source

Built Distribution

logs_ingestion-0.2.5-py3-none-any.whl (5.5 kB)

Uploaded Python 3

File details

Details for the file logs_ingestion-0.2.5.tar.gz.

File metadata

  • Download URL: logs_ingestion-0.2.5.tar.gz
  • Upload date:
  • Size: 4.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.3 Darwin/23.5.0

File hashes

Hashes for logs_ingestion-0.2.5.tar.gz
Algorithm Hash digest
SHA256 30d3a8f436fec5204655b260d8d8f91ff8b027e8fd227e0653e990948962069e
MD5 af86a9003d81fe5cc3e5e00766a7d514
BLAKE2b-256 210f07f9678dabaa7869acdc257792b3b3598e88f40be14e102761e3db381a07

See more details on using hashes here.

File details

Details for the file logs_ingestion-0.2.5-py3-none-any.whl.

File metadata

  • Download URL: logs_ingestion-0.2.5-py3-none-any.whl
  • Upload date:
  • Size: 5.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.3 Darwin/23.5.0

File hashes

Hashes for logs_ingestion-0.2.5-py3-none-any.whl
Algorithm Hash digest
SHA256 3054f757484d18052b2384dee299be31ce958abb046d4b28e5be540f45f97da9
MD5 04f52d4ae26c6c8967a7334c3895ae61
BLAKE2b-256 4b2c8764b1c2f3b5611586f28008631ee89f6fe8234e2c4d893aead1128978b6

See more details on using hashes here.
