JSON formatter for logging


This module provides a JSON formatter for the Python logging module that formats log records as JSON strings.

Using this formatter gives you output in the proper format for log aggregators such as Splunk or Elasticsearch, but since each record is emitted as a plain string, it can also be used for logging to stdout.

Features

Adding additional fields and values

You can add fields to every message that is logged. To do so, pass the fields parameter to the logging_json.JSONFormatter instance.

It must be a dictionary where each key is the key to append to the resulting JSON dictionary (if not already present) and each value can be one of the following:

  • The name of an attribute of the logging record (a non-exhaustive list can be found in the python logging documentation).
  • Any other value, which will be attached to the key as-is (a static value), as in the sketch below.
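
For example, a minimal sketch mixing a record attribute with a static value (the environment key and its value are purely illustrative):

import logging_json

formatter = logging_json.JSONFormatter(fields={
    "level_name": "levelname",    # resolved from the record attribute of the same name
    "environment": "production",  # not a record attribute: attached as a static value
})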

Logging exceptions, a specific case

If an exception is logged, the exception key will be appended to the resulting JSON dictionary.

The value of this key is a dictionary containing three keys:

  • type: The name of the exception class (useful when the message is blank).
  • message: The str representation of the exception (usually the provided error message).
  • stack: The stack trace, formatted as a string.

You can rename the exception key by setting the exception_field_name parameter to the new name. It is also possible to disable this behaviour entirely by setting exception_field_name to None or an empty string.
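
A minimal sketch of how an exception ends up in the output, assuming the formatter is attached to the root logger:

import logging
import logging_json

handler = logging.StreamHandler()
handler.setFormatter(logging_json.JSONFormatter())
logging.basicConfig(level=logging.INFO, handlers=[handler])

try:
    1 / 0
except ZeroDivisionError:
    logging.exception("Computation failed")
# The "exception" key of the resulting JSON holds something like:
# {"type": "ZeroDivisionError", "message": "division by zero", "stack": "Traceback ..."}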

Logging with a dictionary

This formatter allows you to log a dictionary, as in the following:

import logging

# Assuming a handler using logging_json.JSONFormatter is configured on the logger
logging.info({"key": "value", "other key": "other value"})

The resulting JSON dictionary will be the one you provided (with the additional fields).

Logging with anything else (such as a string)

Anything not logged as a dictionary is handled by the standard formatter and results in one of two outputs:

  • A JSON dictionary, if additional fields are set or if the extra parameter is used while logging, with the message available in the message key of the resulting JSON dictionary. The default message key name can be changed via the message_field_name parameter of the logging_json.JSONFormatter instance.
  • The formatted record, if no additional fields are set.

This handles the usual string logging as in the following:

import logging

logging.info("This is my message")
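
When additional fields are configured or extra is used, that string ends up under the message key; a sketch (the user_id key is purely illustrative):

import logging

# Assuming a handler with logging_json.JSONFormatter is already configured
logging.info("User logged in", extra={"user_id": 42})
# Could produce something like: {"message": "User logged in", "user_id": 42}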

Changing asctime representation

You can override the default representation of asctime (2003-07-08 16:49:45,896) in two different scenarios:

Without milliseconds

Set the datefmt parameter.

Setting datefmt to %Y-%m-%dT%H:%M:%S would result in 2003-07-08T16:49:45.
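
Since datefmt is the standard logging.Formatter parameter, a sketch would pass it at construction (assuming JSONFormatter forwards it as usual):

import logging_json

formatter = logging_json.JSONFormatter(datefmt="%Y-%m-%dT%H:%M:%S")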

With milliseconds

Set default_time_format to something other than %Y-%m-%d %H:%M:%S to change the part of the representation without milliseconds. Set default_msec_format to something other than %s,%03d to change the milliseconds representation. Note that %s in default_msec_format is replaced by the representation without milliseconds.

Setting default_time_format to %Y-%m-%dT%H:%M:%S and default_msec_format to %s.%03d would result in 2003-07-08T16:49:45.896.
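
As with the standard logging.Formatter, these two names are attributes rather than constructor arguments, so a minimal sketch sets them on the instance:

import logging_json

formatter = logging_json.JSONFormatter()
# Standard logging.Formatter attributes, overridden per instance
formatter.default_time_format = "%Y-%m-%dT%H:%M:%S"
formatter.default_msec_format = "%s.%03d"
# asctime would then render as 2003-07-08T16:49:45.896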

Configuration

You can create a formatter instance yourself as in the following, or you can use a logging configuration.

import logging_json

formatter = logging_json.JSONFormatter(fields={
    "level_name": "levelname",
    "thread_name": "threadName",
    "process_name": "processName"
})
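
The instance is then attached to a handler like any other formatter:

import logging

handler = logging.StreamHandler()
handler.setFormatter(formatter)

root_logger = logging.getLogger()
root_logger.addHandler(handler)
root_logger.setLevel(logging.INFO)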

Using logging.config.dictConfig

You can configure your logging as advertised by python, using the logging.config.dictConfig function.

dict configuration

import logging.config

logging.config.dictConfig({
    "version": 1,
    "formatters": {
        "json": {
            "()": "logging_json.JSONFormatter",
            "fields": {
                "level_name": "levelname",
                "thread_name": "threadName",
                "process_name": "processName"
            }
        }
    },
    "handlers": {
        "standard_output": {
            "class": "logging.StreamHandler",
            "formatter": "json",
            "stream": "ext://sys.stdout"
        }
    },
    "loggers": {
        "my_app": {"level": "DEBUG"}
    },
    "root": {
        "level": "INFO",
        "handlers": ["standard_output"]
    }
})
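
Once this configuration is applied, loggers emit JSON through the standard API, for example (the component key is illustrative):

import logging

logger = logging.getLogger("my_app")
logger.info("Application started", extra={"component": "startup"})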

YAML logging configuration

You can use YAML to store your logging configuration, as in the following sample:

import logging.config
import yaml

with open("path/to/logging_configuration.yaml", "r") as config_file:
    logging.config.dictConfig(yaml.safe_load(config_file))

where logging_configuration.yaml is a file containing, for example:

version: 1
formatters:
  json:
    '()': logging_json.JSONFormatter
    fields:
      level_name: levelname
      thread_name: threadName
      process_name: processName
handlers:
  standard_output:
    class: logging.StreamHandler
    formatter: json
    stream: ext://sys.stdout
loggers:
  my_app:
    level: DEBUG
root:
  level: INFO
  handlers: [standard_output]

How to install

  1. Python 3.7+ must be installed.
  2. Use pip to install the module:
python -m pip install logging_json
