Python logging utilities

This package implements some useful logging utilities. The main features of the package are:

  • JSON formatter
  • Text formatter with extra
  • Flask request context record attributes
  • Jsonify Django request record attribute
  • ISO Time in format YYYY-MM-DDThh:mm:ss.sss±hh:mm
  • Add constant record attributes
  • Logger Level Filter

All features can be fully configured from the configuration file.

NOTE: only Python 3 is supported

:warning: Version 2.x.x introduces BREAKING CHANGES, see Breaking Changes.

Installation

logging_utilities is available on PyPI.

Use pip to install:

pip install logging-utilities

Release and Publish

Only owners are allowed to publish a new version to PyPI. To publish a new version, follow the procedure below:

  1. Increase the VERSION in logging_utilities/__init__.py

    • Major version for breaking changes in the user interface (no backward compatibility)
    • Minor version for new features
    • Patch version for bug fixes
    • For an alpha version, append alpha1 to VERSION
  2. Commit and push the changes to the develop branch

  3. Merge develop to master

  4. From the master branch run

    summon -p gopass --up make publish
    

NOTE: this requires having summon, gopass and the correct secrets.yml file in a parent folder.

Contribution

Every contribution to this library is welcome! So if you find a bug or want to add a new feature, feel free to open an issue or create a Pull Request.

Any contribution must follow the git-flow.

Developer

You can quickly set up your environment with the makefile:

make setup

This will create a virtual Python environment with all packages required for development.

Note that for pull requests, the code MUST BE formatted with yapf and it MUST PASS the linter. For this you can use the make targets:

make format
make lint
# or
make format-lint

Any new feature should come with its own unit test class.

Ignore missing log record attribute in formatter

When configuring a log formatter, you can reference any log record attribute, including extra attributes, via printf-style placeholders. However, if such an extra attribute is missing from a record (e.g. because the logger did not add that extra), then the logging would raise a ValueError: Formatting field not found in record: ....

For the standard Formatter you can use the Extra Formatter, but for any other Formatter you can use the global logging_utilities.log_record.set_log_record_ignore_missing_factory() method.

LogRecordIgnoreMissing

The LogRecordIgnoreMissing factory can be used to avoid a ValueError exception when formatting a log message from a log record that doesn't have the extra attributes required by the formatter.

For example:

import logging

logging.basicConfig(format="%(message)s - %(extra_param)s", level=logging.INFO, force=True)

logger = logging.getLogger('my-logger')

logger.info('My message', extra={'extra_param': 20})
My message - 20

logger.info('My second message')
--- Logging error ---
Traceback (most recent call last):
  File "/usr/lib/python3.8/logging/__init__.py", line 440, in format
    return self._format(record)
  File "/usr/lib/python3.8/logging/__init__.py", line 436, in _format
    return self._fmt % record.__dict__
KeyError: 'extra_param'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.8/logging/__init__.py", line 1085, in emit
    msg = self.format(record)
  File "/usr/lib/python3.8/logging/__init__.py", line 929, in format
    return fmt.format(record)
  File "/usr/lib/python3.8/logging/__init__.py", line 671, in format
    s = self.formatMessage(record)
  File "/usr/lib/python3.8/logging/__init__.py", line 640, in formatMessage
    return self._style.format(record)
  File "/usr/lib/python3.8/logging/__init__.py", line 442, in format
    raise ValueError('Formatting field not found in record: %s' % e)
ValueError: Formatting field not found in record: 'extra_param'
...

To avoid such a crash you can use LogRecordIgnoreMissing, which replaces missing extra attributes with an empty string in the message.

import logging
from logging_utilities.log_record import LogRecordIgnoreMissing

logging.setLogRecordFactory(LogRecordIgnoreMissing)

logging.basicConfig(format="%(message)s - %(extra_param)s", level=logging.INFO, force=True)

logger = logging.getLogger('my-logger')

logger.info('My message', extra={'extra_param': 20})
My message - 20

logger.info('My second message')
My second message -

You can also change the default value by using the helper set_log_record_ignore_missing_factory():

import logging
from logging_utilities.log_record import set_log_record_ignore_missing_factory

set_log_record_ignore_missing_factory('my-default')

logging.basicConfig(format="%(message)s - %(extra_param)s", level=logging.INFO, force=True)

logger = logging.getLogger('my-logger')

logger.info('My message', extra={'extra_param': 20})
My message - 20

logger.info('My second message')
My second message - my-default

:warning: NOTE that setting the log record factory is a global action that affects every logger and formatter

JSON Formatter

JsonFormatter is a Python logging formatter that transforms the log output into a JSON object.

The JSON log format is especially useful when the logs are sent to LogStash.

This formatter supports embedded objects as well as arrays.

Configure JSON Format

The format can be configured either using the format config parameter or the fmt constructor parameter. This parameter should be a dictionary (for Python versions below 3.7, it is better to use OrderedDict to keep the attribute order). Each key is used as-is as a key of the output JSON object, while each value is transformed as follows in the output:

  • LogRecord attribute (string): the string is a LogRecord attribute name; the value of this attribute is used as output. Example: "message"
  • LogRecord attribute dotted key (string): the string is a dotted key used to access a sub-key of a LogRecord dictionary attribute. For example, if the LogRecord contains a dictionary attribute added via an extra, you can use the dotted notation to access only a sub-object/value of this dictionary. If the dotted key attribute doesn't exist, the default value '' is used, unless the dotted key has a trailing ., in which case the default value is {} instead of ''. The trailing dot can also be used to differentiate between a constant and a dictionary LogRecord attribute: for example, if some logs add a dictionary called headers to the extra but not all logs contain this extra, you can define it in fmt as request_headers: headers.; this way headers. is replaced by the extra headers dictionary when available, and by an empty dictionary otherwise (without the trailing dot it would have been treated as a constant). Example: "request.path"
  • Named string format (string): the string contains named string formats; each named format is replaced by the corresponding LogRecord attribute value. Example: "%(asctime)s.%(msecs)s"
  • String constant (string): if the string value doesn't match any of the above, it is added as a constant. Example: "my-constant-value"
  • Object (dict): the object is embedded in the output, with its values following the same rules as defined in this table. Example: {"lineno": "lineno", "file": "filename", "id": "%(process)x/%(thread)x", "message": "message"}
  • Array (list): the list is embedded as an array in the output; each value is processed using the rules from this table. Example: ["created", "asctime", "message", "%(process)x/%(thread)x"]

You can find the list of LogRecord attributes in the Python Doc.

See below the Basic Usage for more examples.

JSON Formatter Options

You can change some behavior using the JsonFormatter constructor:

  • fmt (dict, default: {'levelname': 'levelname', 'name': 'name', 'message': 'message'}): defines the output format, see Configure JSON Format.
  • datefmt (string, default: None): date format for asctime, see time.strftime().
  • style (string, default: %): string formatting style, see logging.Formatter.
  • add_always_extra (bool, default: False): when True, logging extras (logging.log('message', extra={'my-extra': 'some value'})) are always added to the output; otherwise they are only added if present in fmt.
  • filter_attributes (list, default: None): when the formatter is used with a logging.Filter that adds LogRecord attributes, they can be listed here to avoid being treated as logging extras.
  • remove_empty (bool, default: False): when True, empty values (empty list, dict, None or empty string) are removed from the output.
  • ignore_missing (bool, default: False): when True, all extra attributes referenced by fmt that are missing from the log record are replaced by an empty string instead of raising a ValueError exception. NOTE: this has an impact on all formatters, not only this one, see LogRecordIgnoreMissing.
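
As a quick illustration, here is a minimal sketch of a programmatic setup using some of these options (the logger name and extra are arbitrary examples):

import logging

from logging_utilities.formatters.json_formatter import JsonFormatter

# Always include log extras in the output and drop empty values.
formatter = JsonFormatter(
    fmt={'time': 'asctime', 'level': 'levelname', 'logger': 'name', 'message': 'message'},
    datefmt='%Y-%m-%dT%H:%M:%S',
    add_always_extra=True,
    remove_empty=True,
)

handler = logging.StreamHandler()
handler.setFormatter(formatter)

logger = logging.getLogger('my-app')
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info('Something happened', extra={'request_id': 'abc-123'})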

The constructor parameters can also be specified in the log configuration file using the () class specifier instead of class:

formatters:
  json:
    (): logging_utilities.formatters.json_formatter.JsonFormatter
    add_always_extra: True
    fmt:
      time: asctime
      level: levelname
      logger: name
      module: module
      message: message

:warning: When using the INI file format as documented here, you cannot use the JSON formatter options described above; you have to configure the formatter using the class, format, datefmt and style attributes like below:

[formatters]
keys = my_json

[formatter_my_json]
class = logging_utilities.formatters.json_formatter.JsonFormatter
format: {
        "time": "asctime",
        "level": "levelname",
        "logger": "name",
        "module": "module",
        "function": "funcName",
        "pid_tid": "%(process)x/%(thread)x",
        "message": "message",
        "exc_info": "exc_info"
    } # OPTIONAL
datefmt = %Y-%m-%d %H:%M # OPTIONAL
style = % # OPTIONAL

Extra Formatter

This formatter enhances the Python standard formatter to allow working with log extras. When adding an extra keyword to the format, the Python standard formatter raises a ValueError when this keyword is missing from the log record. This means that if you want to display a log extra, you have to make sure that every log message contains this extra.

This formatter allows you to provide an extra_fmt parameter that adds record extras to the log message when available. You can either add the entire extra dictionary (extra_fmt='%s') or only some extras (extra_fmt='%(extra1)s:%(extra2)s'). In the latter case, when a key is missing from the extra, the value is replaced by extra_default.

When using the whole extra dictionary, you can use extra_pretty_print to improve the formatting; note that in this case the log might span multiple lines (this uses pprint.pformat).

See logging.Logger.debug for more information on logging extras.

Extra Formatter Constructor

It supports the same arguments as logging.Formatter plus the following:

  • extra_fmt (None|str, default: None): when not None, adds the extras at the end of the log message. Either use named placeholders with the extra keywords or add the whole extra dictionary using %s.
  • extra_default (None|str, default: ''): when extra_fmt contains named placeholders and one or more of these placeholders are not found in the log record, the formatter uses this default value instead.
  • extra_pretty_print (boolean, default: False): when extra_fmt='%s', you can set this flag to True to use pprint.pformat on the dictionary.
  • pretty_print_kwargs (None|dict, default: None): kwargs as a dictionary to pass to pprint.pformat.
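
For illustration, a minimal programmatic sketch using these parameters (the logger name and extras are arbitrary examples):

import logging

from logging_utilities.formatters.extra_formatter import ExtraFormatter

# Append the `extra1` extra to each message; when it is missing, fall back to extra_default.
formatter = ExtraFormatter(
    fmt='%(levelname)s - %(name)s - %(message)s',
    extra_fmt=' - extra1=%(extra1)s',
    extra_default='n/a',
)

handler = logging.StreamHandler()
handler.setFormatter(formatter)

logger = logging.getLogger('my-app')
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)

logger.debug('With extra', extra={'extra1': 23})  # ... - extra1=23
logger.debug('Without extra')                     # ... - extra1=n/a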

Extra Formatter Config Example

formatters:
  standard:
    (): logging_utilities.formatters.extra_formatter.ExtraFormatter
    format: "%(levelname)s - %(name)s - %(message)s"
    extra_fmt: " - extra:\n%s"
    extra_pretty_print: True

NOTE: ExtraFormatter only supports the special '()' factory key in the configuration file (it doesn't work with the normal 'class' key).

Flask Request Context

When using logging within a Flask application, you can use this Filter to add request context attributes to every LogRecord.

All Flask Request attributes are supported; they are added as LogRecord attributes with the flask_request_ prefix. See Flask Request for more details on the available attributes.

Flask Request Context Filter Constructor

  • attributes (list, default: None): list of Flask Request attribute names to add to the LogRecord.
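
For illustration, a minimal programmatic sketch (it assumes the flask package is installed; the attribute list and logger name are arbitrary examples):

import logging

from flask import Flask
from logging_utilities.filters.flask_attribute import FlaskRequestAttribute

app = Flask('test')

handler = logging.StreamHandler()
# The filter adds flask_request_url and flask_request_method to every record it sees.
handler.addFilter(FlaskRequestAttribute(attributes=['url', 'method']))
handler.setFormatter(logging.Formatter('%(message)s - %(flask_request_url)s %(flask_request_method)s'))

logger = logging.getLogger('my-app')
logger.addHandler(handler)
logger.setLevel(logging.INFO)

with app.test_request_context('/path/test', method='GET'):
    logger.info('Inside a request context')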

Flask Request Context Config Example

version: 1

root:
  handlers:
    - console
  level: DEBUG
  propagate: True

filters:
  flask:
    (): logging_utilities.filters.flask_attribute.FlaskRequestAttribute
    attributes:
      - url
      - method
      - headers
      - json

formatters:
  console:
    format: "%(asctime)s - %(message)s - %(flask_request_url)s %(flask_request_method)s %(flask_request_headers)s: %(flask_request_json)s"

handlers:
  console:
    class: logging.StreamHandler
    formatter: console
    stream: ext://sys.stdout
    filters:
      - flask

NOTE: FlaskRequestAttribute only supports the special '()' factory key in the configuration file (it doesn't work with the normal 'class' key).

Jsonify Django Request

If you want to log the Django HttpRequest object using the JSON Formatter, this filter is made for you. It converts the record.request attribute to a valid JSON object, or to a string if the attribute is not an HttpRequest instance. It is also useful when using Django with the JSON Formatter because, in some of its logs, Django adds either an HttpRequest or a socket object to the log extra.

The HttpRequest attributes that are converted can be configured using the include_keys and/or exclude_keys filter parameters. This can be useful if you want to limit the log data, for example if you don't want to log Authentication headers.

Usage

Add the filter to the log handler and then simply add the HttpRequest to the log extra as follows:

logger.info('My message', extra={'request': request})

Django Request Filter Constructor

  • include_keys (list, default: None): all request attributes that match any of the dotted keys of the list will be jsonified in record.request. When None, all attributes are added (default behavior).
  • exclude_keys (list, default: None): all request attributes that match any of the dotted keys of the list will not be added to the jsonified record.request. NOTE: this has precedence over include_keys, which means that if a key is in both lists, it is not added.
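
For illustration, a minimal programmatic sketch of the filter configuration (the key lists are arbitrary examples, see also the config example below):

import logging

from logging_utilities.filters.django_request import JsonDjangoRequest

handler = logging.StreamHandler()
# Jsonify only selected parts of the request and never the Authorization header.
handler.addFilter(JsonDjangoRequest(
    include_keys=['request.path', 'request.method', 'request.headers'],
    exclude_keys=['request.headers.Authorization'],
))

logger = logging.getLogger('my-app')
logger.addHandler(handler)

# `request` would be a django.http.HttpRequest instance, see Usage above:
# logger.info('My page requested', extra={'request': request})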

Django Request Config Example

filters:
  django:
    (): logging_utilities.filters.django_request.JsonDjangoRequest
    include_keys:
      - request.META.REQUEST_METHOD
      - request.META.SERVER_NAME
      - request.environ
    exclude_keys:
      - request.META.SERVER_NAME
      - request.environ.wsgi

NOTE: JsonDjangoRequest only supports the special '()' factory key in the configuration file (it doesn't work with the normal 'class' key).

ISO Time with Timezone

The standard logging doesn't support the time as ISO with timezone: YYYY-MM-DDThh:mm:ss.sss±hh:mm. By default asctime uses an ISO-like format, YYYY-MM-DD hh:mm:ss.sss, but without the T separator (although this could be configured by overriding a global variable, it can't be done from a config file). You can use the datefmt option to specify another date format; however it doesn't support milliseconds, so the best you could achieve is YYYY-MM-DDThh:mm:ss±hh:mm.

This Filter can be used to achieve the full ISO 8601 Time format including timezone and milliseconds.

ISO Time Filter Constructor

  • isotime (bool, default: True): adds the log local time to the LogRecord as the isotime attribute, in the YYYY-MM-DDThh:mm:ss.sss±hh:mm format.
  • utc_isotime (bool, default: False): adds the log UTC time to the LogRecord as the utc_isotime attribute, in the YYYY-MM-DDThh:mm:ss.sss±hh:mm format.
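
For illustration, a minimal programmatic sketch (the logger name and format string are arbitrary examples):

import logging

from logging_utilities.filters import TimeAttribute

handler = logging.StreamHandler()
# The filter adds the `utc_isotime` (and/or `isotime`) attribute to every record it sees.
handler.addFilter(TimeAttribute(utc_isotime=True, isotime=False))
handler.setFormatter(logging.Formatter('%(utc_isotime)s - %(message)s'))

logger = logging.getLogger('my-app')
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info('Hello')  # e.g. 2022-07-20T08:09:10.765237+00:00 - Hello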

ISO Time Config Example

filters:
  isotime:
    (): logging_utilities.filters.TimeAttribute
    utc_isotime: True
    isotime: False

NOTE: TimeAttribute only supports the special '()' factory key in the configuration file (it doesn't work with the normal 'class' key).

Constant Record Attribute

Simple logging Filter that adds constant attributes to every LogRecord.
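
For illustration, a minimal programmatic sketch, assuming that every keyword argument passed to the filter becomes a constant LogRecord attribute (as suggested by the config example below):

import logging

from logging_utilities.filters import ConstAttribute

handler = logging.StreamHandler()
# Assumption: each keyword argument becomes a constant attribute on every record.
handler.addFilter(ConstAttribute(application='my-application'))
handler.setFormatter(logging.Formatter('%(application)s - %(message)s'))

logger = logging.getLogger('my-app')
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info('Hello')  # my-application - Hello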

Constant Record Attribute Config Example

filters:
  application:
    (): logging_utilities.filters.ConstAttribute
    application: my-application

NOTE: ConstAttribute only supports the special '()' factory key in the configuration file (it doesn't work with the normal 'class' key).

Logger Level Filter

Sometimes you might want a different log level depending on both the logger and the handler. The standard logging library allows you to set a level per logger or per handler, but not based on both. Say you have a config with two loggers logging to two handlers: on the first handler you want all messages of both loggers, while on the second handler you want all messages of the first logger but only the WARNING messages of the second logger. This is where this filter comes into play.

Logger Level Filter Constructor

  • level (int | string, default: 'DEBUG'): all messages with a lower level than this one are filtered out.
  • logger (string, default: ''): when non-empty, only messages from this logger are filtered out based on their level.
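
For illustration, a minimal programmatic sketch of the two-logger scenario described above (the logger names A and B match the config example below):

import logging

from logging_utilities.filters import LevelFilter

# On this handler, keep everything from logger 'A' but only WARNING and above from logger 'B'.
handler = logging.StreamHandler()
handler.addFilter(LevelFilter(level='WARNING', logger='B'))

root = logging.getLogger()
root.setLevel(logging.DEBUG)
root.addHandler(handler)

logging.getLogger('A').debug('shown')
logging.getLogger('B').debug('filtered out by the LevelFilter')
logging.getLogger('B').warning('shown')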

Logger Level Filter Config Example

root:
  handlers:
    - "console"
    - "file"
  level: "DEBUG"
  propagate: "True"

filters:
  B_filter:
    (): logging_utilities.filters.LevelFilter
    level: "WARNING"
    logger: 'B'

loggers:
  A:
    level: "DEBUG"
  B:
    level: "DEBUG"

handlers:
  console:
    class: "logging.StreamHandler"

  file:
    class: "logging.handlers.RotatingFileHandler"
    filters:
      - "B_filter"

NOTE: LevelFilter only supports the special '()' factory key in the configuration file (it doesn't work with the normal 'class' key).

Basic Usage

Case 1. Simple JSON Output

import logging

from logging_utilities.formatters.json_formatter import basic_config

# default keyword parameter `format`: """{"levelname": "levelname", "name": "name", "message": "message"}"""
basic_config(level=logging.INFO)
logging.info('hello, json_formatter')

output:

{"levelname": "INFO", "name": "root", "message": "hello, json_formatter"}

Case 2. JSON Output Configured within Python Code

import logging

from logging_utilities.formatters.json_formatter import JsonFormatter

# `FORMAT` can be `json`, `OrderedDict` or `dict`.
# If `FORMAT` is `dict` and python version < 3.7.0, the output order is sorted by keys, otherwise it will be the same
# as the defined order.
#
# KEY := string, can be whatever you like.
# VALUE := `LogRecord` attribute name, string, formatted string (e.g. "%(asctime)s.%(msecs)s"), list or dict
FORMAT = {
    "Name":            "name",
    "Levelno":         "levelno",
    "Levelname":       "levelname",
    "Pathname":        "pathname",
    "Filename":        "filename",
    "Module":          "module",
    "Lineno":          "lineno",
    "FuncName":        "funcName",
    "Created":         "created",
    "Asctime":         "asctime",
    "Msecs":           "msecs",
    "RelativeCreated": "relativeCreated",
    "Thread":          "thread",
    "ThreadName":      "threadName",
    "Process":         "process",
    "Message":         "message"
}

root = logging.getLogger()
root.setLevel(logging.INFO)

formatter = JsonFormatter(FORMAT)

sh = logging.StreamHandler()
sh.setFormatter(formatter)
sh.setLevel(logging.INFO)

root.addHandler(sh)

def test():
  root.info("test %s format", 'string')

test()

output:

{
  "Name": "root", 
  "Levelno": 20, 
  "Levelname": "INFO", 
  "Pathname": "test.py", 
  "Filename": "test.py", 
  "Module": "test", 
  "Lineno": 75, 
  "FuncName": "test", 
  "Created": 1588185267.3198836, 
  "Asctime": "2020-04-30 02:34:27,319", 
  "Msecs": 319.8835849761963, 
  "RelativeCreated": 88.2880687713623, 
  "Thread": 16468, 
  "ThreadName": "MainThread", 
  "Process": 16828, 
  "Message": "test string format"
}

Case 3. JSON Output Configured with a YAML File

config.yaml:

version: 1

root:
  handlers:
    - console
  level: DEBUG
  propagate: True

formatters:
  json:
    class: logging_utilities.formatters.json_formatter.JsonFormatter
    format:
      time: asctime
      level: levelname
      logger: name
      module: module
      function: funcName
      process: process
      thread: thread
      message: message

handlers:
  console:
    class: logging.StreamHandler
    formatter: json
    stream: ext://sys.stdout

Then in your Python code use it as follows:

import logging
import logging.config

import yaml


config = {}
with open('example-config.yaml', 'r') as fd:
    config = yaml.safe_load(fd.read())

logging.config.dictConfig(config)

root = logging.getLogger()
root.info('Test file config')

output:

{
  "function": "<module>", 
  "level": "INFO", 
  "logger": "root", 
  "message": "Test file config", 
  "module": "<stdin>", 
  "process": 12264, 
  "thread": 139815989413696, 
  "time": "asctime"
}

Case 4. Add Flask Request Context Attributes to JSON Output

config.yaml

version: 1

root:
  handlers:
    - console
  level: DEBUG
  propagate: True

filters:
  isotime:
    (): logging_utilities.filters.TimeAttribute
  flask:
    (): logging_utilities.filters.flask_attribute.FlaskRequestAttribute
    attributes:
      - url
      - method
      - headers
      - remote_addr
      - json

formatters:
  json:
    class: logging_utilities.formatters.json_formatter.JsonFormatter
    format:
      time: isotime
      level: levelname
      logger: name
      module: module
      function: funcName
      process: process
      thread: thread
      request:
        # We use the "%()s" notation here to ensure a string output; also, if the LogRecord
        # has no flask context, meaning no `flask_request_url` attribute, the "%()s" notation
        # ensures we get an empty string instead of treating `flask_request_url` as a string
        # constant.
        url: "%(flask_request_url)s"
        method: "%(flask_request_method)s"
        # We use a trailing dot here to ensure a dictionary output even if the LogRecord
        # doesn't have a flask_request_headers attribute.
        headers: flask_request_headers.
        data: flask_request_json.
        remote: "%(flask_request_remote_addr)s"
      message: message

handlers:
  console:
    class: logging.StreamHandler
    formatter: json
    stream: ext://sys.stdout
    filters:
      - isotime
      - flask

NOTE: This requires the flask package to be installed, otherwise it raises an ImportError.

Then in your Python code use it as follows:

import logging
import logging.config

import yaml
from flask import Flask


config = {}
with open('example-config.yaml', 'r') as fd:
    config = yaml.safe_load(fd.read())

logging.config.dictConfig(config)

app = Flask('test')

root = logging.getLogger()

with app.test_request_context("path/test", method='GET', headers={"Accept": "*/*"}):
  root.info('Test file config')

output:

{
  "time": "2022-07-20T10:09:10.765237+02:00", 
  "level": "INFO",
  "logger": "root", 
  "module": "<stdin>", 
  "function": "<module>", 
  "process": 58043, 
  "thread": 139717802334016, 
  "request": {
    "url": "http://localhost/path/test", 
    "method": "GET", 
    "headers": {
      "Host": "localhost", 
      "Accept": "*/*"
    }, 
    "data": null, 
    "remote": null
  }, 
  "message": "Test file config"
}

Case 5. Add Django Request to JSON Output

config.yaml

version: 1

root:
  handlers:
    - console
  level: DEBUG
  propagate: True

filters:
  isotime:
    (): logging_utilities.filters.TimeAttribute
  django:
    (): logging_utilities.filters.django_request.JsonDjangoRequest
    include_keys:
      - request.path
      - request.method
      - request.headers
    exclude_keys:
      - request.headers.Authorization
      - request.headers.Proxy-Authorization

formatters:
  json:
    class: logging_utilities.formatters.json_formatter.JsonFormatter
    format:
      time: isotime
      level: levelname
      logger: name
      module: module
      function: funcName
      process: process
      thread: thread
      request: request
      response: response
      message: message

handlers:
  console:
    class: logging.StreamHandler
    formatter: json
    stream: ext://sys.stdout
    filters:
      - isotime
      - django

NOTE: This requires the django package to be installed, otherwise it raises an ImportError.

Then in your Python code use it as follows:

import logging
import logging.config

import yaml

from django.http import JsonResponse
from django.conf import settings
from django.test import RequestFactory


config = {}
with open('example-config.yaml', 'r') as fd:
    config = yaml.safe_load(fd.read())

logging.config.dictConfig(config)

logger = logging.getLogger('your_logger')

def my_page(request):
    answer = {'success': True}
    logger.info('My page requested', extra={'request': request, 'response': answer})
    return JsonResponse(answer)

settings.configure()
factory = RequestFactory()

my_page(factory.get('/my_page?test=true'))

output:

{
  "function": "my_page", 
  "level": "INFO", 
  "logger": "your_logger", 
  "message": "My page requested", 
  "module": "<stdin>", 
  "process": 20421, 
  "request": {
    "method": "GET", 
    "path": "/my_page", 
    "headers": {
      "Cookie": ""
    }
  }, 
  "response": {
    "success": true
  }, 
  "thread": 140433370822464, 
  "time": "2020-10-12T16:44:45.374508+02:00"
}

Case 6. Add parts of Django Request to JSON Output

config.yaml

version: 1

root:
  handlers:
    - console
  level: DEBUG
  propagate: True

filters:
  isotime:
    (): logging_utilities.filters.TimeAttribute
  django:
    (): logging_utilities.filters.django_request.JsonDjangoRequest
    include_keys:
      - request.path
      - request.method
      - request.headers

formatters:
  json:
    class: logging_utilities.formatters.json_formatter.JsonFormatter
    format:
      time: isotime
      level: levelname
      logger: name
      module: module
      function: funcName
      process: process
      thread: thread
      request_path: request.path
      request_method: request.method
      request:
        # NOTE: django headers name are case sensitive
        header.accept: request.headers.Accept
        header.accept-encoding: request.headers.Accept-Encoding 
        header.accept_language: request.headers.Accept-Language 
      message: message

handlers:
  console:
    class: logging.StreamHandler
    formatter: json
    stream: ext://sys.stdout
    filters:
      - isotime
      - django

NOTE: This requires the django package to be installed, otherwise it raises an ImportError.

Then in your Python code use it as follows:

import logging
import logging.config

import yaml

from django.http import JsonResponse
from django.conf import settings
from django.test import RequestFactory


config = {}
with open('example-config.yaml', 'r') as fd:
    config = yaml.safe_load(fd.read())

logging.config.dictConfig(config)

logger = logging.getLogger('your_logger')

def my_page(request):
    answer = {'success': True}
    logger.info('My page requested', extra={'request': request})
    return JsonResponse(answer)

settings.configure()
factory = RequestFactory()

my_page(factory.get(
    '/my_page?test=true', 
    HTTP_ACCEPT='*/*', 
    HTTP_ACCEPT_ENCODING='gzip', 
    HTTP_ACCEPT_LANGUAGE='en')
)

output:

{
  "time": "2022-07-20T12:29:19.536922+02:00",
  "level": "INFO",
  "logger": "your_logger",
  "module": "<stdin>",
  "function": "my_page",
  "process": 78479,
  "thread": 139751209555776,
  "request_path": "/my_page",
  "request_method": "GET",
  "request": {
    "header.accept": "*/*",
    "header.accept-encoding": "gzip",
    "header.accept_language": "en"
  },
  "message": "My page requested"
}

Case 7. Add all Log Extra as Dictionary to the Standard Formatter (including Django log extra)

config.yaml

version: 1

root:
  handlers:
    - console
  level: DEBUG
  propagate: True

filters:
  isotime:
    (): logging_utilities.filters.TimeAttribute
  django:
    (): logging_utilities.filters.django_request.JsonDjangoRequest
    include_keys:
      - request.path
      - request.method
      - request.headers
    exclude_keys:
      - request.headers.Authorization
      - request.headers.Proxy-Authorization

formatters:
  standard_extra:
    (): logging_utilities.formatters.extra_formatter.ExtraFormatter
    # NOTE: although the constructor parameter is `fmt`, we need to use `format` here
    format: "%(isotime)s - %(levelname)s - %(name)s - %(message)s"
    extra_fmt: " - extra:\n%s"
    extra_pretty_print: True
    pretty_print_kwargs:
      indent: 2
      width: 60

handlers:
  console:
    class: logging.StreamHandler
    formatter: standard_extra
    stream: ext://sys.stdout
    filters:
      - isotime
      - django

NOTE: This requires the django package to be installed, otherwise it raises an ImportError.

Then in your Python code use it as follows:

#!.venv/bin/python3
import logging
import logging.config

import yaml

from django.http import JsonResponse
from django.conf import settings
from django.test import RequestFactory


config = {}
with open('example-config.yaml', 'r') as fd:
    config = yaml.safe_load(fd.read())

logging.config.dictConfig(config)

logger = logging.getLogger('your_logger')

def my_page(request):
    answer = {'success': True}
    logger.info('My page requested', extra={'request': request, 'response': answer})
    return JsonResponse(answer)

settings.configure()
factory = RequestFactory()

my_page(factory.get('/my_page?test=true'))

output:

2020-11-19T13:32:58.942568+01:00 - INFO - your_logger - My page requested - extra:
{ 'request': { 'headers': {'Cookie': ''},
               'method': 'GET',
               'path': '/my_page'},
  'response': {'success': True}}

Case 8. Add Specific Log Extra to the Standard Formatter

config.yaml

version: 1

root:
  handlers:
    - console
  level: DEBUG
  propagate: True

formatters:
  standard_extra:
    (): logging_utilities.formatters.extra_formatter.ExtraFormatter
    # NOTE: although the constructor parameter is `fmt`, we need to use `format` here
    format: "%(asctime)s - %(levelname)s - %(name)s - %(message)s"
    extra_fmt: " - extra1=%(extra1)s"

handlers:
  console:
    class: logging.StreamHandler
    formatter: standard_extra
    stream: ext://sys.stdout

Then in your Python code use it as follows:

#!.venv/bin/python3
import logging
import logging.config

import yaml

config = {}
with open('example-config.yaml', 'r') as fd:
    config = yaml.safe_load(fd.read())

logging.config.dictConfig(config)

logger = logging.getLogger('your_logger')

logger.debug('My log with extras', extra={'extra1': 23, 'extra2': "don't add this"})

output:

2020-11-19 13:42:29,424 - DEBUG - your_logger - My log with extras - extra1=23

Version 2.x.x Breaking Changes

From version 1.x.x to version 2.x.x there is the following breaking change:

  • The Flask Attribute filter no longer sets missing Flask attributes to an empty string! So if you configure the Flask attributes, you must make sure that all attributes specified in the attributes list exist. Also, if you use the filter on a logger outside of a Flask Request context, the logger will raise a ValueError exception due to the missing Flask Request attribute. To avoid this you can use the new LogRecordIgnoreMissing.

Credits

The JSON Formatter implementation has been inspired by MyColorfulDays/jsonformatter
