
A Python package which supports global logfmt formatted logging.


Python Logfmter

Supports Python 3.7 through 3.12.

Using the stdlib logging module and without changing a single logging call, logfmter supports global (first and third party) logfmt structured logging.

> logging.warning("user created", extra=user)

at=WARNING msg="user created" first_name=John last_name=Doe age=25

Table of Contents

  1. Why
  2. Install
  3. Usage
    1. Integration
    2. Configuration
    3. Extension
    4. Guides
  4. Development
    1. Required Software
    2. Getting Started
    3. Publishing

Why

  • enables both human and computer readable logs, recommended as a "best practice" by Splunk
  • formats all first and third party logs, you never have to worry about a library using a different logging format
  • simple to integrate into any existing application; unlike e.g. structlog, it requires no changes to existing log statements

Install

$ pip install logfmter

Usage

This package exposes a single Logfmter class that can be integrated into the standard library logging system like any logging.Formatter.

Integration

basicConfig

import logging
from logfmter import Logfmter

handler = logging.StreamHandler()
handler.setFormatter(Logfmter())

logging.basicConfig(handlers=[handler])

logging.error("hello", extra={"alpha": 1}) # at=ERROR msg=hello alpha=1
logging.error({"token": "Hello, World!"}) # at=ERROR token="Hello, World!"
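Note in the second call that a value containing a space is wrapped in double quotes. The quoting rule can be sketched roughly like this (a simplification for illustration; the actual formatter also handles details such as escaping embedded quotes and empty values):

```python
def format_value(value):
    # Render a value as a logfmt token, quoting it when it contains a space
    # so the resulting line remains parseable as key=value pairs.
    text = str(value)
    if " " in text:
        return f'"{text}"'
    return text

print(format_value(1))                # 1
print(format_value("Hello, World!"))  # "Hello, World!"
```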

dictConfig

If you are using dictConfig, consider your setting of disable_existing_loggers. It defaults to True, which disables any loggers created before dictConfig runs, including those of third party modules.

import logging.config

logging.config.dictConfig(
    {
        "version": 1,
        "formatters": {
            "logfmt": {
                "()": "logfmter.Logfmter",
            }
        },
        "handlers": {
            "console": {"class": "logging.StreamHandler", "formatter": "logfmt"}
        },
        "loggers": {"": {"handlers": ["console"], "level": "INFO"}},
    }
)

logging.info("hello", extra={"alpha": 1}) # at=INFO msg=hello alpha=1
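If third party modules create their loggers at import time, before dictConfig runs, you can keep them active by setting the flag explicitly. A minimal sketch of the same configuration with the flag added:

```python
import logging.config

logging.config.dictConfig(
    {
        "version": 1,
        # Keep loggers that were created before this call (e.g. by
        # third party modules imported earlier) instead of disabling them.
        "disable_existing_loggers": False,
        "formatters": {
            "logfmt": {
                "()": "logfmter.Logfmter",
            }
        },
        "handlers": {
            "console": {"class": "logging.StreamHandler", "formatter": "logfmt"}
        },
        "loggers": {"": {"handlers": ["console"], "level": "INFO"}},
    }
)
```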

Notice, you can configure the Logfmter by providing keyword arguments as dictionary items after "()":

...

    "logfmt": {
        "()": "logfmter.Logfmter",
        "keys": [...],
        "mapping": {...}
    }

...

fileConfig

Using logfmter via fileConfig is not supported, because fileConfig does not support custom formatter initialization. There may be some hacks to make this work in the future. Let me know if you have ideas or really need this.

Configuration

keys

By default, the at=<levelname> key/value will be included in all log messages. These default keys can be overridden using the keys parameter. If the key you want to include in your output is represented by a different attribute on the log record, then you can use the mapping parameter to provide that key/attribute mapping.

Reference the Python logging.LogRecord Documentation for a list of available attributes.

import logging
from logfmter import Logfmter

formatter = Logfmter(keys=["at", "processName"])

handler = logging.StreamHandler()
handler.setFormatter(formatter)

logging.basicConfig(handlers=[handler])

logging.error("hello") # at=ERROR processName=MainProcess msg=hello

mapping

By default, a mapping of {"at": "levelname"} is used to allow the at key to reference the log record's levelname attribute. You can override this parameter to provide your own mappings.

import logging
from logfmter import Logfmter

formatter = Logfmter(
    keys=["at", "process"],
    mapping={"at": "levelname", "process": "processName"}
)

handler = logging.StreamHandler()
handler.setFormatter(formatter)

logging.basicConfig(handlers=[handler])

logging.error("hello") # at=ERROR process=MainProcess msg=hello

datefmt

If you request the asctime attribute (directly or through a mapping), then the date format can be overridden through the datefmt parameter.

import logging
from logfmter import Logfmter

formatter = Logfmter(
    keys=["at", "when"],
    mapping={"at": "levelname", "when": "asctime"},
    datefmt="%Y-%m-%d"
)

handler = logging.StreamHandler()
handler.setFormatter(formatter)

logging.basicConfig(handlers=[handler])

logging.error("hello") # at=ERROR when=2022-04-20 msg=hello

Extension

You can subclass the formatter to change its behavior.

import logging
from logfmter import Logfmter


class CustomLogfmter(Logfmter):
    """
    Provide a custom logfmt formatter which formats
    booleans as "yes" or "no" strings.
    """

    @classmethod
    def format_value(cls, value):
        if isinstance(value, bool):
            return "yes" if value else "no"

        return super().format_value(value)

handler = logging.StreamHandler()
handler.setFormatter(CustomLogfmter())

logging.basicConfig(handlers=[handler])

logging.error({"example": True}) # at=ERROR example=yes

Guides

Default Key/Value Pairs

Instead of providing key/value pairs at each log call, you can override the log record factory to provide defaults:

import logging

_record_factory = logging.getLogRecordFactory()

def record_factory(*args, **kwargs):
    record = _record_factory(*args, **kwargs)
    record.trace_id = 123
    return record

logging.setLogRecordFactory(record_factory)

This will cause every log line to include the trace_id=123 pair, without adding trace_id to keys or manually passing it via the extra parameter or the msg object on each call.
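A stdlib-only sketch showing that the factory attaches the attribute to every record. Here the output is rendered with a plain %-style Formatter rather than Logfmter so the example runs without the package installed; with Logfmter, the same attribute would simply appear as trace_id=123 in the logfmt line:

```python
import io
import logging

# Preserve the original factory so we can delegate record creation to it.
_record_factory = logging.getLogRecordFactory()

def record_factory(*args, **kwargs):
    record = _record_factory(*args, **kwargs)
    record.trace_id = 123  # default pair attached to every record
    return record

logging.setLogRecordFactory(record_factory)

# Capture output in a string buffer to demonstrate the attribute is present.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(levelname)s trace_id=%(trace_id)s %(message)s"))

logger = logging.getLogger("demo")
logger.addHandler(handler)
logger.error("hello")

print(stream.getvalue().strip())  # ERROR trace_id=123 hello
```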

Development

Required Software

Refer to the links provided below to install these development dependencies:

Getting Started

Setup

$ <runtimes.txt xargs -n 1 pyenv install -s
$ direnv allow
$ pip install -r requirements/dev.txt
$ pre-commit install
$ pip install -e .

Tests

Run the test suite against the active python environment.

$ pytest

Run the test suite against the active python environment and watch the codebase for any changes.

$ ptw

Run the test suite against all supported python versions.

$ tox

Publishing

Create

  1. Update the version number in logfmter/__init__.py.

  2. Add an entry in HISTORY.md.

  3. Commit the changes, tag the commit, and push the tags:

    $ git commit -am "v<major>.<minor>.<patch>"
    $ git tag v<major>.<minor>.<patch>
    $ git push origin main --tags
    
  4. Convert the tag to a release in GitHub with the history entry as the description.

Build

$ python -m build

Upload

$ twine upload dist/*
