
Context Logger

  1. Description
  2. Requirements
  3. Installation
  4. Usage
  5. Tutorial
  6. Version Updates
  7. Contributions
  8. License
  9. Authors

Description

A simple logger that uses the contextvars library to inject contextual details in your logs.

Source Code: https://github.com/kolitiri/contextlogger

Requirements

Python 3.7+

Installation

pip install contextlogger

Usage

This is a bare-minimum example to get you started. (Check the Tutorial section below for some more realistic scenarios.)

""" my_app.py """
import asyncio
import logging
from uuid import uuid4

from contextlogger import CLogVars, CLogVar, getCLogger


# Create and configure a CLogger instance
clogger = getCLogger(__name__)
console_handler = logging.StreamHandler()
clogger.addHandler(console_handler)
clogger.setLevel('DEBUG')

# Create a CLogVars container with static or dynamic context variables
clogger.clogvars = CLogVars(
    static=CLogVar(name='static'),
    request_id=CLogVar(name='request_id', setter=lambda: str(uuid4())),
)

async def my_func():
    # Set the value of your static variable
    clogger.setvar('static', value=1)

    # Set the value of your dynamic variable
    clogger.setvar('request_id')

    clogger.info("Hello World!")


async def main():
    await asyncio.gather(my_func())

if __name__ == '__main__':
    asyncio.run(main())

Output

{'static': 1, 'request_id': '7e643fe2-bc7a-498c-a0fe-66ae58c671da'} Hello World!

Tutorial

This should be as simple as it gets!

Let's assume that we have a package with the following structure:

my_app/
│
├── my_app/
│   ├── __init__.py
│   ├── runner.py
├── main.py

where your main.py simply runs the two tasks defined in your runner.py module:

""" main.py """
import asyncio

from my_app.runner import task1, task2


async def main():
	await asyncio.gather(
		task1(),
		task2(),
	)

asyncio.run(main())

""" runner.py """
async def task1():
	pass

async def task2():
	pass

Logging configuration

In the __init__.py module of your project, set up your logger. This will look very similar to the way you would normally configure a regular logger from the standard library.

""" __init__.py """
import logging
from logging.handlers import TimedRotatingFileHandler
import os

import contextlogger

# Create a CLogger instance
clogger = contextlogger.getCLogger(__name__)

# Set the logging level
clogger.setLevel('DEBUG')

# Create a logging formatter
logging_format = "%(asctime)s %(levelname)s %(name)s %(message)s"
formatter = logging.Formatter(logging_format)

# Create handlers for console logger
console_handler = logging.StreamHandler()
console_handler.setFormatter(formatter)
clogger.addHandler(console_handler)

# Create handlers for file logger
LOG_DIR = 'logs'
APP = 'MY-APP'
if not os.path.exists(LOG_DIR):
	os.makedirs(LOG_DIR)

file_handler = TimedRotatingFileHandler(f"{LOG_DIR}/{APP}.log", when="midnight", interval=1)
file_handler.setFormatter(formatter)
file_handler.suffix = "%Y%m%d"
clogger.addHandler(file_handler)

So far, the only thing we have done differently is use the getCLogger function from the contextlogger library instead of the getLogger function from the standard logging library.

Logging without context

Once your logger configuration is set, you can use your logger in your runner.py file (or whichever file you choose as your entry point):

""" runner.py """
from my_app import clogger

async def task1():
	clogger.info(f"Hello from {task1.__name__}")

async def task2():
	clogger.info(f"Hello from {task2.__name__}")

As expected, if you run your main.py, the output of the clogger will be:

2020-12-06 18:07:27,008 INFO my_app Hello from task1
2020-12-06 18:07:27,009 INFO my_app Hello from task2

Logging with static context

Now let's add some context to our logging.

We can do that by assigning a CLogVars container (a custom UserDict) of CLogVar (context log variable) instances to our logger.

Let's add a 'static' variable... Not very useful, but why not!

""" runner.py """
from contextlogger import CLogVars, CLogVar
from my_app import clogger

clogger.clogvars = CLogVars(
	static=CLogVar(name='static'),
)

async def task1():
	# Set our 'static' value for task1
	clogger.setvar('static', value=1)

	clogger.info(f"Hello from {task1.__name__}")

async def task2():
	# Set our 'static' value for task2
	clogger.setvar('static', value=2)

	clogger.info(f"Hello from {task2.__name__}")

And voila! Now the output of the clogger includes our static values:

2020-12-06 18:12:07,505 INFO my_app {'static': 1} Hello from task1
2020-12-06 18:12:07,505 INFO my_app {'static': 2} Hello from task2

Logging with dynamic context

OK, but that was not very handy, right? Let's do something more realistic. Let's pretend this is our FastAPI application and we want to attach a 'request_id' to every request we receive.

Now things get interesting! A CLogVar can also accept a 'setter' argument: a function that generates a fresh value (here, a new UUID) every time we enter a new context. Whenever we call setvar without a value, the logger falls back to the setter to produce one.

""" runner.py """
from uuid import uuid4

from contextlogger import CLogVars, CLogVar
from my_app import clogger

clogger.clogvars = CLogVars(
	static=CLogVar(name='static'),
	request_id=CLogVar(name='request_id', setter=lambda: str(uuid4())),
)

async def task1():
	# Set our 'static' value for task1
	clogger.setvar('static', value=1)

	# Set our request_id value for task1
	clogger.setvar('request_id')

	clogger.info(f"Hello from {task1.__name__}")

async def task2():
	# Set our 'static' value for task2
	clogger.setvar('static', value=2)

	# Set our 'request_id' value for task2
	clogger.setvar('request_id')

	clogger.info(f"Hello from {task2.__name__}")

And here we are, with something a lot more useful than just a static value:

2020-12-06 18:21:17,626 INFO my_app {'request_id': 'd3961bd9-f701-4222-ad32-f204e9eb968a', 'static': 1} Hello from task1
2020-12-06 18:21:17,626 INFO my_app {'request_id': '6d4cdab2-e24b-481b-b54b-12c6ee9bcc1b', 'static': 2} Hello from task2
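
In a real FastAPI service you would typically set the 'request_id' once per incoming request rather than at the top of each task. Below is a hypothetical sketch (not part of the package): it assumes the clogger from my_app is configured with the 'request_id' CLogVar shown above, and the middleware and route names are made up for the illustration. Whether context set in a middleware is visible downstream depends on the server and middleware implementation, so treat it as a starting point rather than a recipe.

""" A hypothetical FastAPI sketch (illustration only) """
from fastapi import FastAPI, Request

from my_app import clogger

app = FastAPI()

@app.middleware("http")
async def add_request_id(request: Request, call_next):
	# Generate a fresh request_id for this request's context
	clogger.setvar('request_id')
	clogger.info(f"Handling {request.url.path}")
	return await call_next(request)

@app.get("/hello")
async def hello():
	# Log lines emitted while handling this request share the request_id
	clogger.info("Hello from the /hello endpoint")
	return {"message": "Hello World!"}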

Logging with context in multiple modules

Finally, let's add an extra module just for the sake of it.

Now our directory will look like:

my_app/
│
├── my_app/
│   ├── __init__.py
│   ├── runner.py
│   ├── another_module.py
├── main.py

where our another_module.py simply imports the logger and uses it in a function:

from my_app import clogger


def test():
	clogger.info("Hello from another_module")

Now, if we call our test function inside task1 of our runner.py:

""" runner.py """
from uuid import uuid4

from contextlogger import CLogVars, CLogVar
from my_app import clogger
from my_app.another_module import test

clogger.clogvars = CLogVars(
	static=CLogVar(name='static'),
	request_id=CLogVar(name='request_id', setter=lambda: str(uuid4())),
)

async def task1():
	# Set our 'static' value for task1
	clogger.setvar('static', value=1)

	# Set our request_id value for task1
	clogger.setvar('request_id')

	clogger.info(f"Hello from {task1.__name__}")

	test()

async def task2():
	# Set our 'static' value for task2
	clogger.setvar('static', value=2)

	# Set our 'request_id' value for task2
	clogger.setvar('request_id')

	clogger.info(f"Hello from {task2.__name__}")

we should see that the log line printed inside our another_module.py has the same request_id and static values as the log line printed in task1. This is expected, since both calls run in the same context.

2020-12-06 18:33:36,634 INFO my_app {'request_id': '1eff5e40-4b05-4cd1-bd9c-edbee85d2995', 'static': 1} Hello from task1
2020-12-06 18:33:36,635 INFO my_app {'request_id': '1eff5e40-4b05-4cd1-bd9c-edbee85d2995', 'static': 1} Hello from another_module
2020-12-06 18:33:36,635 INFO my_app {'request_id': 'ec68779f-46f6-4ea0-a003-9ddb053887e1', 'static': 2} Hello from task2
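
The isolation between task1 and task2 comes from asyncio itself: each task created by asyncio.gather runs in its own copy of the current contextvars context, so values set inside one task never leak into another. A minimal standard-library sketch (no contextlogger involved) shows the same behaviour:

""" A standalone contextvars sketch (illustration only) """
import asyncio
import contextvars

request_id = contextvars.ContextVar('request_id', default=None)

async def task(name, value):
	# Setting the variable only affects this task's copy of the context
	request_id.set(value)
	await asyncio.sleep(0)  # yield so the tasks interleave
	print(name, request_id.get())

async def main():
	await asyncio.gather(task('task1', 'A'), task('task2', 'B'))

asyncio.run(main())
# task1 A
# task2 B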

Structured logging

If you prefer structured logging, you can create the clogger instance with the 'structured' argument, which causes the message and context variables to be printed as key/value pairs.

Then, just change your formatter accordingly.

""" __init__.py """
import logging
from logging.handlers import TimedRotatingFileHandler
import os

import contextlogger

# Create a CLogger instance
clogger = contextlogger.getCLogger(__name__, structured=True)

# Create a logging formatter
logging_format = "{'time': '%(asctime)s', 'level': '%(levelname)s', 'name': '%(name)s', %(message)s}"
formatter = logging.Formatter(logging_format)

# Create handlers for console logger
console_handler = logging.StreamHandler()
console_handler.setFormatter(formatter)
clogger.addHandler(console_handler)

# Create handlers for file logger
LOG_DIR = 'logs'
APP = 'MY-APP'
if not os.path.exists(LOG_DIR):
	os.makedirs(LOG_DIR)

file_handler = TimedRotatingFileHandler(f"{LOG_DIR}/{APP}.log", when="midnight", interval=1)
file_handler.setFormatter(formatter)
file_handler.suffix = "%Y%m%d"
clogger.addHandler(file_handler)

The output will now look like this:

{'time': '2020-12-11 15:52:11,487', 'level': 'INFO', 'name': 'my_app', 'msg': 'Hello from task1', 'static': '1', 'request_id': 'cc75cb8f-f267-4406-b49c-fc2196a686c6'}
{'time': '2020-12-11 15:52:11,487', 'level': 'INFO', 'name': 'my_app', 'msg': 'Hello from another_module', 'static': '1', 'request_id': 'cc75cb8f-f267-4406-b49c-fc2196a686c6'}
{'time': '2020-12-11 15:52:11,487', 'level': 'INFO', 'name': 'my_app', 'msg': 'Hello from task2', 'static': '2', 'request_id': '7117cdb4-a0dd-4e12-89a5-756a03d7f8b1'}
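
Since the format string above quotes values Python-style rather than as strict JSON, one quick way to consume such a line programmatically is ast.literal_eval. This is only an illustration of reading the output shown above, not a feature of the package; if you need strict JSON, adjust the formatter and use json.loads instead.

""" Parsing a structured log line (illustration only) """
import ast

line = "{'time': '2020-12-11 15:52:11,487', 'level': 'INFO', 'name': 'my_app', 'msg': 'Hello from task1', 'static': '1', 'request_id': 'cc75cb8f-f267-4406-b49c-fc2196a686c6'}"
record = ast.literal_eval(line)
print(record['request_id'], record['msg'])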

Version Updates

1.0.0: Introduces breaking changes. CLogger.clogvars has been converted from a List to a custom UserDict to keep the get/set functionality consistent.

Contributions

If you want to contribute to the package, please have a look at the CONTRIBUTING.md file for some basic instructions. Feel free to reach me by email or on Twitter; you can find both in my GitHub profile!

License

This project is licensed under the terms of the MIT license.

Authors

Christos Liontos
