A Python library adding a JSON log formatter
Project description
Overview
This library lets standard Python logging output log data as JSON objects. With JSON, logs become machine-readable, and there is no need to write custom parsers for syslog-style records.
News
Hi, I see this package is quite alive and I am sorry for ignoring it for so long. I will be stepping up my maintenance of this package, so please allow me a week to get things back in order (and most likely release a new minor version); I'll post an update here once I am caught up.
Installing
Pip:
pip install python-json-logger
PyPI:
https://pypi.python.org/pypi/python-json-logger
Manual:
python setup.py install
Usage
Integrating with Python's logging framework
JSON output is provided by the JsonFormatter logging formatter. You can add the custom formatter as shown below:
Please note: version 0.1.0 changed the import structure; please follow the example below for proper importing.
import logging
from pythonjsonlogger import jsonlogger
logger = logging.getLogger()
logHandler = logging.StreamHandler()
formatter = jsonlogger.JsonFormatter()
logHandler.setFormatter(formatter)
logger.addHandler(logHandler)
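With the handler attached as above, an ordinary logging call emits one JSON object per line. The output in the comment below is illustrative; by default only the message field is included.

logger.setLevel(logging.INFO)
logger.info("hello from the json logger")
# emits a single JSON line, e.g.:
# {"message": "hello from the json logger"}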
Customizing fields
The fmt parser can also be overridden if you want required fields that differ from the default of just message.
These two invocations are equivalent:
class CustomJsonFormatter(jsonlogger.JsonFormatter):
    def parse(self):
        return self._fmt.split(';')

formatter = CustomJsonFormatter('one;two')

# is equivalent to:
formatter = jsonlogger.JsonFormatter('%(one)s %(two)s')
You can also add extra fields to your JSON output by passing a dict in place of the message, or by passing an extra={} argument.
Contents of these dictionaries are added at the root level of the entry and may override basic fields.
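For example (the field names here are only illustrative), both calls below add user_id and action at the root of the JSON entry:

# pass a dict in place of the message
logger.info({"user_id": 42, "action": "login"})

# or pass a normal message plus an extra dict
logger.info("user logged in", extra={"user_id": 42, "action": "login"})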
You can also use the add_fields method to add to or generally normalize the set of default fields; it is called for every log event. For example, to unify default fields with those provided by structlog, you could do something like this:
from datetime import datetime

class CustomJsonFormatter(jsonlogger.JsonFormatter):
    def add_fields(self, log_record, record, message_dict):
        super(CustomJsonFormatter, self).add_fields(log_record, record, message_dict)
        if not log_record.get('timestamp'):
            # this doesn't use record.created, so it is slightly off
            now = datetime.utcnow().strftime('%Y-%m-%dT%H:%M:%S.%fZ')
            log_record['timestamp'] = now
        if log_record.get('level'):
            log_record['level'] = log_record['level'].upper()
        else:
            log_record['level'] = record.levelname

formatter = CustomJsonFormatter('%(timestamp)s %(level)s %(name)s %(message)s')
Items added to the log record will be included in every log message, no matter what the format requires.
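A minimal sketch of wiring this up (the output line is illustrative and assumes the custom formatter above):

handler = logging.StreamHandler()
handler.setFormatter(formatter)
logging.getLogger().addHandler(handler)
logging.getLogger().error("something failed")
# e.g. {"timestamp": "2024-01-01T12:00:00.000000Z", "level": "ERROR", "name": "root", "message": "something failed"}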
Adding custom object serialization
For custom handling of object serialization, you can specify a default JSON object translator or provide a custom encoder:
def json_translate(obj):
    if isinstance(obj, MyClass):
        return {"special": obj.special}

formatter = jsonlogger.JsonFormatter(json_default=json_translate,
                                     json_encoder=json.JSONEncoder)
logHandler.setFormatter(formatter)

logger.info({"special": "value", "run": 12})
logger.info("classic message", extra={"special": "value", "run": 12})
Using a Config File
To use the module with a config file using the fileConfig function, use the class pythonjsonlogger.jsonlogger.JsonFormatter. Here is a sample config file.
[loggers]
keys = root,custom
[logger_root]
handlers =
[logger_custom]
level = INFO
handlers = custom
qualname = custom
[handlers]
keys = custom
[handler_custom]
class = StreamHandler
level = INFO
formatter = json
args = (sys.stdout,)
[formatters]
keys = json
[formatter_json]
format = %(message)s
class = pythonjsonlogger.jsonlogger.JsonFormatter
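Assuming the config above is saved as logging.conf (the filename is an assumption), it can be loaded with the standard library's fileConfig:

import logging
import logging.config

logging.config.fileConfig('logging.conf', disable_existing_loggers=False)
logger = logging.getLogger('custom')
logger.info('configured via fileConfig')  # emitted as a JSON line by the json formatter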
Example Output
Sample JSON with a full formatter (basically the log message from the unit test). Every log message appears on one line, like a typical logger.
{
"threadName": "MainThread",
"name": "root",
"thread": 140735202359648,
"created": 1336281068.506248,
"process": 41937,
"processName": "MainProcess",
"relativeCreated": 9.100914001464844,
"module": "tests",
"funcName": "testFormatKeys",
"levelno": 20,
"msecs": 506.24799728393555,
"pathname": "tests/tests.py",
"lineno": 60,
"asctime": ["12-05-05 22:11:08,506248"],
"message": "testing logging format",
"filename": "tests.py",
"levelname": "INFO",
"special": "value",
"run": 12
}
File details
Details for the file chili-python-json-logger-2.1.0.tar.gz.
File metadata
- Download URL: chili-python-json-logger-2.1.0.tar.gz
- Upload date:
- Size: 9.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.1 importlib_metadata/3.10.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.6.2
File hashes
Algorithm | Hash digest
---|---
SHA256 | 7a36459e6be07a37f1fc056ddcd5a09a7993de2ac78841e4da21e8a46cf4a73f
MD5 | 697bbea65c2328033003a97fc295bf47
BLAKE2b-256 | 6a23ed04d13231678be895a40d49b6267cc3a4e11af24cdd94ffe9b9e14b58a6
File details
Details for the file chili_python_json_logger-2.1.0-py34-none-any.whl.
File metadata
- Download URL: chili_python_json_logger-2.1.0-py34-none-any.whl
- Upload date:
- Size: 7.5 kB
- Tags: Python 3.4
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.1 importlib_metadata/3.10.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.6.2
File hashes
Algorithm | Hash digest
---|---
SHA256 | 45ca30a15ce0d436d29fb7b1718731de63a88712810ad839b6005cd99b5191f2
MD5 | 39e6d3d5d1a2c513428201b188af8634
BLAKE2b-256 | 425ac4578c3fed1caa26099760540a7bdb0ad6a107ea6d2e864a676ca1734874