

pygelf


Python logging handlers with GELF (Graylog Extended Log Format) support.

Currently TCP, UDP, TLS (encrypted TCP) and HTTP logging handlers are supported.

Get pygelf

pip install pygelf

Usage

from pygelf import GelfTcpHandler, GelfUdpHandler, GelfTlsHandler, GelfHttpHandler
import logging


logging.basicConfig(level=logging.INFO)
logger = logging.getLogger()
logger.addHandler(GelfTcpHandler(host='127.0.0.1', port=9401))
logger.addHandler(GelfUdpHandler(host='127.0.0.1', port=9402))
logger.addHandler(GelfTlsHandler(host='127.0.0.1', port=9403))
logger.addHandler(GelfHttpHandler(host='127.0.0.1', port=9404))

logger.info('hello gelf')

Message structure

According to the GELF spec, each message has the following mandatory fields:

  • version: ‘1.1’ by default; can be overridden when creating a handler

  • short_message: the log message itself

  • timestamp: current timestamp

  • level: syslog-compliant log level number (e.g. WARNING will be sent as 4)

  • host: hostname of the machine that sent the message

  • full_message: this field contains the stack trace and is written ONLY when logging an exception, e.g.

try:
    1/0
except ZeroDivisionError as e:
    logger.exception(e)

In debug mode (when the handler is created with the debug=True option), each message contains the following extra fields, whose names are self-explanatory (see the example after this list):

  • _file

  • _line

  • _module

  • _func

  • _logger_name
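
For example, a handler created with debug mode enabled might look like this (host and port are illustrative):

from pygelf import GelfTcpHandler
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger()

# debug=True adds _file, _line, _module, _func and _logger_name to every message
logger.addHandler(GelfTcpHandler(host='127.0.0.1', port=9401, debug=True))
logger.info('hello gelf, now with debug fields')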

Configuration

Each handler has the following parameters:

  • host: IP address of the GELF input

  • port: port of the GELF input

  • debug (False by default): if true, each log message will include debugging info: module name, file name, line number, method name

  • version (‘1.1’ by default): GELF protocol version, can be overridden

  • include_extra_fields (False by default): if true, each log message will include all the extra fields set on the LogRecord

  • json_default (custom serializer by default): function that is called for objects that cannot be serialized to JSON natively by Python. The default implementation returns the result of isoformat() for datetime.datetime, datetime.time and datetime.date objects, and str(obj) for all other objects. A custom function can be passed instead (see the sketch after this list).
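
A rough sketch of passing a custom json_default serializer; the UUID handling below is purely illustrative:

import uuid
from pygelf import GelfTcpHandler

def my_json_default(obj):
    # called only for objects the json module cannot serialize natively
    if isinstance(obj, uuid.UUID):
        return str(obj)
    return repr(obj)

handler = GelfTcpHandler(host='127.0.0.1', port=9401, json_default=my_json_default)
logger.addHandler(handler)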

Also, there are some handler-specific parameters.

UDP:

  • chunk_size (1300 by default) - maximum length of an unchunked message. If the log exceeds this value, it is split into multiple chunks (see https://www.graylog.org/resources/gelf/ section “chunked GELF”), each no longer than this value. This parameter must be less than the MTU. If the logs don’t seem to be delivered, try reducing this value (see the example after this list).

  • compress (True by default) - if true, compress log messages before sending them to the server
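
For instance, a UDP handler tuned for a smaller MTU and with compression disabled (values are illustrative):

handler = GelfUdpHandler(host='127.0.0.1', port=9402, chunk_size=1000, compress=False)
logger.addHandler(handler)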

TLS:

  • validate (False by default) - if true, validate the server certificate. If the server presents a certificate that cannot be verified against ca_certs, logs will not be sent over TLS

  • ca_certs (None by default) - path to CA bundle file. This parameter is required if validate is true.

  • certfile (None by default) - path to the certificate file that will be used to identify ourselves to the remote endpoint. This is necessary when the remote server requires client authentication. If certfile contains the private key, it should be placed before the certificate (see the sketch after this list).

  • keyfile (None by default) - path to the private key. If the private key is stored in certfile this parameter can be None.
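
A minimal sketch of a TLS handler with certificate validation and client authentication (all paths are placeholders):

handler = GelfTlsHandler(
    host='127.0.0.1',
    port=9403,
    validate=True,
    ca_certs='/path/to/ca-bundle.pem',    # required because validate=True
    certfile='/path/to/client-cert.pem',  # only needed if the server requires client auth
    keyfile='/path/to/client-key.pem',    # may be omitted if the key is stored in certfile
)
logger.addHandler(handler)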

HTTP:

  • compress (True by default) - if true, compress log messages before sending them to the server

  • path (‘/gelf’ by default) - path of the HTTP input (http://docs.graylog.org/en/latest/pages/sending_data.html#gelf-via-http)

  • timeout (5 by default) - number of seconds the HTTP client waits for the server to respond before discarding the request (see the example after this list)
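
For example, an HTTP handler with an explicit input path and a longer timeout (values are illustrative):

handler = GelfHttpHandler(host='127.0.0.1', port=9404, path='/gelf', timeout=10, compress=True)
logger.addHandler(handler)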

Static fields

If you need to include some static fields in your logs, simply pass them to the handler constructor. Each additional field name must start with an underscore. The field ‘_id’ cannot be added.

Example:

handler = GelfUdpHandler(host='127.0.0.1', port=9402, _app_name='pygelf', _something=11)
logger.addHandler(handler)

Dynamic fields

If you need to include some dynamic fields in your logs, add them to the record by using a LoggerAdapter or a logging.Filter, and create the handler with include_extra_fields set to True. Every field of the record that is not a standard LogRecord attribute will be sent to Graylog with ‘_’ prepended to its name.

Example:

import threading

# thread-local storage, filled in elsewhere by the application (e.g. _context.process_id = ...)
_context = threading.local()

class ContextFilter(logging.Filter):

    def filter(self, record):
        # every extra attribute on the record is sent as a '_'-prefixed field (here: _job_id)
        record.job_id = getattr(_context, 'process_id', None)
        return True

logger.addFilter(ContextFilter())
handler = GelfUdpHandler(host='127.0.0.1', port=9402, include_extra_fields=True)
logger.addHandler(handler)

Defining fields from environment

If you need to include some fields from the environment in your logs, add them to the record by using the additional_env_fields parameter.

The following example will add an env field to the logs, taking its value from the environment variable FLASK_ENV.

handler = GelfTcpHandler(host='127.0.0.1', port=9402, include_extra_fields=True, additional_env_fields={'env': 'FLASK_ENV'})
logger.addHandler(handler)

Handlers can also be defined in logging configuration files (yaml/ini). The following ini example configures a TCP handler; a snippet showing how to load it follows the example:

[formatters]
keys=standard

[formatter_standard]
class=logging.Formatter
format=%(message)s

[handlers]
keys=graylog

[handler_graylog]
class=pygelf.GelfTcpHandler
formatter=standard
args=('127.0.0.1', '12201')
kwargs={'include_extra_fields': True, 'debug': True, 'additional_env_fields': {'env': 'FLASK_ENV'}}

[loggers]
keys=root

[logger_root]
level=WARN
handlers=graylog
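
Assuming the configuration above is saved as logging.ini (the file name is illustrative), it can be loaded with the standard library's logging.config.fileConfig. Note that the kwargs entry in handler sections requires Python 3.7 or newer:

import logging.config

# reads the ini file and creates the pygelf handler with the args/kwargs defined there
logging.config.fileConfig('logging.ini')
logging.getLogger().warning('configured from file')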

Running tests

To run the tests, you’ll need tox. After installing it, simply run:

tox

