
Logging handlers with GELF support




Python logging handlers with GELF (Graylog Extended Log Format) support.

Currently TCP, UDP, TLS (encrypted TCP) and HTTP logging handlers are supported.

Get pygelf

pip install pygelf


from pygelf import GelfTcpHandler, GelfUdpHandler, GelfTlsHandler, GelfHttpHandler
import logging

logger = logging.getLogger()
logger.addHandler(GelfTcpHandler(host='', port=9401))
logger.addHandler(GelfUdpHandler(host='', port=9402))
logger.addHandler(GelfTlsHandler(host='', port=9403))
logger.addHandler(GelfHttpHandler(host='', port=9404))

logging.info('hello gelf')

Message structure

According to the GELF spec, each message has the following mandatory fields:

  • version: ‘1.1’, can be overridden when creating a logger

  • short_message: the log message itself

  • timestamp: current timestamp

  • level: syslog-compliant log level number (e.g. WARNING will be sent as 4)

  • host: hostname of the machine that sent the message

  • full_message: this field contains the stack trace and is written ONLY when logging an exception, e.g.

try:
    1 / 0
except ZeroDivisionError as e:
    logger.exception(e)

In debug mode (when the handler was created with the debug=True option) each message contains some extra fields (which are pretty self-explanatory):

  • _file

  • _line

  • _module

  • _func

  • _logger_name
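These fields map directly onto attributes of the standard LogRecord. As a rough stdlib-only sketch (not pygelf's actual code) of where a handler could take them from:

```python
import logging

# Build a record by hand to show which LogRecord attributes the
# debug fields correspond to (values here are illustrative).
record = logging.LogRecord(
    name='my-app', level=logging.WARNING, pathname='app.py',
    lineno=42, msg='payment failed', args=(), exc_info=None,
)

debug_fields = {
    '_file': record.filename,          # 'app.py'
    '_line': record.lineno,            # 42
    '_module': record.module,          # 'app'
    '_func': record.funcName,
    '_logger_name': record.name,       # 'my-app'
}
print(debug_fields)
```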


Each handler has the following parameters:

  • host: IP address of the GELF input

  • port: port of the GELF input

  • debug (False by default): if true, each log message will include debugging info: module name, file name, line number, method name

  • version (‘1.1’ by default): GELF protocol version, can be overridden

  • include_extra_fields (False by default): if true, each log message will include all the extra fields set to LogRecord

  • json_default (datetime-aware str fallback by default): function that is called for objects that cannot be natively serialized to JSON by Python. The default implementation returns the result of isoformat() for datetime.datetime, datetime.date and datetime.time objects, and the result of str(obj) for all other objects (the string representation of the object, with a fallback to repr)
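The default behaviour described above can be sketched as a plain json.dumps fallback (a stdlib-only illustration, not pygelf's actual implementation):

```python
import datetime
import json

def json_default(obj):
    # datetimes become ISO-8601 strings, everything else falls back to str()
    if isinstance(obj, (datetime.datetime, datetime.date, datetime.time)):
        return obj.isoformat()
    return str(obj)

encoded = json.dumps(
    {'ts': datetime.datetime(2020, 1, 1, 12, 0), 'level': 4},
    default=json_default,
)
print(encoded)  # {"ts": "2020-01-01T12:00:00", "level": 4}
```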

Also, there are some handler-specific parameters.


GelfUdpHandler:

  • chunk_size (1300 by default) - maximum length of a single datagram. If the message is longer, it is split into multiple chunks (see the “chunked GELF” section of the spec), each of at most this length. This value must be less than the MTU; if logs don’t seem to be delivered, try reducing it.

  • compress (True by default) - if true, compress log messages before sending them to the server
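The chunking scheme itself comes from the GELF spec: each chunk carries a 12-byte header (2 magic bytes, an 8-byte message id, a sequence index and a total count). A stdlib-only sketch of the splitting, not pygelf's actual code:

```python
import os
import struct

MAGIC = b'\x1e\x0f'  # chunked-GELF magic bytes from the spec

def chunk(payload, chunk_size=1300):
    # split the payload into chunk_size-byte pieces
    pieces = [payload[i:i + chunk_size] for i in range(0, len(payload), chunk_size)]
    message_id = os.urandom(8)  # shared by all chunks of one message
    total = len(pieces)
    return [MAGIC + message_id + struct.pack('BB', index, total) + piece
            for index, piece in enumerate(pieces)]

chunks = chunk(b'x' * 3000)
print(len(chunks))     # 3 chunks for 3000 bytes at 1300 per chunk
print(len(chunks[0]))  # 1312: 12-byte header + 1300-byte body
```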


GelfTlsHandler:

  • validate (False by default) - if true, validate the server certificate. If the server presents a certificate that cannot be verified against ca_certs, you won’t be able to send logs over TLS

  • ca_certs (None by default) - path to a CA bundle file. This parameter is required if validate is true.

  • certfile (None by default) - path to the certificate file used to identify ourselves to the remote endpoint. This is necessary when the remote server requires client authentication. If certfile also contains the private key, the key should be placed before the certificate.

  • keyfile (None by default) - path to the private key. If the private key is stored in certfile, this parameter can be None.
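In stdlib terms, these four parameters roughly correspond to how an ssl.SSLContext would be configured. The following is an illustrative sketch (the function name and exact behaviour are assumptions, not pygelf's code):

```python
import ssl

def make_context(validate=False, ca_certs=None, certfile=None, keyfile=None):
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    if validate:
        if ca_certs is None:
            raise ValueError('ca_certs is required when validate=True')
        # verify the server certificate against the given CA bundle
        context.load_verify_locations(ca_certs)
    else:
        # no server verification at all
        context.check_hostname = False
        context.verify_mode = ssl.CERT_NONE
    if certfile:
        # client authentication; keyfile may be None if the
        # private key is stored inside certfile
        context.load_cert_chain(certfile, keyfile)
    return context
```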


GelfHttpHandler:

  • compress (True by default) - if true, compress log messages before sending them to the server

  • path (‘/gelf’ by default) - path of the HTTP input

  • timeout (5 by default) - number of seconds the HTTP client waits for a response before discarding the request
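What the HTTP handler sends is a JSON GELF payload POSTed to http://host:port/path. A simplified stdlib sketch (build_gelf_request is a hypothetical helper; compression is omitted):

```python
import json
import urllib.request

def build_gelf_request(host, port, path='/gelf', **fields):
    # assemble a minimal GELF JSON body and wrap it in a POST request
    body = json.dumps({'version': '1.1', **fields}).encode()
    return urllib.request.Request(
        'http://{0}:{1}{2}'.format(host, port, path),
        data=body,
        headers={'Content-Type': 'application/json'},
    )

req = build_gelf_request('127.0.0.1', 9404, short_message='hello gelf')
# urllib.request.urlopen(req, timeout=5) would send it; the timeout
# argument mirrors the handler's timeout parameter.
```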

Static fields

If you need to include some static fields in your logs, simply pass them to the handler constructor. Each additional field name must start with an underscore. You can’t add a field named ‘_id’.


handler = GelfUdpHandler(host='', port=9402, _app_name='pygelf', _something=11)

Dynamic fields

If you need to include some dynamic fields in your logs, add them to the record using a LoggerAdapter or a logging.Filter, and create the handler with include_extra_fields set to True. All non-trivial fields of the record will be sent to graylog2 with ‘_’ prepended to the field name.


import logging
import threading

# a module-level thread-local that other code is expected to populate
local = threading.local()

class ContextFilter(logging.Filter):

    def filter(self, record):
        record.job_id = local.process_id
        return True

logger = logging.getLogger()
logger.addFilter(ContextFilter())
handler = GelfUdpHandler(host='', port=9402, include_extra_fields=True)
logger.addHandler(handler)

Defining fields from environment

If you need to include some fields from the environment into your logs, add them to record by using additional_env_fields.

The following example will add an env field to the logs, taking its value from the environment variable FLASK_ENV.

handler = GelfTcpHandler(host='', port=9402, include_extra_fields=True, additional_env_fields={'env': 'FLASK_ENV'})

The same fields can also be used when defining logging from configuration files (yaml/ini):

args = ('', '12201')
kwargs = {'include_extra_fields': True, 'debug': True, 'additional_env_fields': {'env': 'FLASK_ENV'}}
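For example, a hypothetical fileConfig-style ini section (the section name and class line are assumptions; only args and kwargs come from the snippet above):

```ini
; illustrative sketch - section and option names follow logging.config.fileConfig conventions
[handler_gelf]
class = pygelf.GelfTcpHandler
args = ('', '12201')
kwargs = {'include_extra_fields': True, 'debug': True, 'additional_env_fields': {'env': 'FLASK_ENV'}}
```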



Running tests

To run tests, you’ll need tox. After installing it, simply run:

tox


