
mongo database handler for python logging

Project Description

log4mongo-python is a MongoDB database handler for Python logging, part of the log4mongo project. log4mongo-python uses the pymongo driver.

Requirements:
  • python 2.7+
  • pymongo 2.8+
  • mongo database

For more information see the debian_requirements.txt and requirements.txt files.
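
The handler is published on PyPI as the log4mongo package, so in a typical environment it can be installed with pip (assuming pip is set up for the interpreter you use):

$ pip install log4mongo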


Example handler Python configuration:

import logging
from log4mongo.handlers import MongoHandler

handler = MongoHandler(host='localhost')
logger = logging.getLogger('test')
logger.addHandler(handler)
logger.warning('test')
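
To verify that records actually reach MongoDB, you can read them back with the pymongo driver. This is a minimal sketch assuming the handler's default database and collection names (both 'logs'); if you passed database_name or collection to MongoHandler, adjust the names accordingly.

from pymongo import MongoClient

# Assumes the logging example above has already written at least one record.
client = MongoClient('localhost')
document = client['logs']['logs'].find_one()
print(document)  # typically contains fields such as 'message', 'level' and 'loggerName'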

Contextual information

It is possible to decorate your documents with contextual information. There are two approaches.

First approach:

import logging
from log4mongo.handlers import MongoHandler

handler = MongoHandler(host='localhost')
logger = logging.getLogger('test')
logger.addHandler(handler)
logging.LoggerAdapter(logger, {'ip': ''}).info('test')

Second approach:

import logging
from log4mongo.handlers import MongoHandler

handler = MongoHandler(host='localhost')
logger = logging.getLogger('test')
logger.addHandler(handler)
logger.info('test', extra={'ip': ''})

As you can see, the second approach is more straightforward, and there is no need to use LoggerAdapter.
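
Both approaches attach the extra fields to the log record, and the handler's formatter stores those extra record attributes in the document. If the same contextual fields should accompany every message, a standard logging.Filter can set them once instead of passing extra= on every call. ContextFilter below is a hypothetical helper, not part of log4mongo, and the sketch assumes that attributes attached by a filter are stored the same way as those passed via extra=:

import logging
from log4mongo.handlers import MongoHandler

class ContextFilter(logging.Filter):
    """Hypothetical helper: attach the same contextual fields to every record."""

    def __init__(self, **context):
        logging.Filter.__init__(self)
        self.context = context

    def filter(self, record):
        # Copy the contextual fields onto the record before handlers see it.
        for key, value in self.context.items():
            setattr(record, key, value)
        return True

handler = MongoHandler(host='localhost')
logger = logging.getLogger('test')
logger.addHandler(handler)
logger.addFilter(ContextFilter(ip='127.0.0.1', service='billing'))
logger.warning('test')  # 'ip' and 'service' should now appear in the stored document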

Capped collections

Capped collections are fixed-size collections that support high-throughput operations that insert, retrieve, and delete documents based on insertion order. Capped collections work in a way similar to circular buffers: once a collection fills its allocated space, it makes room for new documents by overwriting the oldest documents in the collection.

Before switching to capped collections, please read the MongoDB documentation on capped collections first.

This behaviour is disabled by default. You can enable it in the constructor with capped=True:

import logging
from log4mongo.handlers import MongoHandler

handler = MongoHandler(host='localhost', capped=True)
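
Whether the collection actually ended up capped can be checked directly with pymongo. A minimal sketch, assuming the handler's default 'logs' database and 'logs' collection names:

from pymongo import MongoClient

client = MongoClient('localhost')
options = client['logs']['logs'].options()  # options used when the collection was created
print(options.get('capped', False))         # True for a capped collection
print(options.get('size'))                  # allocated size in bytes, if capped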

Buffered handler

BufferedMongoHandler is a subclass of MongoHandler that buffers log messages and writes them to the database in batches. The goal is to avoid too many writes to the database, and thus too frequent write locks. The buffer is flushed when it is full, when a message at or above a given level (logging.CRITICAL by default) is emitted, and periodically: to prevent messages from sitting in the buffer indefinitely before appearing in the database, a periodical flush happens every X seconds.

This periodical flush can also be deactivated with buffer_periodical_flush_timing=False, which also avoids creating the timer thread.

Buffer size is configurable, as well as the log level for early flush (default is logging.CRITICAL):

import logging
from log4mongo.handlers import BufferedMongoHandler

handler = BufferedMongoHandler(host='localhost',                          # All MongoHandler parameters are valid
                               buffer_size=100,                           # buffer size.
                               buffer_periodical_flush_timing=10.0,       # periodical flush every 10 seconds
                               buffer_early_flush_level=logging.CRITICAL) # early flush level

logger = logging.getLogger('test')
logger.addHandler(handler)
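
Continuing the example above, a rough sketch of how the buffer behaves (the message counts are only illustrative):

logger.setLevel(logging.INFO)

for i in range(50):
    logger.info('message %d', i)  # held in the in-memory buffer (buffer_size=100)

logger.critical('boom')           # at buffer_early_flush_level, so the buffer is flushed now

logging.shutdown()                # standard library cleanup that closes all handlers;
                                  # any still-buffered records are expected to be written here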


Tested environment

  • Ubuntu 14.04
  • python >2.7.4
  • pymongo >2.8.3
  • mongod - db version v3.0.2
  • pytest

Running tests

Before you run the tests, you must start the MongoDB server. You can do so with this command:

$ mongod --dbpath /tmp/

To run the tests, run:

$ python setup.py test

See the Vagrantfile to quickly set up the test environment.

Original Author

Current Maintainer

Oz Nahum Tiram

