
LoggingHandler to move logs into gzipped archives in a smart way

Project description

CompressedLogger


This is a logging handler to be used with the standard Python logging module. The handler writes current logs to a log file and automatically moves old log files into gzipped archives. You can set how many uncompressed log files you want to keep and what timespan each of them should cover. For example, you can set each log file to cover three hours and keep four uncompressed log files. When the limit of uncompressed log files is reached, the oldest uncompressed file is moved into a zipped archive file for that day; this way, all log entries for a single day are combined into one zipped file. You can also set a maximum number of days to keep the archive files as well as an overall size limit for the archives. If either limit is reached, the oldest archives are deleted.

Behaviour:

Using the following configuration:

.. code-block:: python

    compressed_handler = compressedlogger.CompressedLogger(log_path="logs/",
                                                           filename="mylog",
                                                           header="- - version: 1.2.34 - - -",
                                                           live_log_minutes=300,
                                                           live_log_count=3,
                                                           max_archive_size_mb=3,
                                                           archive_days=2)

This will rotate the live log every 300 minutes. When the application is started on 21.09.2020 at 10:33, the first live log file will be named mylog-10_33.log. The first rotation will not happen at 15:33 but at 15:00, because rotation timestamps are calculated from 00:00 and not from the start of the application. The next live log will therefore be named mylog-15_00.log, so there will be the log files mylog-10_33.log, mylog-15_00.log and mylog-20_00.log. Because the live logs also rotate on day change, the next rotation will happen at 00:00 and not at 01:00. Once there are more live logs than allowed by the live_log_count configuration, the oldest log will be moved into a compressed archive for that day. The header is written to the top of every archived log. In this example, there will be the log archive mylog2020-09-21.log.gz, which contains one logfile mylog2020-09-21.log with the content of [header + mylog-10_33.log + mylog-15_00.log + mylog-20_00.log].
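
The rotation boundary is therefore the next multiple of live_log_minutes counted from midnight, capped at the day change. A minimal sketch of that calculation (illustration only, not taken from the package source):

.. code-block:: python

    from datetime import datetime, timedelta

    def next_rotation(now: datetime, live_log_minutes: int) -> datetime:
        # next rotation boundary, counted from 00:00 of the current day
        midnight = now.replace(hour=0, minute=0, second=0, microsecond=0)
        minutes_since_midnight = (now - midnight).total_seconds() / 60
        # round up to the next full interval after midnight
        intervals = int(minutes_since_midnight // live_log_minutes) + 1
        candidate = midnight + timedelta(minutes=intervals * live_log_minutes)
        # live logs also rotate on day change, so never cross midnight
        return min(candidate, midnight + timedelta(days=1))

    # started on 2020-09-21 at 10:33 with a 300 minute interval -> 15:00
    print(next_rotation(datetime(2020, 9, 21, 10, 33), 300))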

Usage:

Parameters:

  • log_path: where your logs will be stored

  • filename: name of your logfile

  • live_log_minutes: timespan that is covered by a single live-log

  • live_log_count: maximum number of live logs to keep uncompressed

  • max_archive_size_mb: maximum combined size of archived logs, in megabytes

  • archive_days: maximum number of days to keep log archives

  • header: this header will be written on top of every archived log
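
A short usage sketch, assuming the handler class is importable as compressedlogger.CompressedLogger as in the configuration example above; the formatter string and log message are illustrative:

.. code-block:: python

    import logging

    import compressedlogger

    handler = compressedlogger.CompressedLogger(log_path="logs/",
                                                filename="mylog",
                                                header="- - version: 1.2.34 - - -",
                                                live_log_minutes=300,
                                                live_log_count=3,
                                                max_archive_size_mb=3,
                                                archive_days=2)
    # standard logging formatters are supported (see 0.2.8 in the history)
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

    logger = logging.getLogger("myapp")
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)

    logger.info("application started")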

Credits

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.

History

0.3.4 (2020-10-23)

  • when rotating a log file, read and write old logs chunkwise to lower RAM usage

0.3.3 (2020-10-23)

  • check for log rotation every time a new log is opened

0.3.2 (2020-10-08)

  • timestamps as "10_00" instead of "10:00" for Windows compatibility

0.3.1 (2020-09-21)

  • rework whole behaviour

0.2.9 (2020-09-10)

  • add new parameter 'maximum_days'

  • update readme

0.2.8 (2020-07-28)

  • add missing support of logging formatters

  • fix the naming of logfiles if more than one handler writes into a log path

0.2.5 (2020-07-15)

  • updated project information and readme

0.2.0 (2020-07-15)

  • some improvements concerning the log paths and log file names

0.1.0 (2020-07-14)

  • First release on PyPI.
