HTTP log monitoring
Project description
Monilog
This project watches and logs all HTTP traffic on a system, writing logs in the W3C log format, displaying statistics such as total requests, maximum hits, and requests within a timespan, and raising an alert when traffic exceeds a customizable threshold.
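As an illustration of the alerting behaviour described above, traffic can be tracked in a sliding window of recent hits and compared against the threshold. This is only a generic sketch of the idea, not monilog's actual code; the window length and class name are assumptions:

```python
from collections import deque
import time

WINDOW_SECONDS = 120  # length of the sliding window (hypothetical value)


class TrafficAlert:
    """Raise and clear an alert when average traffic over the window crosses a threshold."""

    def __init__(self, threshold):
        self.threshold = threshold  # requests per second that triggers the alert
        self.hits = deque()         # timestamps of recent requests
        self.alerting = False

    def record_hit(self, timestamp=None):
        self.hits.append(timestamp if timestamp is not None else time.time())

    def check(self, now=None):
        now = now if now is not None else time.time()
        # Drop hits that fell out of the window.
        while self.hits and self.hits[0] < now - WINDOW_SECONDS:
            self.hits.popleft()
        average = len(self.hits) / WINDOW_SECONDS
        if average > self.threshold and not self.alerting:
            self.alerting = True
            print(f"High traffic alert: {average:.1f} req/s at {time.ctime(now)}")
        elif average <= self.threshold and self.alerting:
            self.alerting = False
            print(f"Traffic back to normal at {time.ctime(now)}")
```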
Requirements
- Python and preferably Linux.
Usage
- You can execute simulate.sh to run a simulation of how this project works. The simulation setup is customizable; feel free to play with it.
- Make sure to install the package by running:
python setup.py install
or
pip install monilog
- To run the monitoring on your own log file, run:
monitoring --file /path/to/your/file --threshold 10
- To customize the log generation, run:
log_generator --rates 9 11 8 --durations 150 150 150
with rates being the number of requests per second for each step of the simulation and durations being the length of the corresponding simulation steps (an illustrative sketch of such stepped generation follows this list).
- To execute the tests, run:
nosetests --with-coverage --cover-package=monilog
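As a complement to the log_generator options above, here is a purely illustrative sketch of how per-step rates and durations could drive a generator, assuming durations are given in seconds; the function and the emit callback are hypothetical, not part of the package:

```python
import time


def simulate_traffic(rates, durations, emit):
    """Run consecutive steps: rates[i] requests per second for durations[i] seconds."""
    for rate, duration in zip(rates, durations):
        end = time.time() + duration
        while time.time() < end:
            for _ in range(rate):
                emit()        # write one fake log line
            time.sleep(1)     # crude pacing: one batch of requests per second


# Example matching the command above:
# simulate_traffic([9, 11, 8], [150, 150, 150], emit=lambda: print("GET / 200"))
```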
Attention: the monitoring stops when no new logs are written to the log file for MAX_IDLE_TIME, which defaults to 2 minutes. This is added so that the monitoring stops automatically, particularly when running limited-time simulations.
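The idle shutdown can be pictured as a tail-follow loop that gives up once MAX_IDLE_TIME passes without new data. The snippet below is a generic sketch of that pattern, not monilog's implementation:

```python
import time

MAX_IDLE_TIME = 120  # seconds without new log lines before the monitor stops


def follow(path, handle_line):
    """Process new lines appended to `path`, stopping after MAX_IDLE_TIME of inactivity."""
    last_activity = time.time()
    with open(path) as log_file:
        log_file.seek(0, 2)  # start at the end of the file
        while time.time() - last_activity < MAX_IDLE_TIME:
            line = log_file.readline()
            if line:
                last_activity = time.time()
                handle_line(line)
            else:
                time.sleep(0.5)  # wait for new content
```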
Future Improvements
This is a first working solution for HTTP log monitoring. Many improvements could be added:
- Managing threaded access to the log file using cross-platform file locking. The current implementation is tested on Linux and may cause errors on Windows (see the locking sketch after this list).
- Enhancing the display of the log analysis and statistics. For now, the monitoring results are written to standard output and to a log file named simulation-<timestamp>.log. A better setup would be to customize the GUI, using npyscreen for instance.
- Building a live dashboard that consumes the simulation-<timestamp>.log data and maps it to graphs.
- Pushing alerting notifications by email or SMS to the admins/owners of the monitored system.
- Adding more relevant statistics to the website analysis and handling timezone changes.
- Writing extensive unit and integration tests.
- For larger volumes of data requiring high availability in a production setup, a more robust solution would be to index the logs in Elasticsearch and build Kibana dashboards, fed by a stream-processing platform such as Kafka.
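For the first improvement above, a cross-platform lock can be sketched with the standard library alone (fcntl on POSIX, msvcrt on Windows); this is an illustrative approach, not code from the package:

```python
import sys

if sys.platform.startswith("win"):
    import msvcrt

    def lock_file(f):
        # Windows locks byte ranges; locking the first byte stands in for the whole file.
        # msvcrt retries for about 10 seconds, then raises OSError if the lock is held.
        f.seek(0)
        msvcrt.locking(f.fileno(), msvcrt.LK_LOCK, 1)

    def unlock_file(f):
        f.seek(0)
        msvcrt.locking(f.fileno(), msvcrt.LK_UNLCK, 1)
else:
    import fcntl

    def lock_file(f):
        # Block until an exclusive lock on the whole file is acquired.
        fcntl.flock(f.fileno(), fcntl.LOCK_EX)

    def unlock_file(f):
        fcntl.flock(f.fileno(), fcntl.LOCK_UN)
```

Libraries such as portalocker wrap these same platform calls behind a single API.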
Project details
Download files
Source Distribution: monilog-0.1.4.tar.gz
Built Distribution: monilog-0.1.4-py3-none-any.whl
File details
Details for the file monilog-0.1.4.tar.gz.
File metadata
- Download URL: monilog-0.1.4.tar.gz
- Upload date:
- Size: 6.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.4.0 requests-toolbelt/0.9.1 tqdm/4.30.0 CPython/3.7.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2e25f1f1977055c2c9767a2ddc0062cfc11f0e211a8c8a9060bca008a775067f
MD5 | 9c99d6aa2f066d771857e609411b7a14
BLAKE2b-256 | 27510a36192947eac0eb2e146bdff9d951d23c38391e45020ea6aa4ebb006a35
File details
Details for the file monilog-0.1.4-py3-none-any.whl.
File metadata
- Download URL: monilog-0.1.4-py3-none-any.whl
- Upload date:
- Size: 9.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.4.0 requests-toolbelt/0.9.1 tqdm/4.30.0 CPython/3.7.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 3219e6dab625262bb1436cbe14040192a10e9c079c49b39139959eee12fe2ded
MD5 | 08c402afa615f46d9c43e9841c3f15c3
BLAKE2b-256 | ea7d7194f031785053c300d185f13b827ce1b7e95a68265742d93d7e49d34416