
LogLead stands for Log Loader, Enhancer, and Anomaly Detector

Project description

LogLead

LogLead is designed to efficiently benchmark log anomaly detection algorithms and log representations.

Currently, it features nearly 1,000 unique anomaly detection combinations, encompassing 8 public datasets, 11 log representations (enhancers), and 11 classifiers. These resources enable you to benchmark your own data, log representation, or classifier against a diverse range of scenarios. LogLead is an actively evolving project, and we are continually adding new datasets, representations, and classifiers. If there's something you believe should be included, please submit a request for a dataset, enhancer, or classifier in the issue tracker.

A key strength of LogLead is its custom loader system, which efficiently isolates the unique aspects of logs from different systems. This design allows for a reduction in redundant code, as the same enhancement and anomaly detection code can be applied universally once the logs are loaded.
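
As a rough illustration of that reuse, the sketch below defines a routine that works for any system's logs once a loader has normalized them into a shared dataframe schema. The column names ("m_message", "anomaly") are assumptions for illustration, not LogLead's documented schema.

import polars as pl

# Toy stand-in for any loader's output; column names are assumed for illustration.
df = pl.DataFrame({
    "m_message": ["instruction cache parity error corrected", "running job 42"],
    "anomaly":   [True, False],
})

def anomaly_rate(df: pl.DataFrame) -> float:
    # Works unchanged for Thunderbird, HDFS, or any other dataset,
    # because the loader has already normalized the schema.
    return df["anomaly"].mean()

print(anomaly_rate(df))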

Installing LogLead

Simply install with pip:

python -m pip install loglead

NOTE: the pip version does not include the TensorFlow dependencies required for BertEmbeddings. Install them manually (preferably in a conda environment).
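
For example, one possible setup; the extra package name below is an assumption, so check the project README for the exact requirements of BertEmbeddings:

conda create -n loglead python=3.11
conda activate loglead
python -m pip install loglead tensorflow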

Known issues

  • If the scikit-learn wheel fails to compile, check that you have gcc and g++ installed.

Demos

In the following demonstrations, you'll notice a significant aspect of LogLead's design efficiency: code reusability. Both demos, while analyzing different datasets, share a substantial amount of their underlying code. This not only showcases LogLead's versatility in handling various log formats but also its ability to streamline the analysis process through reusable code components.

Thunderbird Supercomputer Log Demo

  • Script: TB_samples.py
  • Description: This demo presents a Thunderbird supercomputer log, labeled at the line (event) level. A “-” in the first column indicates normal behavior, while any other marking represents an anomaly (a labeling sketch follows this list).
  • Log Snapshot: View the log here.
  • Dataset: The demo includes a parquet file containing a subset of 263,408 log events, with 21,955 anomalies.
  • Screencast: For an overview of the demo, watch our 5-minute screencast on YouTube.
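
A minimal sketch of turning such line-level markings into a binary target with Polars; the parquet file name and the "label" column name are assumptions for illustration:

import polars as pl

df = pl.read_parquet("tb_subset.parquet")  # hypothetical file name
# "-" in the label column marks a normal line; anything else is an anomaly.
df = df.with_columns(anomaly=pl.col("label") != "-")
print(df["anomaly"].sum())  # 21,955 in the demo subset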

Hadoop Distributed File System (HDFS) Log Demo

  • Script: HDFS_samples.py
  • Description: This demo showcases logs from the Hadoop Distributed File System (HDFS), labeled at the sequence level (a sequence is a collection of multiple log events).
  • Log Snapshot: View the log here.
  • Anomaly Labels: Provided in a separate file (a join sketch follows this list).
  • Dataset: The demo includes a parquet file containing a subset of 222,579 log events, forming 11,501 sequences with 350 anomalies.
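
Since the HDFS labels live in a separate file keyed by sequence (block) id, a typical preparation step joins them onto the events. A minimal sketch in Polars; the file names and the "BlockId" key column are assumptions for illustration:

import polars as pl

events = pl.read_parquet("hdfs_subset.parquet")  # hypothetical file name
labels = pl.read_csv("anomaly_label.csv")        # hypothetical file name; one row per sequence
# Attach each sequence's label to every event in that sequence.
df = events.join(labels, on="BlockId", how="left")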

Example of Anomaly Detection results

Below are anomaly detection results (F1-binary) for models trained on a 0.5% subset of the HDFS data. We use five different log message enhancement strategies: Words, Drain, LenMa, Spell, and BERT.

The enhancement strategies are tested with five different machine learning algorithms: DT (Decision Tree), SVM (Support Vector Machine), LR (Logistic Regression), RF (Random Forest), and XGB (eXtreme Gradient Boosting). A minimal sketch of one such combination follows the table.

         Words    Drain    LenMa    Spell    BERT     Average
-------  -------  -------  -------  -------  -------  -------
DT       0.9719   0.9816   0.9803   0.9828   0.9301   0.9693
SVM      0.9568   0.9591   0.9605   0.9559   0.8569   0.9378
LR       0.9476   0.8879   0.8900   0.9233   0.5841   0.8466
RF       0.9717   0.9749   0.9668   0.9809   0.9382   0.9665
XGB      0.9721   0.9482   0.9492   0.9535   0.9408   0.9528
-------  -------  -------  -------  -------  -------  -------
Average  0.9640   0.9503   0.9494   0.9593   0.8500   0.9346
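
As an indication of what one cell in the table involves, here is a minimal sketch of the Words + DT combination built directly on scikit-learn. This is not LogLead's internal code; the file and column names are assumptions for illustration.

import polars as pl
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical file: one row per sequence, with the sequence's event texts concatenated.
df = pl.read_parquet("hdfs_sequences.parquet")
X = CountVectorizer().fit_transform(df["m_message"].to_list())  # the "Words" representation
y = df["anomaly"].to_list()

# Train on a 0.5% subset, as in the table above.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.005, random_state=42)
model = DecisionTreeClassifier().fit(X_tr, y_tr)
print(f1_score(y_te, model.predict(X_te)))  # F1-binary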

Functional overview

LogLead is composed of distinct modules: the Loader, Enhancer, and Anomaly Detector. We use Polars dataframes, as they are notably faster than Pandas.

Loader: This module reads in the log files and deals with the specific features of each log file. It produces a dataframe with certain semi-mandatory fields that enable actions in the subsequent stages. LogLead has loaders for public datasets from 10 different systems.

Enhancer: This module extracts additional data from logs. Enhancement takes place directly within the dataframes, where new columns are added as a result. For example, log parsing, creating tokens from log messages, and measuring log sequence lengths are all forms of log enhancement. Enhancement can happen at the event level or be aggregated to the sequence level. Available enhancers include: event length (characters, words, lines), sequence length, and sequence duration; the NLP enhancers Regex, Words, and character n-grams; the log parsers Drain, LenMa, Spell, IPLoM, AEL, Brain, Fast-IPLoM, Tipping, and BERT; and next-event prediction, including its probabilities and perplexity. Next-event prediction can be computed on top of any parser's output.
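
For instance, event-level enhancement and sequence-level aggregation might look roughly like this in Polars; the column names are illustrative assumptions, not LogLead's documented schema:

import polars as pl

df = pl.DataFrame({  # toy loader output; column names assumed for illustration
    "seq_id":    ["blk_1", "blk_1", "blk_2"],
    "m_message": ["Receiving block blk_1", "Received block blk_1", "Deleting block blk_2"],
})

# Event level: add new columns alongside the message.
df = df.with_columns(
    e_chars=pl.col("m_message").str.len_chars(),
    e_words=pl.col("m_message").str.split(" ").list.len(),
)

# Sequence level: aggregate event-level enhancements per sequence.
seq = df.group_by("seq_id").agg(seq_len=pl.len(), seq_chars=pl.col("e_chars").sum())
print(seq)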

Anomaly Detector: This module uses the enhanced log data to perform anomaly detection. It mainly uses scikit-learn at the moment, but there are a few custom algorithms as well; in total, LogLead has been integrated and tested with the 11 classifiers mentioned above.

Download files

Download the file for your platform.

Source Distribution

loglead-1.1.1.tar.gz (85.9 kB)

Built Distribution

LogLead-1.1.1-py3-none-any.whl (100.8 kB)

File details

Details for the file loglead-1.1.1.tar.gz.

File metadata

  • Download URL: loglead-1.1.1.tar.gz
  • Size: 85.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.7

File hashes

Hashes for loglead-1.1.1.tar.gz
Algorithm Hash digest
SHA256 5a2f36e1796ce536c586a210831c5828fa6b6bc7f2795c35dc0259da7c8c7c21
MD5 d7bff1c50902751dd5be75948a6891dc
BLAKE2b-256 bc70e3adbc253f8d23a04f9a9d2654d43bc8582a7a4e933d8bd2e2b9624f8391

File details

Details for the file LogLead-1.1.1-py3-none-any.whl.

File metadata

  • Download URL: LogLead-1.1.1-py3-none-any.whl
  • Size: 100.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.7

File hashes

Hashes for LogLead-1.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 6ba54f94d80a9a58ff64b3ecb90b882a0fdd3e3ddaf0f690876a14145eb94dc3
MD5 6ed1f6144ee0d682691cf57a3b7b3311
BLAKE2b-256 f9889285b7adfbfa82637617345387a5eea8f8e5065a7bb18452f235f3ab81d8
