
Log Analytics Powered by AI

Project description

Logparser


Logparser provides a machine learning toolkit and benchmarks for automated log parsing, which is a crucial step for structured log analytics. By applying logparser, users can automatically extract event templates from unstructured logs and convert raw log messages into a sequence of structured events. The process of log parsing is also known as message template extraction, log key extraction, or log message clustering in the literature.
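To make "event template" concrete, here is a hypothetical sketch (not logparser's actual implementation) that aligns two raw messages token by token and replaces the positions that differ with the `<*>` wildcard used throughout logparser's output:

```python
def merge_template(msg_a, msg_b):
    """Token-wise merge: positions whose tokens differ become the <*> wildcard."""
    tokens_a, tokens_b = msg_a.split(), msg_b.split()
    if len(tokens_a) != len(tokens_b):
        return None  # different token counts: treat as different event types
    return " ".join(a if a == b else "<*>" for a, b in zip(tokens_a, tokens_b))

template = merge_template(
    "Receiving block blk_3587 src: /10.0.0.1",
    "Receiving block blk_9912 src: /10.0.0.2",
)
print(template)  # Receiving block <*> src: <*>
```

Real parsers such as Drain add clustering on top of this idea so that thousands of messages collapse into a small set of templates, but the wildcard merge is the core operation.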


An example of log parsing

🌈 New updates

Log parsers available:

Publication Parser Paper Reference
IPOM'03 SLCT A Data Clustering Algorithm for Mining Patterns from Event Logs, by Risto Vaarandi.
QSIC'08 AEL Abstracting Execution Logs to Execution Events for Enterprise Applications, by Zhen Ming Jiang, Ahmed E. Hassan, Parminder Flora, Gilbert Hamann.
KDD'09 IPLoM Clustering Event Logs Using Iterative Partitioning, by Adetokunbo Makanju, A. Nur Zincir-Heywood, Evangelos E. Milios.
ICDM'09 LKE Execution Anomaly Detection in Distributed Systems through Unstructured Log Analysis, by Qiang Fu, Jian-Guang Lou, Yi Wang, Jiang Li. [Microsoft]
MSR'10 LFA Abstracting Log Lines to Log Event Types for Mining Software System Logs, by Meiyappan Nagappan, Mladen A. Vouk.
CIKM'11 LogSig LogSig: Generating System Events from Raw Textual Logs, by Liang Tang, Tao Li, Chang-Shing Perng.
SCC'13 SHISO Incremental Mining of System Log Format, by Masayoshi Mizutani.
CNSM'15 LogCluster LogCluster - A Data Clustering and Pattern Mining Algorithm for Event Logs, by Risto Vaarandi, Mauno Pihelgas.
CNSM'15 LenMa Length Matters: Clustering System Log Messages using Length of Words, by Keiichi Shima.
CIKM'16 LogMine LogMine: Fast Pattern Recognition for Log Analytics, by Hossein Hamooni, Biplob Debnath, Jianwu Xu, Hui Zhang, Geoff Jiang, Adbullah Mueen. [NEC]
ICDM'16 Spell Spell: Streaming Parsing of System Event Logs, by Min Du, Feifei Li.
ICWS'17 Drain Drain: An Online Log Parsing Approach with Fixed Depth Tree, by Pinjia He, Jieming Zhu, Zibin Zheng, and Michael R. Lyu.
ICPC'18 MoLFI A Search-based Approach for Accurate Identification of Log Message Formats, by Salma Messaoudi, Annibale Panichella, Domenico Bianculli, Lionel Briand, Raimondas Sasnauskas.

💡 You are welcome to submit a PR to add your parser code to logparser and your paper to the table above.

Installation

We recommend installing the logparser package and its requirements from PyPI.

pip install logpai

In particular, the package depends on the following requirements.

  • Python 3.6+
  • regex 2022.3.2
  • numpy
  • pandas
  • scipy
  • deap (if using logparser.MoLFI)
  • gcc (if using logparser.SLCT)

Get started

  1. Run the demo:

    For each log parser, we provide a demo to help you get started. Each demo shows the basic usage of a target log parser and the hyper-parameters to configure. For example, the following command shows how to run the demo for Drain.

    cd logparser/Drain
    python demo.py
    

    After the demo finishes, you can find the extracted event templates and the parsed structured logs in the result folder.

  2. Run the benchmark:

    For each log parser, we provide a benchmark script that runs log parsing on the loghub_2k datasets and evaluates parsing accuracy. You can also use other benchmark datasets for log parsing.

    cd logparser/Drain 
    python benchmark.py
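
    The benchmark's headline number is grouping accuracy: the fraction of log messages whose parser-assigned event group coincides exactly with a ground-truth group. A minimal, self-contained sketch of that metric (illustrative code, not the benchmark script itself):

```python
from collections import defaultdict

def grouping_accuracy(truth, pred):
    """Fraction of messages whose predicted group exactly equals a true group."""
    def groups(labels):
        g = defaultdict(set)
        for idx, label in enumerate(labels):
            g[label].add(idx)
        return {frozenset(s) for s in g.values()}

    true_groups = groups(truth)
    # A predicted group only counts if it matches a ground-truth group exactly.
    correct = sum(len(g) for g in groups(pred) if g in true_groups)
    return correct / len(truth)

truth = ["E1", "E1", "E2", "E2", "E3"]
pred  = ["A",  "A",  "B",  "C",  "D"]   # E2 was wrongly split into B and C
print(grouping_accuracy(truth, pred))  # 0.6
```

    Note that the metric is strict: splitting or merging a single group penalizes every message in it, which is why reported accuracies differ sharply across parsers on the same dataset.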
    
  3. Parse your own logs:

    It is easy to apply logparser to parse your own log data. To do so, install the logpai package first, then develop your own script based on the code snippet below to start log parsing.

    from logparser.Drain import LogParser
    
    input_dir = 'PATH_TO_LOGS/' # The input directory of log file
    output_dir = 'result/'  # The output directory of parsing results
    log_file = 'unknown.log'  # The input log file name
    log_format = '<Date> <Time> <Level>:<Content>' # Define log format to split message fields
    # Regular expression list for optional preprocessing (default: [])
    regex = [
        r'(/|)([0-9]+\.){3}[0-9]+(:[0-9]+|)(:|)' # IP
    ]
    st = 0.5  # Similarity threshold
    depth = 4  # Depth of all leaf nodes
    
    parser = LogParser(log_format, indir=input_dir, outdir=output_dir,  depth=depth, st=st, rex=regex)
    parser.parse(log_file)
    

    The full example is available at example/parse_your_own_logs.py.
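
Under the hood, the log_format string is turned into a regular expression with one named capture group per <Field>, which splits every raw line into message fields; the optional regex list then masks known variable parts (such as IPs) before template extraction. A simplified, hypothetical re-implementation of that idea (the helper name and sample line are illustrative, not logparser's exact code):

```python
import re

def format_to_regex(log_format):
    """Build a regex with one named capture group per <Field> placeholder."""
    pattern = ""
    for part in re.split(r"(<[^<>]+>)", log_format):
        if re.fullmatch(r"<[^<>]+>", part):
            pattern += f"(?P<{part[1:-1]}>.*?)"
        else:
            # Escape literal text; let each space match any whitespace run.
            pattern += r"\s+".join(re.escape(tok) for tok in part.split(" "))
    return re.compile("^" + pattern + "$")

regex = format_to_regex("<Date> <Time> <Level>:<Content>")
m = regex.match("2024-01-02 10:30:00 INFO:Connection closed")
print(m.group("Level"), "|", m.group("Content"))  # INFO | Connection closed

# Preprocessing: mask IP-like substrings so they do not fragment templates.
ip = re.compile(r'(/|)([0-9]+\.){3}[0-9]+(:[0-9]+|)(:|)')
print(ip.sub('<*>', 'Receiving block src: /10.251.43.210:55700'))
```

Masking such high-cardinality tokens up front is what keeps a parser from generating one template per IP address.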

Production use

The main goal of logparser is research and benchmarking. Researchers can use logparser as a code base to develop new log parsers, while practitioners can assess the accuracy and scalability of current log parsing methods through our benchmarks. You are welcome to try logparser on samples of your production logs, but be aware that the current implementation is far from ready for production use, and we currently have no plan to make it so. We do, however, have a few suggestions for developers who want to build an intelligent, production-grade log parser.

  • Be aware of the licenses of the third-party libraries used in logparser. If licensing is a concern, you can keep only the parser you need, delete the others, and rebuild the package wheel; this does not break logparser.
  • Improve efficiency and scalability, for example via multi-processing; add failure recovery; and add persistence to disk or to a message queue such as Kafka.
  • Drain3 provides a good reference implementation, built with practical enhancements for production scenarios.
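
On the multi-processing point: since each log file is an independent job, a simple fan-out across worker processes already helps throughput. A hypothetical sketch (parse_file is a placeholder for a real per-file parsing call such as LogParser.parse, not a logparser API):

```python
from multiprocessing import Pool

def parse_file(path):
    """Hypothetical stand-in for one per-file parsing job."""
    return path, f"parsed:{path}"

def parse_many(paths, workers=2):
    """Fan independent per-file jobs out across worker processes."""
    with Pool(processes=workers) as pool:
        return dict(pool.map(parse_file, paths))

if __name__ == "__main__":
    results = parse_many(["a.log", "b.log", "c.log"])
    print(sorted(results))  # ['a.log', 'b.log', 'c.log']
```

Online parsers that share mutable state (such as Drain's parse tree) need more care, e.g. sharding lines by source or serializing tree updates through a single process.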

Citation

👋 If you use our logparser tools or benchmarking results in your publication, please cite the following papers.

Discussion

You are welcome to join our WeChat group for questions and discussion. Alternatively, you can open an issue here.




Download files

Download the file for your platform.

Source Distribution

logpai-1.0.0.tar.gz (71.3 kB)

Uploaded Source

Built Distribution

logpai-1.0.0-py3-none-any.whl (125.6 kB)

Uploaded Python 3

File details

Details for the file logpai-1.0.0.tar.gz.

File metadata

  • Download URL: logpai-1.0.0.tar.gz
  • Upload date:
  • Size: 71.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.9.6 requests/2.22.0 setuptools/45.2.0.post20200210 requests-toolbelt/0.10.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for logpai-1.0.0.tar.gz
Algorithm Hash digest
SHA256 0773aaae379fcc018880ca7319ffe76416147809b6bea606c5beca20852ca5f5
MD5 09fa7f81f1a01f6a5ef229c0abef6b49
BLAKE2b-256 4400cdcfa4ba4df0c9c9e510325c50d7c2f8e8c7b11e67e3c324d7cd4daecbfe


File details

Details for the file logpai-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: logpai-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 125.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.9.6 requests/2.22.0 setuptools/45.2.0.post20200210 requests-toolbelt/0.10.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for logpai-1.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 c5c7d77bc13823b1772ccbc0c7cd7c3babd5eedb1ce2a7845731a210f4cb7334
MD5 4a85fff5ebe13c0e923773dd6aa83901
BLAKE2b-256 319f0f1f56a75acb188bf7747ca0692eb228fa40779be07e06f8dbeee84ac5b5

