
Project description

Logs-analyzer

Logs-analyzer is a Python-based library containing functions that can help you extract usable data from logs.

Status

Master and develop branches: Codeship build status badges for ddalu5/logs-analyzer.

Quickstart

Support

Python 2: 2.7; Python 3: 3.6, 3.7

Install

Using pip: pip install logs-analyzer

Code sample

from logs_analyzer.lib import LogsAnalyzer

# Analyze Nginx logs: keep entries logged on 4 May at 22:xx (any minute)
# that involve the IP address 192.10.1.1.
nginx_logsanalyzer = LogsAnalyzer('nginx')
nginx_logsanalyzer.add_date_filter(minute='*', hour=22, day=4, month=5)
nginx_logsanalyzer.add_filter('192.10.1.1')
requests = nginx_logsanalyzer.get_requests()
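
The class method presumably returns the parsed entries as a list (the non-object function get_web_requests below is documented to return one); if so, the output can be inspected directly:

# Assumes get_requests() returns an iterable of parsed entries.
for request in requests:
    print(request)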

Non-object functions

Function get_service_settings

Get the default settings for the given service from the settings file. Three types of logs are supported right now: nginx, apache2 and auth.

Parameters

service_name: service name (e.g. nginx, apache2...).

Return

Returns a dictionary containing the chosen service settings, or None if the service doesn't exist.

Sample

nginx_settings = get_service_settings('nginx')
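
Because the function returns None for an unknown service, it is worth checking the result before use. A minimal sketch; the listed keys are the ones the samples below rely on:

nginx_settings = get_service_settings('nginx')
if nginx_settings is not None:
    # The request-extraction samples below use the 'request_model',
    # 'date_pattern' and 'date_keys' entries of this dictionary.
    print(sorted(nginx_settings.keys()))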

Function get_date_filter

Get a date pattern that can be used to filter log data, based on the given parameters.

Parameters

settings: the target service's log settings (as returned by get_service_settings).

minute: minute to match, or '*' to ignore (defaults to the current minute).

hour: hour to match, or '*' to ignore (defaults to the current hour).

day: day of the month (defaults to the current day).

month: month number (defaults to the current month).

year: year (defaults to the current year).

Sample

nginx_settings = get_service_settings('nginx')
# Arguments: settings, minute=13, hour=13, day=16, month=1, year=1989.
date_pattern = get_date_filter(nginx_settings, 13, 13, 16, 1, 1989)
print(date_pattern)

Prints [16/Jan/1989:13:13
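
Passing '*' widens the filter: ignoring both minutes and hours yields a pattern covering a whole day. A sketch reusing nginx_settings from above; the expected output is inferred from the date format shown:

# minute='*', hour='*': match any time on 27 April 2016.
day_pattern = get_date_filter(nginx_settings, '*', '*', 27, 4, 2016)
print(day_pattern)  # expected: a day-wide prefix such as [27/Apr/2016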

Function filter_data

Filter the given data or a file's content and return the result.

Parameters

log_filter: string that will be used to filter the data.

data: data to be filtered (String), or None if the data will be loaded from a file.

filepath: path of the file from which data will be loaded, or None if the data has been passed as a parameter.

is_casesensitive: whether the filter is case sensitive (default True).

is_regex: whether the filter string is a regular expression (default False).

is_reverse: (Boolean) invert the selection, keeping only non-matching lines.

Return

Returns filtered data (String).

Sample

import os

nginx_settings = get_service_settings('nginx')
date_filter = get_date_filter(nginx_settings, '*', '*', 27, 4, 2016)
base_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
file_name = os.path.join(base_dir, 'logs-samples/nginx1.sample')
# First filter by IP prefix, then narrow the result down to one day.
data = filter_data('192.168.5', filepath=file_name)
data = filter_data(date_filter, data=data)
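
The optional flags can refine the match. A sketch with illustrative patterns, combining is_regex for a regular-expression filter and is_reverse to invert the selection:

# Regular-expression filter: match full IPs in the 192.168.5.x range.
subnet_lines = filter_data(r'192\.168\.5\.\d+', filepath=file_name, is_regex=True)
# Inverted filter: keep only the lines that do NOT mention the prefix.
other_lines = filter_data('192.168.5', filepath=file_name, is_reverse=True)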

Function get_web_requests

Analyze web log data (Nginx and Apache2 for now) and return a list of requests formatted according to the defined request model (pattern).

Parameters

data: (String) data to be analyzed.

pattern: (Regular expression) used to extract requests.

date_pattern: (Regular expression or None) used to extract date elements so that dates can be ISO formatted.

date_keys: (List or None) positions of the extracted date elements.

Sample

import os

apache2_settings = get_service_settings('apache2')
base_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
file_name = os.path.join(base_dir, 'logs-samples/apache1.sample')
data = filter_data('127.0.1.1', filepath=file_name)
requests = get_web_requests(data, apache2_settings['request_model'],
                            apache2_settings['date_pattern'], apache2_settings['date_keys'])
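
Since the function also handles Nginx logs, the same call works for the Nginx sample used earlier; only the settings dictionary and file change:

nginx_settings = get_service_settings('nginx')
nginx_file = os.path.join(base_dir, 'logs-samples/nginx1.sample')
nginx_data = filter_data('192.168.5', filepath=nginx_file)
nginx_requests = get_web_requests(nginx_data, nginx_settings['request_model'],
                                  nginx_settings['date_pattern'], nginx_settings['date_keys'])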

Function get_auth_requests

Analyze auth log data and return a list of requests formatted according to the defined request model (pattern).

Parameters

data: (String) data to be analyzed.

pattern: (Regular expression) used to extract requests.

date_pattern: (Regular expression or None) used to extract date elements so that dates can be ISO formatted.

date_keys: (List or None) positions of the extracted date elements.

Sample

import os

auth_settings = get_service_settings('auth')
base_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# minute='*', hour=22, day=4, month=5: any minute of 4 May at 22:xx.
date_filter = get_date_filter(auth_settings, '*', 22, 4, 5)
file_name = os.path.join(base_dir, 'logs-samples/auth.sample')
data = filter_data('120.25.229.167', filepath=file_name)
data = filter_data(date_filter, data=data)
requests = get_auth_requests(data, auth_settings['request_model'],
                             auth_settings['date_pattern'], auth_settings['date_keys'])
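
filter_data's is_casesensitive flag (documented above) also applies here; a sketch making the text filter case insensitive (the 'invalid user' pattern is illustrative, not taken from the sample file):

# Case-insensitive filter: matches 'Invalid user', 'INVALID USER', etc.
data = filter_data('invalid user', filepath=file_name, is_casesensitive=False)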

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

logs-analyzer-0.5.2.tar.gz (10.3 kB)

Uploaded Source

Built Distribution

logs_analyzer-0.5.2-py3-none-any.whl (11.0 kB)

Uploaded Python 3

File details

Details for the file logs-analyzer-0.5.2.tar.gz.

File metadata

  • Download URL: logs-analyzer-0.5.2.tar.gz
  • Size: 10.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/41.0.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.6.7

File hashes

Hashes for logs-analyzer-0.5.2.tar.gz

  • SHA256: 7e60eea4e5917a15c29cf4ccfc77b500696a51baa6bd50d02ceb2017b8f9e6d7
  • MD5: 7c0e37f9eb805728756186dcdbbf0cc3
  • BLAKE2b-256: 9c8c6d24d8602447153f04fa85fbd20830e7288910ad4874f70d75b8e1c1af2d


File details

Details for the file logs_analyzer-0.5.2-py3-none-any.whl.

File metadata

  • Download URL: logs_analyzer-0.5.2-py3-none-any.whl
  • Size: 11.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/41.0.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.6.7

File hashes

Hashes for logs_analyzer-0.5.2-py3-none-any.whl

  • SHA256: 79030f1715fc38d1e3da642c2b3b1c66d414368ec62dcb8c08530f308677aef4
  • MD5: a1f994481810d28dfcd7dd6606ca3443
  • BLAKE2b-256: 2b487cc1a8e92de6f5180706e1e9623c7add717d9107aa4f85a8f4bb543988a7

