
Put parsed Nginx access.log to Elasticsearch

Project description

The Nginx access.log must be written with the following log format:

log_format main_ext
    '$remote_addr $http_host $remote_user [$time_local] "$request" '
    '$status $body_bytes_sent "$http_referer" '
    '"$http_user_agent" "$http_x_forwarded_for" '
    'rt=$request_time ua="$upstream_addr" '
    'us="$upstream_status" ut="$upstream_response_time" '
    'ul="$upstream_response_length" '
    'cs=$upstream_cache_status';
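
The format can then be referenced from the access_log directive in your http or server block (the log path below is only an illustration):

access_log /var/log/nginx/access.log main_ext;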

Install

Install with pip:

pip install nginx2es

Features

  • Stable log record ID (hostname + file inode number + timestamp + file position). This makes it possible to import a log file more than once (for example, after adding some extra processing to nginx2es, or after dropping a daily index that contained only half of the records) without creating duplicate records; a minimal illustration follows this list.
  • Parses query parameters and splits request URI path components into separate fields for complex log filtering / aggregations (see the second sketch after this list).
  • Optional use of the GeoIP database (requires the geoip module and the GeoIPCity.dat database file) - adds the city and region_name fields.
  • Correctly parses log records containing information about multiple upstream responses.
  • A tail -F-like follow mode implemented with inotify (see the last sketch after this list).
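
To picture how a stable document ID prevents duplicates, here is a minimal Python sketch that builds a deterministic ID from hostname, file inode, timestamp, and file position and indexes records with the elasticsearch-py bulk helper. The ID layout, index name, and field names below are illustrative assumptions, not necessarily what nginx2es produces.

import os
import socket

from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

def doc_id(path, position, timestamp):
    # Deterministic ID: hostname + file inode + timestamp + file position
    # (the exact layout nginx2es uses may differ).
    inode = os.stat(path).st_ino
    return "%s-%d-%s-%d" % (socket.gethostname(), inode, timestamp, position)

def actions(path, records):
    # records: iterable of (file_position, parsed_record) pairs.
    for position, record in records:
        yield {
            "_index": "nginx-2018.02.05",  # example daily index name
            "_id": doc_id(path, position, record["@timestamp"]),
            "_source": record,
        }

# Illustrative parsed record; real records carry all fields from main_ext.
parsed = [(0, {"@timestamp": "2018-02-05T12:00:00", "remote_addr": "127.0.0.1", "status": 200})]

es = Elasticsearch("http://localhost:9200")
# Re-importing the same file yields the same IDs, so existing documents are
# overwritten instead of duplicated.
bulk(es, actions("/var/log/nginx/access.log", parsed))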
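
The path and query splitting can be pictured with the standard library; this is only a sketch of the idea, and the field names are not nginx2es's actual output fields.

from urllib.parse import urlsplit, parse_qs

def split_request_uri(uri):
    # Split an nginx $request URI into path components and query params.
    parts = urlsplit(uri)
    return {
        "uri": parts.path,
        # e.g. /api/v1/users -> ["api", "v1", "users"]
        "path_components": [p for p in parts.path.split("/") if p],
        # e.g. ?q=nginx&page=2 -> {"q": ["nginx"], "page": ["2"]}
        "query": parse_qs(parts.query),
    }

print(split_request_uri("/api/v1/users?q=nginx&page=2"))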
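
The inotify-based follow mode can be sketched roughly as below. This uses the inotify_simple package purely to illustrate the mechanism; log rotation handling, which a real tail -F must also cover, is omitted.

import os
from inotify_simple import INotify, flags

LOG = "/var/log/nginx/access.log"

inotify = INotify()
inotify.add_watch(LOG, flags.MODIFY)

with open(LOG) as f:
    f.seek(0, os.SEEK_END)              # start at the end, like tail -F
    while True:
        for _ in inotify.read():        # blocks until the file is modified
            for line in f:
                print(line, end="")     # here nginx2es would parse and index the line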

Download files

Download the file for your platform.

Filename, size & hash SHA256 hash help File type Python version Upload date
nginx2es-0.3.0-py2-none-any.whl (10.3 kB) Copy SHA256 hash SHA256 Wheel py2 Feb 5, 2018
nginx2es-0.3.0-py3-none-any.whl (10.3 kB) Copy SHA256 hash SHA256 Wheel py3 Feb 5, 2018
nginx2es-0.3.0.tar.gz (7.9 kB) Copy SHA256 hash SHA256 Source None Feb 5, 2018
