
Python logging library with support for multiple destinations


loghandler


Easy logging package for all your logging needs.

Features

  • Log to multiple endpoints at once
  • Support for STDOUT, Elasticsearch, and databases (MySQL, PostgreSQL, SQLite), with more coming soon.
  • Easy syntax
  • Fail-over reporting (if one endpoint fails, the failure is reported to the other endpoints)

Installing

Install loghandler via pip:

pip install loghandler

Using

In your code, import LogHandler and initialize it with a configuration dictionary:

from loghandler import LogHandler

logger = LogHandler({
    "log_level": "DEBUG",
    "outputs": [
        {
            "type": "STDOUT"
        }
    ]
})

You can now log messages to all your outputs via:

logger.log('fatal', Exception("Something went HORRIBLY wrong"))

Endpoints

The following endpoints are currently in the works and will be supported soon.

  • logstash
  • sentry

General Configuration

All endpoints accept a few common settings, described below.

log_level: Overrides the global log_level for the output it is applied to.

report_error: If an output fails to send a message, the error is reported to the other outputs. Defaults to True (turning this off is not recommended).

retry_after: How long, in seconds, a failed output waits before retrying. Defaults to 15.
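Combined, a single output entry with all three per-output settings might look like this (an illustrative sketch; the key names come from the list above, the values are made up):

```python
output = {
    "type": "STDOUT",
    "log_level": "ERROR",   # overrides the global log_level for this output only
    "report_error": True,   # report this output's failures to the other outputs (the default)
    "retry_after": 30,      # wait 30 seconds before retrying after a failure
}
```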

STDOUT

To use STDOUT as a log endpoint, add the following to your outputs array.

{
    "type": "STDOUT"
}

Elasticsearch

To use Elasticsearch as a log endpoint, add the following to your outputs array.

{
    "type": "elasticsearch", 
    "config": {
        "hosts": ["https://your-es-host.com:9243"],
        "ssl": True,
        "verify_certs": True,
        "refresh": "wait_for",  # Must be either "true", "false" or "wait_for"
        "index": "your-index",  # Index will be created if it doesn't exist
        "api_key": ("your-api-key-id", "your-api-key-secret")
    }
}

Next time something is logged, you should see something like the following in your index:

{
  "_index" : "logs",
  "_type" : "_doc",
  "_id" : "some-id",
  "_score" : 1.0,
  "_source" : {
    "timestamp" : "2021-11-05T04:16:25.250206",
    "level" : "DEBUG",
    "hostname" : "YOUR-HOSTNAME",
    "message" : "division by zero",
    "occurred_at" : {
      "path" : "/somepath/test.py",
      "line" : 22
    }
  }
}

Database

Table Structure

All database endpoints log to a table with the following structure (SQLAlchemy definition):
Table(
    db_config["table_name"],
    metadata,
    Column("id", Integer, primary_key=True),
    Column("message", Text),
    Column("level", String),
    Column("origin", String),
    Column("timestamp", DateTime),
)
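For reference, the SQLAlchemy definition above maps to roughly the following SQL schema. This is a sketch using the standard-library sqlite3 module with an in-memory database; the table name logs and the sample row are taken from the sqlite example below:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory database, for illustration only

# Equivalent of the SQLAlchemy Table above (Text and String both map to TEXT in SQLite)
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS logs (
        id INTEGER PRIMARY KEY,
        message TEXT,
        level TEXT,
        origin TEXT,
        timestamp TIMESTAMP
    )
    """
)

# Insert a row shaped like the sqlite example output below
conn.execute(
    "INSERT INTO logs (message, level, origin, timestamp) VALUES (?, ?, ?, ?)",
    ("division by zero", "DEBUG", "/somepath/test.py:31", "2021-11-07 01:27:24"),
)

row = conn.execute("SELECT message, level, origin FROM logs").fetchone()
print(row)  # ('division by zero', 'DEBUG', '/somepath/test.py:31')
```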

sqlite

To use sqlite as a log endpoint, add the following to your outputs array.

{
    "type": "sqlite", 
    "config": {
        "table_name": "logs",  # Will be created if it doesn't exist
        "db_path": "/path/to/db.sqlite",  # Will be created if it doesn't exist
    }
}

Next time something is logged, you should see something like the following in your table:

('division by zero', 'DEBUG', '/somepath/test.py:31', '2021-11-07 01:27:24.755989')

mysql

To use mysql as a log endpoint, add the following to your outputs array.

{
    "type": "mysql",
    "config": {
        "table_name": "logs",
        "connection_string": "root:example@localhost:3306/example_db"
    }
}

Next time something is logged, you should see something like the following in your table:

division by zero | DEBUG | /somepath/test.py:22 | 2021-11-07 01:46:58

pgsql (PostgreSQL)

To use pgsql as a log endpoint, add the following to your outputs array.

{
    "type": "pgsql",
    "config": {
        "table_name": "logs",
        "connection_string": "postgres:postgres@localhost:5432/example"
    }
}

Next time something is logged, you should see something like the following in your table:

division by zero | DEBUG | /somepath/test.py:22 | 2021-11-07 01:46:58
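Both the mysql and pgsql connection strings above follow the user:password@host:port/database pattern. A quick stdlib-only sanity check for that shape (illustrative; this helper is not part of loghandler):

```python
import re

# user:password@host:port/database
CONN_RE = re.compile(
    r"(?P<user>[^:]+):(?P<password>[^@]*)@(?P<host>[^:/]+):(?P<port>\d+)/(?P<database>.+)"
)

def parse_conn(conn: str) -> dict:
    """Split a connection string into its named parts, raising on a bad shape."""
    m = CONN_RE.fullmatch(conn)
    if m is None:
        raise ValueError(f"unrecognized connection string: {conn!r}")
    return m.groupdict()

parts = parse_conn("postgres:postgres@localhost:5432/example")
print(parts["host"], parts["database"])  # localhost example
```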

Download files

Download the file for your platform.

Source Distribution

loghandler-0.5.0.tar.gz (9.0 kB)

Uploaded Source

Built Distribution

loghandler-0.5.0-py3-none-any.whl (10.0 kB)

Uploaded Python 3

File details

Details for the file loghandler-0.5.0.tar.gz.

File metadata

  • Download URL: loghandler-0.5.0.tar.gz
  • Upload date:
  • Size: 9.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.10.0

File hashes

Hashes for loghandler-0.5.0.tar.gz

  • SHA256: ec4b79ab253e7275762ecc7fe7731275bbeaebcf0c6d1e23d72b795e823fa9cc
  • MD5: c73ed91c6d91df84638bd8032da499a1
  • BLAKE2b-256: b6cec851076de689bcdfcd98a82eacb3a0bac2a7e218c53e892edcd47e224618


File details

Details for the file loghandler-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: loghandler-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 10.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.10.0

File hashes

Hashes for loghandler-0.5.0-py3-none-any.whl

  • SHA256: 14a306a3ae72a99bb184e98c4dbb167de42260f6846240ef61c39bfde4f3f016
  • MD5: c34e8ca88a0e71d35831a99506d915eb
  • BLAKE2b-256: 976012544fe62a16c9d2816abbb53fba627de8052bc34297603e3892fd5c32ca

