LOGGINGSFACTORY

A logging factory wrapper for Loguru and Elasticsearch.
Installation
```
pip install loggingsfactory
```
Usage
Import
```python
from loggingsfactory.logging import Loggers
```
Initialization
Loguru
- Method 1
```python
loggers = Loggers(appname="myapp")
```
- Method 2
```python
loggers = Loggers(appname="myapp", debug=True)
```
- Method 3: not required, but useful if the logger type should switch automatically between environments
```python
loggers = Loggers(
    appname="myapp",
    debug=True,
    host="https://elasticsearch.com:9201",
    index="appindex",
    username="user1",
    pw="userpw",
)
```
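One way to drive the Method 3 switch is to derive the constructor keywords from an environment variable. This is a hedged sketch, not a helper that loggingsfactory ships: the `APP_ENV` variable name and the `loggers_kwargs` function are assumptions for illustration.

```python
import os

def loggers_kwargs(env=None):
    """Hypothetical helper: pick Loggers kwargs per environment so dev
    machines log locally via loguru (debug=True) and production ships
    logs to Elasticsearch (debug=False)."""
    env = env or os.environ.get("APP_ENV", "dev")
    if env == "prod":
        # Production: full Elasticsearch configuration.
        return {
            "appname": "myapp",
            "debug": False,
            "host": "https://elasticsearch.com:9201",
            "index": "appindex",
            "username": "user1",
            "pw": "userpw",
        }
    # Any other environment: local loguru logging only.
    return {"appname": "myapp", "debug": True}

# loggers = Loggers(**loggers_kwargs())
```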
Elasticsearch
```python
loggers = Loggers(
    appname="myapp",
    debug=False,
    host="https://elasticsearch.com:9201",
    index="appindex",
    username="user1",
    pw="userpw",
)
```
AsyncElasticsearch
```python
loggers = Loggers(
    appname="myapp",
    debug=False,
    useasync=True,
    host="https://elasticsearch.com:9201",
    index="appindex",
    username="user1",
    pw="userpw",
)
```
Log usage
Loguru & Elasticsearch
- Using default logging data (the function name defaults to the function calling the log):
```python
def test_log():
    loggers.log("info", "log data")
```
```json
{
  "log": "log data",
  "version": "1.0",
  "logger_level": "INFO",
  "functional_name": "test_log",
  "app_name": "myapp",
  "timestamp": "2020-01-01T00:00:00.000Z"
}
```
- Using a custom function name:
```python
def test_log():
    loggers.log("info", "log data", "somefunctionname")
```
```json
{
  "log": "log data",
  "version": "1.0",
  "logger_level": "INFO",
  "functional_name": "somefunctionname",
  "app_name": "myapp",
  "timestamp": "2020-01-01T00:00:00.000Z"
}
```
- Using a custom log data format:
```python
custom_log_data = {"custom_log": "this is a custom log"}

def test_log():
    loggers.log("info", custom_log_data, None, True)
```
```json
{
  "custom_log": "this is a custom log"
}
```
- Using default logging data with a custom date format:
```python
date = "2022/04/27"

def test_log():
    loggers.log("info", "log data", None, False, date)
```
```json
{
  "log": "log data",
  "version": "1.0",
  "logger_level": "INFO",
  "functional_name": "test_log",
  "app_name": "myapp",
  "timestamp": "2022/04/27"
}
```
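For reference, the default document shown above can be approximated like this. This is an illustration of the document shape only, not loggingsfactory's actual implementation; the `build_log_doc` name and the stack-inspection approach are assumptions inferred from the sample output.

```python
import inspect
from datetime import datetime, timezone

def build_log_doc(level, data, appname="myapp", funcname=None):
    """Illustrative only: mimic the default document shape shown above.
    When no custom name is given, pick up the caller's function name
    from the call stack."""
    caller = funcname or inspect.stack()[1].function
    # Millisecond-precision UTC timestamp matching the sample format.
    now = datetime.now(timezone.utc)
    timestamp = now.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"
    return {
        "log": data,
        "version": "1.0",
        "logger_level": level.upper(),
        "functional_name": caller,
        "app_name": appname,
        "timestamp": timestamp,
    }

def test_log():
    return build_log_doc("info", "log data")
```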
Loguru & AsyncElasticsearch
- Using default logging data (the function name defaults to the function calling the log):
```python
async def test_log():
    await loggers.async_log("info", "log data")
```
```json
{
  "log": "log data",
  "version": "1.0",
  "logger_level": "INFO",
  "functional_name": "test_log",
  "app_name": "myapp",
  "timestamp": "2020-01-01T00:00:00.000Z"
}
```
- Using a custom function name:
```python
async def test_log():
    await loggers.async_log("info", "log data", "somefunctionname")
```
```json
{
  "log": "log data",
  "version": "1.0",
  "logger_level": "INFO",
  "functional_name": "somefunctionname",
  "app_name": "myapp",
  "timestamp": "2020-01-01T00:00:00.000Z"
}
```
- Using a custom log data format:
```python
custom_log_data = {"custom_log": "this is a custom log"}

async def test_log():
    await loggers.async_log("info", custom_log_data, None, True)
```
```json
{
  "custom_log": "this is a custom log"
}
```
- Using default logging data with a custom date format:
```python
date = "2022/04/27"

async def test_log():
    await loggers.async_log("info", "log data", None, False, date)
```
```json
{
  "log": "log data",
  "version": "1.0",
  "logger_level": "INFO",
  "functional_name": "test_log",
  "app_name": "myapp",
  "timestamp": "2022/04/27"
}
```
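Since `async_log` runs inside a coroutine, the call pattern needs an event loop. The sketch below shows that pattern; `StubLoggers` is a hypothetical stand-in so the example runs without an Elasticsearch instance and is not part of loggingsfactory.

```python
import asyncio

class StubLoggers:
    """Hypothetical stand-in for Loggers: records calls instead of
    shipping them to Elasticsearch."""
    def __init__(self):
        self.sent = []

    async def async_log(self, level, data):
        self.sent.append((level, data))

async def main(loggers):
    # Same call shape as the async examples above.
    await loggers.async_log("info", "log data")

loggers = StubLoggers()
asyncio.run(main(loggers))
```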
Query usage
Elasticsearch
- Using the default query payload:
```python
def get_data():
    return loggers.query()
```
- Using a custom query payload:
```python
custom_payload = {
    "query": {
        "bool": {
            "filter": [
                {
                    "bool": {
                        "should": [{"match_phrase": {"app_name.keyword": "myapp"}}],
                        "minimum_should_match": 1,
                    }
                },
                {
                    "range": {
                        "timestamp": {
                            "gte": "2021-09-24T02:58:43.647Z",
                            "lte": "2022-09-24T02:58:43.647Z",
                            "format": "strict_date_optional_time",
                        }
                    }
                },
            ]
        }
    }
}

def get_data():
    return loggers.query(custom_payload)
```
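The custom payload above is a standard Elasticsearch bool/filter query. If such payloads are built often, a small helper can parameterize the app name and time window; this is a sketch, and the `build_query_payload` name is an assumption, while the `app_name.keyword` and `timestamp` fields come from the default log documents.

```python
def build_query_payload(app_name, gte, lte):
    """Mirror the custom_payload shown above, parameterized on app name
    and timestamp range."""
    return {
        "query": {
            "bool": {
                "filter": [
                    {
                        "bool": {
                            "should": [{"match_phrase": {"app_name.keyword": app_name}}],
                            "minimum_should_match": 1,
                        }
                    },
                    {
                        "range": {
                            "timestamp": {
                                "gte": gte,
                                "lte": lte,
                                "format": "strict_date_optional_time",
                            }
                        }
                    },
                ]
            }
        }
    }
```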
AsyncElasticsearch
- Using the default query payload:
```python
async def get_data():
    return await loggers.async_query()
```
- Using a custom query payload:
```python
custom_payload = {
    "query": {
        "bool": {
            "filter": [
                {
                    "bool": {
                        "should": [{"match_phrase": {"app_name.keyword": "myapp"}}],
                        "minimum_should_match": 1,
                    }
                },
                {
                    "range": {
                        "timestamp": {
                            "gte": "2021-09-24T02:58:43.647Z",
                            "lte": "2022-09-24T02:58:43.647Z",
                            "format": "strict_date_optional_time",
                        }
                    }
                },
            ]
        }
    }
}

async def get_data():
    return await loggers.async_query(custom_payload)
```
SQL Query usage
Elasticsearch & AsyncElasticsearch
- Using a query payload (synchronous only):
```python
query_statement = "SELECT * FROM appindex"

def get_data():
    return loggers.sql_query(query_statement)
```
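Elasticsearch's SQL endpoint returns results as separate `columns` and `rows` lists. Assuming `sql_query` passes that response through unchanged (an assumption; check your version), a small helper can zip it into dicts. The `rows_to_dicts` name and the sample response are illustrative.

```python
def rows_to_dicts(response):
    """Convert an Elasticsearch SQL response of the form
    {"columns": [{"name": ..., "type": ...}, ...], "rows": [[...], ...]}
    into a list of dicts keyed by column name."""
    names = [col["name"] for col in response["columns"]]
    return [dict(zip(names, row)) for row in response["rows"]]

# Hypothetical response for: SELECT * FROM appindex
sample = {
    "columns": [{"name": "app_name", "type": "text"}, {"name": "log", "type": "text"}],
    "rows": [["myapp", "log data"]],
}
```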
File details
Details for the file loggingsfactory-0.0.4.tar.gz.
File metadata
- Download URL: loggingsfactory-0.0.4.tar.gz
- Upload date:
- Size: 10.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.9.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 81f9c815a522e5c93eef54e5ec7f9694e27df15e2228956ae4bf5ebe0332f33f |
| MD5 | 24abbedf216317ecf008f8eb2362c437 |
| BLAKE2b-256 | acec58dff22952119da97e12f803ae58c046718bea65c4fbfd64a5c53e9fbc15 |
File details
Details for the file loggingsfactory-0.0.4-py3-none-any.whl.
File metadata
- Download URL: loggingsfactory-0.0.4-py3-none-any.whl
- Upload date:
- Size: 14.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.9.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7679781ae867c50bd6f7871583c6bd8d83efe595dcbcf2c7ba9538c3d291cc39 |
| MD5 | ea68d29d907164b42b8dbcdc0fb8b93d |
| BLAKE2b-256 | 550c93f4f361299291588fdafcecd4853eeb1501d73409e0cf6f95ac130c44df |