
Greendeck logging package

Project description

Greendeck-Logging:


Install from pip

https://pypi.org/project/greendeck-logging/

pip install greendeck-logging

This library can be used to save critical logs, messages, and info from your internal services to Elasticsearch and to build a nice visualization in Kibana. For creating visualizations in Kibana, you can refer to this documentation.

If you don't have Elasticsearch and Kibana set up, you can refer to this blog.

How to use?

Import the library:
from greendeck_logging import GdLogging
from greendeck_logging import GdLoggingStack

There are two ways to save log data: in bulk (GdLoggingStack) or one record at a time (GdLogging).

If you want to keep logs in memory and send them after your process completes, use GdLoggingStack. It stores log messages in a stack and sends them to Elasticsearch when the push method is called.

Variables used to connect to Elasticsearch for storing the log data:

LOG_ECS_HOST =  < YOUR_LOG_ECS_HOST >
LOG_ECS_INDEX = < LOG_ECS_INDEX >
LOG_ECS_TYPE = < LOG_ECS_TYPE >
service_name = < service_name >

If Elasticsearch requires a username and password to connect, provide them in LOG_ECS_HOST in the standard format https://username:password@elasticsearch_host:port_no, e.g. https://admin:admin123@127.0.0.0:9200/
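If the credentials contain characters such as `@` or `:`, they should be percent-encoded before being embedded in the URL. A minimal sketch using the standard library (`build_es_host` is a hypothetical helper, not part of this package):

```python
from urllib.parse import quote

def build_es_host(user, password, host, port):
    """Build an authenticated Elasticsearch host URL.

    Percent-encodes the credentials so characters like '@' or ':'
    in a password don't break URL parsing.
    """
    return f"https://{quote(user, safe='')}:{quote(password, safe='')}@{host}:{port}/"

# Matches the example format shown above.
print(build_es_host("admin", "admin123", "127.0.0.0", 9200))
# https://admin:admin123@127.0.0.0:9200/
```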

Some default values:

LOG_ECS_INDEX = 'gdlogging', LOG_ECS_TYPE = '_doc'

If you want to save data under a different index instead of gdlogging, assign LOG_ECS_INDEX = "your index name"

GdLoggingStack: sends bulk log data to Elasticsearch

from greendeck_logging import GdLoggingStack
gdl_stack = GdLoggingStack(LOG_ECS_HOST = LOG_ECS_HOST,
                           service_name = service_name,
                           LOG_ECS_INDEX = "gdlogging",
                           LOG_ECS_TYPE = "_doc",
                           )

**Default values: LOG_ECS_INDEX = 'gdlogging', LOG_ECS_TYPE = '_doc'

**LOG_ECS_HOST and service_name are required fields

Methods and their functionality

The following methods are available for capturing events:

  • To capture an error message along with info about its cause:
    gdl_stack.error("your message", info = {}, value = 1)
    Defaults: value = 1, info = {} # info can be a dictionary

  • To save a debug message and send an object as info:
    gdl_stack.debug("your debug message", info = {}, value = 1)

  • To increment a counter for a service response status such as 200, 404, etc.:
    gdl_stack.counter_message(message, info = {}, value = 1)
    Default: value = 1

  • To increment a counter for a particular website (this can be any unique identifier, depending on your requirements):
    gdl_stack.counter_website(website, message, info = {}, value = 1)
    Default: value = 1

  • Clear the stack:
    gdl_stack.clear()

  • Count the elements in the stack:
    gdl_stack.count()

  • Show the first element in the stack:
    gdl_stack.show_one()

  • Show all elements in the stack:
    gdl_stack.show()

  • Push all elements to Elasticsearch:
    gdl_stack.push()

** Don't forget to push the elements to Elasticsearch at the end of your process; call gdl_stack.push() to update the data in Elasticsearch.
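The collect-then-push pattern described above can be sketched with a plain Python list standing in for the Elasticsearch client. This is an illustrative sketch of the batching idea only; `LogStack` and its internals are hypothetical, not the library's actual implementation:

```python
class LogStack:
    """Minimal sketch of the batch-logging pattern: collect entries
    in memory, then flush them all at once with push()."""

    def __init__(self, sender):
        self._stack = []
        self._sender = sender  # callable that receives the whole batch

    def error(self, message, info=None, value=1):
        # Record an error-type entry in memory; nothing is sent yet.
        self._stack.append({"log_type": "error", "message": message,
                            "info": info or {}, "value": value})

    def count(self):
        return len(self._stack)

    def clear(self):
        self._stack.clear()

    def push(self):
        # Send everything in one bulk call, then empty the stack,
        # mirroring "push at the end of your process".
        self._sender(list(self._stack))
        self._stack.clear()

sent = []
stack = LogStack(sent.extend)
stack.error("db timeout", info={"retries": 3})
stack.push()
print(len(sent))  # 1
```

Batching like this keeps the hot path cheap (an in-memory append) and pays the network cost once, at push time.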

GdLogging: updates one element at a time

from greendeck_logging import GdLogging
gdl = GdLogging(LOG_ECS_HOST = LOG_ECS_HOST,
                service_name = service_name,
                LOG_ECS_INDEX = "gdlogging",
                LOG_ECS_TYPE = "_doc",
                )

**Default values: LOG_ECS_INDEX = 'gdlogging', LOG_ECS_TYPE = '_doc'

Usage and functionality

The following methods are available for capturing events:

  • To capture an error message along with info about its cause:
    gdl.error("your message", info = {}, value = 1)
    Defaults: value = 1, info = {}

  • To capture a debug message:
    gdl.debug("your message", info = {}, value = 1)
    Default: value = 1

  • To increment a counter for a service response status such as 200, 404, etc.:
    gdl.counter_message(message, info = {}, value = 1)
    Default: value = 1

  • To increment a counter for a particular website (this can be any unique identifier, depending on your requirements):
    gdl.counter_website(website, message, info = {}, value = 1)
    Default: value = 1

Example of a document as saved in Elasticsearch:

     "_index" : "gd_logging",
        "_type" : "_doc",
        "_id" : "0TASCGsBFBI3I7B8680B",
        "_score" : 2.3312538,
        "_source" : {
          "service_name" : "class_service",
          "log_type" : "info",
          "created_at" : "2019-05-30T14:59:45.452035",
          "meta" : {
            "file_name" : "service.py",
            "user_name" : "user",
            "system_name" : "user-300-15ISK",
            "os_name" : "Linux"
          },
          "counter_status" : {  
            "message" : "response status",
            "counter_type" : "status",
            "counter_name" : "201",
            "counter_value" : 1
          }
        }
      }
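When reading these documents back from a search response, the interesting fields live under `_source`. A small sketch of pulling the counter fields out of a hit shaped like the example above (`counter_summary` is a hypothetical helper, not part of this package):

```python
# A search hit shaped like the example document above.
hit = {
    "_index": "gd_logging",
    "_source": {
        "service_name": "class_service",
        "log_type": "info",
        "counter_status": {
            "message": "response status",
            "counter_type": "status",
            "counter_name": "201",
            "counter_value": 1,
        },
    },
}

def counter_summary(hit):
    """Extract (service_name, counter_name, counter_value) from a hit.

    Uses .get() for the counter fields since error/debug documents
    may not carry a counter_status block.
    """
    src = hit["_source"]
    counter = src.get("counter_status", {})
    return (src["service_name"],
            counter.get("counter_name"),
            counter.get("counter_value", 0))

print(counter_summary(hit))  # ('class_service', '201', 1)
```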

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

greendeck-logging-0.1.4.tar.gz (5.0 kB view details)

Uploaded Source

Built Distribution

greendeck_logging-0.1.4-py3-none-any.whl (7.5 kB view details)

Uploaded Python 3

File details

Details for the file greendeck-logging-0.1.4.tar.gz.

File metadata

  • Download URL: greendeck-logging-0.1.4.tar.gz
  • Upload date:
  • Size: 5.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.3

File hashes

Hashes for greendeck-logging-0.1.4.tar.gz
  • SHA256: 8e5a15ff4a5aa421484360284c7c274433d67eaa9e648d145455c32bb182e094
  • MD5: c86beabdcc7bfe2f4c6e815d7fe17ea2
  • BLAKE2b-256: 04fb21fdba15cb4a92eb886cd57009f748c9be2e0eb70ce912f40ab370a700d6
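A downloaded archive can be checked against the published SHA256 digest with the standard library's hashlib. A sketch (the filename is the one listed above; `sha256_of_file` is a hypothetical helper):

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    """Stream a file through SHA-256 in chunks, so large downloads
    don't need to fit in memory, and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage against the digest published above:
# expected = "8e5a15ff4a5aa421484360284c7c274433d67eaa9e648d145455c32bb182e094"
# assert sha256_of_file("greendeck-logging-0.1.4.tar.gz") == expected
```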

See more details on using hashes here.


File details

Details for the file greendeck_logging-0.1.4-py3-none-any.whl.

File metadata

  • Download URL: greendeck_logging-0.1.4-py3-none-any.whl
  • Upload date:
  • Size: 7.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.3

File hashes

Hashes for greendeck_logging-0.1.4-py3-none-any.whl
  • SHA256: d21e0e00bd11e1413c050d464fac3f4a803c1c92bf87889f285ba6193bc5420c
  • MD5: 9a24e9b8d47c00ec8030f179a7a20ab3
  • BLAKE2b-256: 88f029f3e56e0e0f876f66ef81cec9e129fdc42ab1a67c3e11e365ffa14e562e

See more details on using hashes here.

