
This library provides custom logging for Python, including error handling and timing.

Project description


ONDEWO Logging

This is the logging package for ONDEWO products. It allows for easy integration with our EFK (Elasticsearch, Fluentd, Kibana) stack, adds some useful features to the base Python logging package (such as timing and exception handling), and handles gRPC error messages nicely.

Usage

To use this library, first pip install it:

pip install ondewo-logging

then import it into your project like so:

from ondewo.logging.logger import logger_console

Note

In order for the logger to log module_name, docker_image_name, and git_repo_name, you have to pass them as environment variables to the container in which the logged service is running.
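In a plain Python process you can sketch this by setting the variables before the logger is imported. The variable names used here (MODULE_NAME, DOCKER_IMAGE_NAME, GIT_REPO_NAME) are assumptions for illustration; check the package source for the exact keys your version reads:

```python
import os

# Hypothetical variable names -- verify against the package's own
# documentation before relying on them.
os.environ["MODULE_NAME"] = "my-service"
os.environ["DOCKER_IMAGE_NAME"] = "registry.example.com/my-service:1.0"
os.environ["GIT_REPO_NAME"] = "my-service-repo"

# Import the logger only after the environment is set, so the values are
# available when the logging configuration is loaded:
# from ondewo.logging.logger import logger_console
```

In a container you would pass the same variables with `-e` flags or an `environment:` section instead of setting them in code.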

Decorators

A couple of decorators are included:

from ondewo.logging.decorators import Timer, timing, exception_handling, exception_silencing

The Timer class can be used as a context manager:

with Timer() as t:
  sleep(1)

or as a decorator:

@Timer()
def sleeptime():
  sleep(1)

and can be used with different messages or logging levels:

  • Logging level: @Timer(logger=logger_console.info)
  • Message: @Timer(message="MESSAGE WITH TIME {} {}"), @Timer(message="SIMPLER MESSAGE WITHOUT TIME")
  • Disable argument logging: @Timer(log_arguments=False)
  • Enable exception suppression: @Timer(supress_exceptions=True)

See the tests for detailed examples of how these work.
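To make the dual context-manager/decorator behaviour concrete, here is a minimal self-contained sketch in plain Python. This is illustrative only and is not the ondewo Timer implementation; the message and logger parameters mirror the options listed above:

```python
import functools
import time


class MiniTimer:
    """Illustrative sketch only -- not the ondewo Timer implementation."""

    def __init__(self, message: str = "Elapsed: {:.3f}s", logger=print):
        self.message = message
        self.logger = logger

    def __enter__(self):
        self.start = time.perf_counter()
        return self

    def __exit__(self, exc_type, exc, tb):
        self.elapsed = time.perf_counter() - self.start
        self.logger(self.message.format(self.elapsed))
        return False  # do not suppress exceptions by default

    def __call__(self, func):
        # Being callable lets the same object double as a decorator.
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # A fresh timer per call keeps the decorator re-entrant.
            with self.__class__(self.message, self.logger):
                return func(*args, **kwargs)
        return wrapper


@MiniTimer(message="sleeptime took {:.3f}s")
def sleeptime():
    time.sleep(0.01)
```

The same pattern (an `__enter__`/`__exit__` pair plus a `__call__` that wraps the function in a `with` block) is what allows one class to serve both roles.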

timing is just an instance of the Timer class, kept for backwards compatibility:

timing = Timer()

The exception_handling function is a decorator which will log errors nicely using the ondewo logging syntax (below). It also logs the inputs and outputs of the function. The exception_silencing function just shows the inputs and outputs and suppresses the stacktrace; it can be useful for debugging. Finally, log_arguments dumps the inputs and outputs of a function into the logs.
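The general shape of such a decorator can be sketched with the standard logging module. This is a hypothetical stand-in, not the ondewo implementation: it logs the call's inputs, re-logs any exception together with the arguments, re-raises, and logs the return value on success:

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger(__name__)


def log_exceptions(func):
    """Sketch of an exception-handling decorator (not the ondewo one)."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        log.info({"message": f"Calling {func.__name__}",
                  "args": args, "kwargs": kwargs, "tags": ["call"]})
        try:
            result = func(*args, **kwargs)
        except Exception:
            # log.exception attaches the stacktrace automatically.
            log.exception({"message": f"{func.__name__} raised",
                           "args": args, "tags": ["exception"]})
            raise
        log.info({"message": f"{func.__name__} returned",
                  "result": result, "tags": ["return"]})
        return result
    return wrapper


@log_exceptions
def divide(a, b):
    return a / b
```

A silencing variant would simply drop the `log.exception` call and swallow the error instead of re-raising.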

Ondewo log format

The structure of the logs looks like this:

message: Dict[str, Any] = {
  "message": f"Here is the normal log, including relevant information such as the magic number: {magic_number}. These values are also added separately below, either just with the variable name or some other relevant name. Finally, there are some tags to help with searching through the logs.",
  "magic_number": magic_number,
  "tags": ["magic", "number"]
}
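A runnable version of the structure above, with a concrete magic_number; the logger call uses the standard logging module here purely for illustration:

```python
import logging
from typing import Any, Dict

logging.basicConfig(level=logging.INFO)

magic_number: int = 42

# The message is a plain dict: a human-readable "message" string, the raw
# values under their own keys, and a list of tags for searching.
message: Dict[str, Any] = {
    "message": f"Computation finished; the magic number was {magic_number}.",
    "magic_number": magic_number,
    "tags": ["magic", "number"],
}

logging.getLogger(__name__).info(message)
```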

Note on tags:

The tags allow for easy searching and grouping in Kibana. They can be added in a somewhat ad-hoc manner by the programmer on the ground, though some (like 'timing') are standardised. Please talk to your project team lead for details.

Fluentd

Quickstart

  1. Clone the repository: git clone https://github.com/ondewo/ondewo-logging-python
  2. Run make
  3. Edit the fluentd config with the URL and password of your elasticsearch host:
     sed -i 's/<PASSWORD>/my_password/' './fluentd/conf/fluent.conf'
     sed -i 's/<HOST>/my_elasticsearch_host/' './fluentd/conf/fluent.conf'
  4. Run fluentd: docker-compose -f fluentd/docker-compose.yaml up -d

You now have a fluentd message handler running on your machine. If you use the ondewo.logging library, your logs will be shipped to your Elasticsearch server.
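To sanity-check that the forward input is actually listening, a quick sketch using only the standard library (24224 is the port the forward source in the config below listens on; the host is an assumption for a local setup):

```python
import socket


def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Expect True once the fluentd container is up.
print(port_open("127.0.0.1", 24224))
```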

Fluentd Config

Per fluentd/docker-compose.yaml, we map the configuration files and the logs into the fluentd image and open some ports. We also need to run chown -R 100:"$GID" fluentd/log, which allows both you and fluentd to read the logs.

Beyond that, it is just a question of formatting the logs wherever they come from. Here is the excerpt from the fluentd config that sends everything to fluentd's stdout, so you can see the logs from all your images in the same place.

<source>
  @type forward
  port 24224
</source>

# py.console logging gets piped to stdout
<match py.console.**>
  @type stdout
  <format>
      @type ltsv
      delimiter_pattern :
      label_delimiter =
  </format>
</match>

In this conf, we receive input over a TCP connection, then dump the output to stdout, so you can use that stream to watch log output via fluentd. The config is also set up to save all the logs locally and ship them to a remote server.

Automatic Release Process

The entire process is automated to make development easier. The actual steps are simple:

TODOs in Pull Request before the release:

  • Update the Version number inside the Makefile

    • ! : Major and Minor Version Number must be the same for Client and API at all times

      example: API 2.9.0 --> Client 2.9.X

  • Check if RELEASE.md is up-to-date

  • Update the Version number inside the setup.py by using:

    make update_setup
    

TODOs after Pull Request was merged in:

  • Checkout master:
    git checkout master
    
  • Pull the new stuff:
    git pull
    
  • Release:
    make ondewo_release
    

The make ondewo_release command can be divided into 5 steps:

  • cloning the devops-accounts repository and extracting the credentials
  • creating and pushing the release branch
  • creating and pushing the release tag
  • creating the GitHub release
  • creating and pushing the new PyPI release

The variables for the GitHub Access Token, PyPI Username and Password are all inside the Makefile, but the values are overwritten during make ondewo_release because they are passed from the devops-accounts repo as arguments to the actual release command.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ondewo-logging-3.2.1.tar.gz (12.4 kB)

Uploaded Source

Built Distributions

ondewo_logging-3.2.1-py3-none-any.whl (17.5 kB)

Uploaded Python 3

ondewo_logging-3.2.1-py2.py3-none-any.whl (12.6 kB)

Uploaded Python 2 Python 3

File details

Details for the file ondewo-logging-3.2.1.tar.gz.

File metadata

  • Download URL: ondewo-logging-3.2.1.tar.gz
  • Upload date:
  • Size: 12.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.13

File hashes

Hashes for ondewo-logging-3.2.1.tar.gz:

  • SHA256: 3f256d73bf2c5093c1e9fd7b417b3d8669e99135f164cb30e61b781585e8aed4
  • MD5: 05ce0f7a884e30639b7801fb8c6b663d
  • BLAKE2b-256: 414746342baa33b8cc821fe489447855ac281bd3373a1f7d30c5a79e8854e3fb

See more details on using hashes here.

File details

Details for the file ondewo_logging-3.2.1-py3-none-any.whl.

File metadata

File hashes

Hashes for ondewo_logging-3.2.1-py3-none-any.whl:

  • SHA256: 6efecbd8c49ef4e093f6be5b2dcf49bfaf232ac006d02b981e17eb61ab9bf795
  • MD5: 0d7ae43e4e75d2d8336beac41c5a0941
  • BLAKE2b-256: 5dc4ff5cfce5a9b04cfa0c19a5f0cb9c5e2dd883aeafd6a89f60cc23e4354462


File details

Details for the file ondewo_logging-3.2.1-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for ondewo_logging-3.2.1-py2.py3-none-any.whl:

  • SHA256: a1bc07d265e49b26f45d9299eeb8d800ea5c13ac6abc3a98b8766dd3de7370c6
  • MD5: 1f6f3933b2a776b79cc166629b47d7ce
  • BLAKE2b-256: 524c59154c0afd53a6c15cab1be3453da7f59a4fff66e3e250b546e8199554e7

