Log datamessages to InfluxDB

Project description

Log PubSubDataMessages (see datastreamcorelib) to InfluxDB. This is a pretty quick-and-dirty implementation and definitely not optimal. Only InfluxDB < 2.0 (i.e. 1.8) is supported, because aioinflux does not support 2.0.

For optimal write performance we should add a plugin system that allows one to convert the payload of a PubSubDataMessage into a line-protocol decorated custom class, see https://aioinflux.readthedocs.io/en/stable/usage.html#writing-user-defined-class-objects
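As a rough sketch of what such a conversion boils down to, the snippet below hand-rolls the InfluxDB line protocol for a single datapoint (the measurement, tag, and field names are illustrative, and real line protocol additionally requires escaping of spaces/commas and quoting of string field values, which is omitted here):

```python
def to_line_protocol(measurement: str, tags: dict, fields: dict, timestamp_ns: int) -> str:
    """Serialize one datapoint into InfluxDB line protocol:
    measurement,tag=value field=value timestamp

    Simplified: no escaping of special characters, numeric fields only.
    """
    tag_part = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_part = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_part} {field_part} {timestamp_ns}"

line = to_line_protocol("pubsub", {"topic": "sensors/temp"}, {"value": 21.5}, 1609459200000000000)
# 'pubsub,topic=sensors/temp value=21.5 1609459200000000000'
```

A decorated custom class as described in the aioinflux docs lets the library do this serialization for you, which is where the performance win comes from.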

The quick-and-dirty optimization is to batch writes into pandas dataframes, which has the not-insignificant drawback of adding pandas/numpy to our requirements.
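The batching idea itself can be sketched in plain Python (the `WriteBatcher` class and its names are hypothetical; the real implementation would flush the accumulated rows to aioinflux, e.g. converted into a pandas DataFrame):

```python
from typing import Any, Callable, Dict, List


class WriteBatcher:
    """Accumulate datapoints and flush them in batches instead of one write per message."""

    def __init__(self, flush_cb: Callable[[List[Dict[str, Any]]], None], batch_size: int = 100):
        self._flush_cb = flush_cb
        self._batch_size = batch_size
        self._rows: List[Dict[str, Any]] = []

    def add(self, row: Dict[str, Any]) -> None:
        """Queue one datapoint; flush automatically when the batch is full."""
        self._rows.append(row)
        if len(self._rows) >= self._batch_size:
            self.flush()

    def flush(self) -> None:
        """Hand the accumulated rows to the flush callback and start a new batch."""
        if self._rows:
            self._flush_cb(self._rows)
            self._rows = []


batches: List[List[Dict[str, Any]]] = []
batcher = WriteBatcher(batches.append, batch_size=2)
for i in range(5):
    batcher.add({"value": i})
batcher.flush()  # flush the leftover partial batch
# batches == [[{'value': 0}, {'value': 1}], [{'value': 2}, {'value': 3}], [{'value': 4}]]
```

A production version would also flush on a timer so that a slow topic does not sit in the buffer indefinitely.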

Docker

For more controlled deployments and to get rid of the “works on my computer” syndrome, we always make sure our software works under Docker.

It’s also a quick way to get started with a standard development environment.

SSH agent forwarding

We need buildkit:

export DOCKER_BUILDKIT=1

The exact way to forward the agent to a running instance also differs between macOS:

export DOCKER_SSHAGENT="-v /run/host-services/ssh-auth.sock:/run/host-services/ssh-auth.sock -e SSH_AUTH_SOCK=/run/host-services/ssh-auth.sock"

and Linux:

export DOCKER_SSHAGENT="-v $SSH_AUTH_SOCK:$SSH_AUTH_SOCK -e SSH_AUTH_SOCK"
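The two cases can be combined into one snippet that picks the right value based on the host OS (a convenience sketch, not part of the project's tooling):

```shell
#!/bin/sh
# Select SSH agent forwarding flags for docker based on the host OS
if [ "$(uname -s)" = "Darwin" ]; then
    # Docker Desktop on macOS exposes the agent via a fixed magic socket path
    export DOCKER_SSHAGENT="-v /run/host-services/ssh-auth.sock:/run/host-services/ssh-auth.sock -e SSH_AUTH_SOCK=/run/host-services/ssh-auth.sock"
else
    # On Linux the host agent socket can be bind-mounted directly
    export DOCKER_SSHAGENT="-v $SSH_AUTH_SOCK:$SSH_AUTH_SOCK -e SSH_AUTH_SOCK"
fi
```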

Creating a development container

Build image, create container and start it:

docker build --ssh default --target devel_shell -t dsinfluxlogger:devel_shell .
docker create --name dsinfluxlogger_devel -p 58770:58770 -v `pwd`":/app" -it -v /tmp:/tmp `echo $DOCKER_SSHAGENT` dsinfluxlogger:devel_shell
docker start -i dsinfluxlogger_devel

pre-commit considerations

If working in Docker instead of a native environment, you need to run the pre-commit checks in Docker too:

docker exec -i dsinfluxlogger_devel /bin/bash -c "pre-commit install"
docker exec -i dsinfluxlogger_devel /bin/bash -c "pre-commit run --all-files"

You need to have the container running, see above. Alternatively you can use the docker run syntax, but using the already-running container is faster:

docker run -it --rm -v `pwd`":/app" dsinfluxlogger:devel_shell -c "pre-commit run --all-files"

Test suite

You can use the devel shell to run py.test while developing; for CI, use the “tox” target in the Dockerfile:

docker build --ssh default --target tox -t dsinfluxlogger:tox .
docker run -it --rm -v `pwd`":/app" `echo $DOCKER_SSHAGENT` dsinfluxlogger:tox

Production docker

There’s a “production” target as well for running the application (replace “myconfig.toml” with your config file):

docker build --ssh default --target production -t dsinfluxlogger:latest .
docker run -it --name dsinfluxlogger -v `pwd`"/myconfig.toml":/app/config.toml -p 58770:58770 -v /tmp:/tmp `echo $DOCKER_SSHAGENT` dsinfluxlogger:latest

Local Development

TLDR:

  • Create and activate a Python 3.8 virtualenv (assuming virtualenvwrapper):

    mkvirtualenv -p `which python3.8` my_virtualenv
  • change to a branch:

    git checkout -b my_branch
  • install Poetry: https://python-poetry.org/docs/#installation

  • Install project deps and pre-commit hooks:

    poetry install
    pre-commit install
    pre-commit run --all-files
  • Ready to go, try the following:

    dsinfluxlogger --defaultconfig >config.toml
    dsinfluxlogger -vv config.toml

Remember to activate your virtualenv whenever you work on the repo; this is needed because the pylint and mypy pre-commit hooks use the “system” python for now (because reasons).

Running “pre-commit run --all-files” and “py.test -v” regularly during development, and especially before committing, will save you some headache.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dsinfluxlogger-0.5.0.tar.gz (6.9 kB)

Uploaded Source

Built Distribution

dsinfluxlogger-0.5.0-py3-none-any.whl (7.4 kB)

Uploaded Python 3

File details

Details for the file dsinfluxlogger-0.5.0.tar.gz.

File metadata

  • Download URL: dsinfluxlogger-0.5.0.tar.gz
  • Upload date:
  • Size: 6.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.5.0 importlib_metadata/4.8.2 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.12

File hashes

Hashes for dsinfluxlogger-0.5.0.tar.gz:

  • SHA256: 85e33aa507db73aa11cc15434f2b78633e04b80f801cec18119b5b9f630474aa
  • MD5: 44432dc3e78fc80c227b035aa46c9aeb
  • BLAKE2b-256: 00cc292fc495780da5b8dc1a83775cecdf80a332e11029435bb7ec5ca9227335


File details

Details for the file dsinfluxlogger-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: dsinfluxlogger-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 7.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.5.0 importlib_metadata/4.8.2 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.12

File hashes

Hashes for dsinfluxlogger-0.5.0-py3-none-any.whl:

  • SHA256: 5347fbff81916c79dd2bedb5c98bc458d9bbdacb845784c7c8efd98d52a17be4
  • MD5: 4e6fd44132ec11fa71f9b174367c055c
  • BLAKE2b-256: ecebb169f8f682489624256d4d9f5a4f0fa7b748f90a16775dc346d0a2fe9fd5

