
universal-tsdb

A Universal Time-Series Database Python Client (InfluxDB, Warp10, ...)

Introduction

This project aims to abstract your Time-Series backend, keeping your code as agnostic as possible.

Some examples:

  • proof of concept
  • early stages of development (when you are not yet sure which platform to use)
  • ETL (Extract-Transform-Load), for the load step

:warning: The current code only offers ingestion functions (writing points to a backend).

Quickstart

Installation

$ pip install universal-tsdb
>>> from universal_tsdb import Client, Ingester
>>> backend = Client('influx', 'http://localhost:8086', database='test')
>>> series = Ingester(backend)
>>> series.append(1585934895000, measurement='data', field1=42.0)
>>> series.payload()
'data field1=42.0 1585934895000000000\n'
>>> series.commit()

InfluxDB

from universal_tsdb import Client, Ingester

backend = Client('influx', 'http://localhost:8086', database='metrics',
                 backend_username='user', backend_password='passwd')
series = Ingester(backend)
series.append(1585934895000, measurement='mes', field1=42.0)
series.append(1585934896000, measurement='mes', tags={'tag1':'value1'}, field1=43.4, field2='value')
series.commit()

The code above will generate a data payload based on the InfluxDB line protocol and send it via an HTTP(S) request.

POST /write?db=metrics&u=user&p=passwd HTTP/1.1
Host: localhost:8086

mes field1=42.0 1585934895000000000
mes,tag1=value1 field1=43.4,field2="value" 1585934896000000000
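For illustration, the encoding above can be sketched as follows. This is a simplified stand-alone sketch, not the library's actual implementation: `to_line` is a hypothetical helper, and it ignores the escaping of spaces and commas that the real line protocol requires.

```python
import time

def to_line(measurement, fields, tags=None, timestamp_ms=None):
    """Simplified sketch of InfluxDB line-protocol encoding (no escaping)."""
    if timestamp_ms is None:
        timestamp_ms = int(time.time() * 1000)
    tag_part = ''.join(f',{k}={v}' for k, v in (tags or {}).items())
    # String field values are double-quoted; numeric values are written as-is.
    field_part = ','.join(
        f'{k}="{v}"' if isinstance(v, str) else f'{k}={v}'
        for k, v in fields.items()
    )
    # InfluxDB expects nanosecond timestamps by default.
    return f'{measurement}{tag_part} {field_part} {timestamp_ms * 1_000_000}'

print(to_line('mes', {'field1': 43.4, 'field2': 'value'},
              tags={'tag1': 'value1'}, timestamp_ms=1585934896000))
```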

Warp10

from universal_tsdb import Client, Ingester

backend = Client('warp10', 'http://localhost/api/v0', token='WRITING_TOKEN_ABCDEF0123456789')
series = Ingester(backend)
series.append(1585934895000, field1=42.0)
series.append(1585934896000, tags={'tag1':'value1'}, field1=43.4, field2='value')
series.commit()

The code above will generate a data payload based on the Warp10 GTS input format and send it via an HTTP(S) request.

POST /api/v0/update HTTP/1.1
Host: localhost
X-Warp10-Token: WRITING_TOKEN_ABCDEF0123456789

1585934895000000// field1{} 42.0
1585934896000000// field1{tag1=value1} 43.4
1585934896000000// field2{tag1=value1} 'value'
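As a rough illustration, the GTS encoding can be sketched like this. It is a simplified stand-alone sketch, not the library's code: `to_gts` is a hypothetical helper, and it skips the URL-encoding of classnames and labels that Warp10 requires. Note that each field becomes its own GTS line.

```python
def to_gts(timestamp_ms, fields, tags=None):
    """Simplified sketch of Warp10 GTS input encoding (no URL-encoding)."""
    labels = ','.join(f'{k}={v}' for k, v in (tags or {}).items())
    lines = []
    for name, value in fields.items():
        # String values are single-quoted; Warp10 timestamps default to microseconds.
        rendered = f"'{value}'" if isinstance(value, str) else str(value)
        lines.append(f'{timestamp_ms * 1000}// {name}{{{labels}}} {rendered}')
    return lines

for line in to_gts(1585934896000, {'field1': 43.4, 'field2': 'value'},
                   tags={'tag1': 'value1'}):
    print(line)
```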

Advanced Usage

Batch processing

When you have a large volume of data to send, you may want to split it across several HTTP requests. In batch mode, the library commits (sends) the data automatically:

backend = Client('influx', 'http://localhost:8086', database='metrics')
series = Ingester(backend, batch=10)
for i in range(26):
    series.append(field=i)
series.commit() # final commit to send the last 6 values

Commit#1 Sent 10 new series (total: 10) in 0.02 s @ 2000.0 series/s (total execution: 0.13 s)
Commit#2 Sent 10 new series (total: 20) in 0.02 s @ 2000.0 series/s (total execution: 0.15 s)
Commit#3 Sent 6 new series (total: 26) in 0.01 s @ 2000.0 series/s (total execution: 0.17 s)
REPORT: 3 commits (3 successes), 26 series, 26 values in 0.17 s @ 2000.0 values/s
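The auto-commit behaviour can be pictured with a minimal counter sketch (a hypothetical `BatchBuffer`, not the library's internals): a commit is triggered every `batch` appends, plus one final explicit commit for the remainder.

```python
class BatchBuffer:
    """Hypothetical sketch of the batch auto-commit pattern."""

    def __init__(self, batch):
        self.batch = batch      # commit automatically every `batch` appends
        self.pending = 0
        self.commits = 0

    def append(self):
        self.pending += 1
        if self.pending >= self.batch:
            self.commit()

    def commit(self):
        if self.pending:        # skip empty commits
            self.commits += 1
            self.pending = 0

buf = BatchBuffer(10)
for _ in range(26):
    buf.append()                # auto-commits after the 10th and 20th append
buf.commit()                    # final commit for the last 6 values
print(buf.commits)              # 3
```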

Omitting Timestamp

If you omit the timestamp, the library uses time.time() to generate a UTC epoch time. Precision is system-dependent.
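Since the append() examples above pass timestamps in milliseconds, the default is presumably equivalent to something like the following sketch (`default_timestamp_ms` is a hypothetical name; the exact conversion inside the library may differ):

```python
import time

def default_timestamp_ms():
    # Milliseconds since the UTC epoch; resolution depends on the platform clock.
    return int(time.time() * 1000)

print(default_timestamp_ms())
```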

Measurement in Warp10

The InfluxDB notion of a measurement does not exist in Warp10. The library emulates it by prefixing the Warp10 classname with the measurement name:

backend = Client('warp10', 'http://localhost/api/v0', token='WRITING_TOKEN_ABCDEF0123456789')
series = Ingester(backend)
series.append(1585934895000, measurement='mes', field1=42.0)
series.commit()

1585934895000000// mes.field1{} 42.0
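The prefixing rule can be expressed as a one-liner (a hypothetical helper, not the library's API):

```python
def warp10_classname(field, measurement=None):
    # Emulate an InfluxDB measurement by prefixing the Warp10 classname.
    return f'{measurement}.{field}' if measurement else field

print(warp10_classname('field1', measurement='mes'))  # mes.field1
```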

Todo

  • API documentation
  • Examples
  • Data query/fetch functions
  • Refactoring of backend specific code (inherited classes?)
  • Time-Series Line protocol optimization
  • Gzip/deflate HTTP compression
  • Code coverage / additional tests
