
Redis Timeseries

A time series API built on top of Redis for storing and querying time series statistics. Multiple granularities can be configured to track data over different time intervals.


Install

To install Redis Timeseries, run this command in your terminal:

$ pip install redis_timeseries

Usage

To initialize the TimeSeries class, you must pass a Redis client to access the database. You may also override the base key for the time series.

>>> import redis
>>> from redis_timeseries import TimeSeries
>>> client = redis.StrictRedis()
>>> ts = TimeSeries(client, base_key='my_timeseries')

To customize the granularities, make sure each granularity has a ttl and duration in seconds. You can use the helper functions for easier definitions.

>>> my_granularities = {
...     '1minute': {'ttl': hours(1), 'duration': minutes(1)},
...     '1hour': {'ttl': days(7), 'duration': hours(1)}
... }
>>> ts = TimeSeries(client, granularities=my_granularities)
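The helper functions aren't defined in the docs; assuming they simply convert a count of time units into seconds (suitable for the ttl and duration values above), a plain-Python equivalent would be:

```python
# Hypothetical equivalents of the redis_timeseries helpers: each converts
# a number of time units into seconds.
def seconds(n):
    return n

def minutes(n):
    return n * 60

def hours(n):
    return n * 60 * 60

def days(n):
    return n * 24 * 60 * 60

# The granularity mapping from above, with the values they expand to:
my_granularities = {
    '1minute': {'ttl': hours(1), 'duration': minutes(1)},  # keep 1 hour of 1-minute buckets
    '1hour':   {'ttl': days(7),  'duration': hours(1)},    # keep 7 days of 1-hour buckets
}
```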

.record_hit() accepts a key and, optionally, a timestamp and an increment count. It records the hit in every configured granularity.

>>> ts.record_hit('event:123')
>>> ts.record_hit('event:123', datetime(2017, 1, 1, 13, 5))
>>> ts.record_hit('event:123', count=5)

By default (execute=True), .record_hit() executes immediately. If you pass execute=False, the commands are queued on a single Redis pipeline, which you must then flush with .execute().

>>> ts.record_hit('event:123', execute=False)
>>> ts.record_hit('enter:123', execute=False)
>>> ts.record_hit('exit:123', execute=False)
>>> ts.execute()

.get_hits() queries the database for the most recent buckets in the selected granularity. For example, to query the last 3 minutes, ask the 1minute granularity for a count of 3. The result is a list of (bucket, count) tuples, where bucket is the timestamp rounded down to the start of its interval.

>>> ts.get_hits('event:123', '1minute', 3)
[(datetime(2017, 1, 1, 13, 5), 1), (datetime(2017, 1, 1, 13, 6), 0), (datetime(2017, 1, 1, 13, 7), 3)]
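The bucket timestamps come from rounding each hit's time down to the start of its granularity interval. A sketch of that rounding with the standard library (not necessarily the library's exact _round_time() internals):

```python
from datetime import datetime

def round_time(dt, duration):
    """Round dt down to the nearest bucket boundary of `duration` seconds
    (a sketch, not the library's actual implementation)."""
    ts = int(dt.timestamp())
    return datetime.fromtimestamp(ts - ts % duration)

# A hit at 13:05:42 falls into the 13:05 one-minute bucket.
bucket = round_time(datetime(2017, 1, 1, 13, 5, 42), 60)
```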

.get_total_hits() queries the database and returns only the sum of all the buckets in the queried range.

>>> ts.get_total_hits('event:123', '1minute', 3)
4
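The total is just the sum of the per-bucket counts, so it matches summing a .get_hits() result yourself:

```python
from datetime import datetime

# The buckets returned above by ts.get_hits('event:123', '1minute', 3):
hits = [
    (datetime(2017, 1, 1, 13, 5), 1),
    (datetime(2017, 1, 1, 13, 6), 0),
    (datetime(2017, 1, 1, 13, 7), 3),
]

# Summing the counts gives the same value .get_total_hits() returns.
total = sum(count for _bucket, count in hits)
```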

.scan_keys() returns a list of keys that may have data in the selected range. You can pass a search string to limit the keys returned; the search string must contain a * wildcard.

>>> ts.scan_keys('1minute', 10, 'event:*')
['event:123', 'event:456']
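The search string is a Redis SCAN-style glob pattern, which behaves much like a shell wildcard. The standard library's fnmatch shows which keys a pattern such as 'event:*' would admit:

```python
from fnmatch import fnmatchcase

# Keys recorded earlier in the usage examples:
keys = ['event:123', 'event:456', 'enter:123', 'exit:123']

# Glob-style matching comparable to what the '1minute' scan applies:
matched = [k for k in keys if fnmatchcase(k, 'event:*')]
```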

Features

  • Multiple granularity tracking

  • Redis pipeline chaining

  • Key scanner

  • Easy to integrate with charting packages

  • Can choose either integer or float counting

  • Date bucketing with timezone support
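Date bucketing with timezone support means a day bucket starts at local midnight rather than UTC midnight. A sketch of that idea with the standard library's zoneinfo (the library's own timezone handling may differ):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def day_bucket(dt, tz):
    """Round an aware datetime down to midnight in the given timezone."""
    local = dt.astimezone(ZoneInfo(tz))
    return local.replace(hour=0, minute=0, second=0, microsecond=0)

# 03:30 UTC on Jan 1 is still the evening of Dec 31 in New York (UTC-5),
# so the hit lands in the Dec 31 day bucket there.
hit = datetime(2017, 1, 1, 3, 30, tzinfo=ZoneInfo('UTC'))
bucket = day_bucket(hit, 'America/New_York')
```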

Credits

Algorithm copied from tonyskn/node-redis-timeseries

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.

History

0.1.8 (2017-07-25)

  • Fix bug in _round_time() method

0.1.7 (2017-07-25)

  • Fix bug in _round_time() method

0.1.6 (2017-07-25)

  • Add timezone so day buckets will start at midnight in the correct timezone

0.1.5 (2017-07-18)

  • Update default granularities

0.1.4 (2017-07-12)

  • Add float value capabilities

  • Add increase() and decrease() methods

  • Move get_hits() -> get_buckets() and get_total_hits() -> get_total()

0.1.3 (2017-03-30)

  • Remove six package

  • Clean up source file

0.1.2 (2017-03-30)

  • Make Python 3 compatible

  • Fix tox to make PyPy work

0.1.1 (2017-03-30)

  • Minor project file updates

0.1.0 (2017-03-30)

  • First release on PyPI.

Download files

Source distribution: redis_timeseries-0.1.9.tar.gz (15.1 kB)

Built distribution: redis_timeseries-0.1.9-py2.py3-none-any.whl (5.7 kB, Python 2/Python 3)
