
Timeseries API built on top of Redis


Redis Timeseries

A time series API built on top of Redis that stores and queries time series statistics. Multiple granularities can be configured to track different time intervals.


To install Redis Timeseries, run this command in your terminal:

$ pip install redis_timeseries


To initialize the TimeSeries class, pass in a Redis client to access the database. You may also override the base key for the time series.

>>> import redis
>>> from timeseries import TimeSeries
>>> client = redis.StrictRedis()
>>> ts = TimeSeries(client, base_key='my_timeseries')

To customize the granularities, define each granularity with a ttl and a duration, both in seconds. The package's helper functions make these definitions easier to read.

>>> my_granularities = {
...     '1minute': {'ttl': hours(1), 'duration': minutes(1)},
...     '1hour': {'ttl': days(7), 'duration': hours(1)}
... }
>>> ts = TimeSeries(client, granularities=my_granularities)
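The helper functions used above simply convert a unit count into seconds. A minimal sketch of equivalent helpers (the implementations here are assumed, matching only the names in the example):

```python
# Assumed implementations of the duration helpers -- each converts
# a unit count into a number of seconds.
def minutes(n):
    return n * 60

def hours(n):
    return n * 3600

def days(n):
    return n * 86400

# The granularity mapping from the example, now in plain seconds:
my_granularities = {
    '1minute': {'ttl': hours(1), 'duration': minutes(1)},  # keep 1 hour of per-minute buckets
    '1hour':   {'ttl': days(7),  'duration': hours(1)},    # keep 7 days of per-hour buckets
}
print(my_granularities['1minute'])  # {'ttl': 3600, 'duration': 60}
```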

.record_hit() accepts a key, plus an optional timestamp and increment count. It records the hit in all defined granularities.

>>> ts.record_hit('event:123')
>>> ts.record_hit('event:123', datetime(2017, 1, 1, 13, 5))
>>> ts.record_hit('event:123', count=5)
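Conceptually, each hit lands in a bucket whose timestamp is rounded down to the granularity's duration. A rough illustration of that rounding (illustrative only, not the library's actual code; the bucket_timestamp helper is hypothetical):

```python
from datetime import datetime, timezone

def bucket_timestamp(ts: float, duration: int) -> float:
    """Round a Unix timestamp down to the start of its bucket."""
    return ts - (ts % duration)

# A hit at 13:05:42 falls into the 1-minute bucket starting at 13:05:00.
hit = datetime(2017, 1, 1, 13, 5, 42, tzinfo=timezone.utc).timestamp()
bucket = bucket_timestamp(hit, 60)
print(datetime.fromtimestamp(bucket, tz=timezone.utc))  # 2017-01-01 13:05:00+00:00
```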

.record_hit() executes immediately by default (execute=True). If you pass execute=False, you can chain several commands into a single Redis pipeline, then send them all in one round trip with .execute().

>>> ts.record_hit('event:123', execute=False)
>>> ts.record_hit('enter:123', execute=False)
>>> ts.record_hit('exit:123', execute=False)
>>> ts.execute()
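The execute=False pattern maps onto Redis pipelining: commands are buffered client-side and applied in one batch. A toy, in-memory illustration of that deferred-execution flow (ToyPipeline is hypothetical and stands in for the real Redis-backed class):

```python
class ToyPipeline:
    """Buffers hits and applies them in one batch, mimicking the
    execute=False / .execute() flow described above."""

    def __init__(self):
        self.store = {}   # stands in for Redis
        self.buffer = []  # queued increments

    def record_hit(self, key, count=1, execute=True):
        self.buffer.append((key, count))
        if execute:
            self.execute()

    def execute(self):
        # Apply all buffered increments at once, then clear the buffer.
        for key, count in self.buffer:
            self.store[key] = self.store.get(key, 0) + count
        self.buffer.clear()

ts = ToyPipeline()
ts.record_hit('event:123', execute=False)
ts.record_hit('enter:123', execute=False)
ts.record_hit('exit:123', execute=False)
ts.execute()
print(ts.store)  # {'event:123': 1, 'enter:123': 1, 'exit:123': 1}
```

With a real Redis pipeline the benefit is fewer network round trips, not different results.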

.get_hits() will query the database for the latest data in the selected granularity. For example, to query the last 3 minutes, query the 1minute granularity with a count of 3. The result is a list of (bucket, count) tuples, where bucket is the rounded-down timestamp of each interval.

>>> ts.get_hits('event:123', '1minute', 3)
[(datetime(2017, 1, 1, 13, 5), 1), (datetime(2017, 1, 1, 13, 6), 0), (datetime(2017, 1, 1, 13, 7), 3)]
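Note that buckets with no hits still appear, with a count of 0 (13:06 above). A sketch of how sparse per-bucket counts can be zero-filled into a contiguous series (illustrative only; fill_buckets is hypothetical):

```python
from datetime import datetime, timedelta

def fill_buckets(counts, start, n, step):
    """Return n (bucket, count) tuples from start, defaulting missing buckets to 0."""
    return [(start + i * step, counts.get(start + i * step, 0)) for i in range(n)]

# Sparse counts: only 13:05 and 13:07 recorded any hits.
counts = {datetime(2017, 1, 1, 13, 5): 1, datetime(2017, 1, 1, 13, 7): 3}
series = fill_buckets(counts, datetime(2017, 1, 1, 13, 5), 3, timedelta(minutes=1))
print(series[1])  # (datetime.datetime(2017, 1, 1, 13, 6), 0)
```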

.get_total_hits() runs the same query but returns only the sum of all the bucket counts.

>>> ts.get_total_hits('event:123', '1minute', 3)
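In other words, the total is just the sum of the per-bucket counts from the same range. Using the .get_hits() result shown earlier:

```python
from datetime import datetime

# (bucket, count) tuples, as returned by .get_hits() in the example above
buckets = [
    (datetime(2017, 1, 1, 13, 5), 1),
    (datetime(2017, 1, 1, 13, 6), 0),
    (datetime(2017, 1, 1, 13, 7), 3),
]
total = sum(count for _, count in buckets)
print(total)  # 4
```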

.scan_keys() will return a list of keys that could exist in the selected range. You can pass a search string to limit the keys returned; the search string must include a * wildcard.

>>> ts.scan_keys('1minute', 10, 'event:*')
['event:123', 'event:456']
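Redis key scanning uses glob-style patterns, and Python's stdlib fnmatch applies the same * semantics, so you can preview locally which keys a pattern would match (a sketch, not the library's code):

```python
from fnmatch import fnmatchcase

keys = ['event:123', 'event:456', 'enter:123']
# Glob-style match, like Redis SCAN's MATCH option: * matches any suffix.
matched = [k for k in keys if fnmatchcase(k, 'event:*')]
print(matched)  # ['event:123', 'event:456']
```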


Features
  • Multiple granularity tracking
  • Redis pipeline chaining
  • Key scanner
  • Easy to integrate with charting packages
  • Can choose either integer or float counting
  • Date bucketing with timezone support
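As an illustration of the timezone support mentioned above: a timezone-aware day bucket starts at local midnight, which is a different instant from UTC midnight. A small stdlib sketch of the idea (using zoneinfo; not the library's own code):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

tz = ZoneInfo('America/New_York')
hit = datetime(2017, 7, 25, 1, 30, tzinfo=tz)  # 1:30 AM local time

# The day bucket for this hit starts at local midnight...
day_bucket = hit.replace(hour=0, minute=0, second=0, microsecond=0)

# ...which is 4:00 AM in UTC (New York is UTC-4 in July, during EDT).
print(day_bucket.astimezone(timezone.utc))  # 2017-07-25 04:00:00+00:00
```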


Algorithm copied from tonyskn/node-redis-timeseries

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.


0.1.6 (2017-07-25)

  • Add timezone so day buckets will start at midnight in the correct timezone

0.1.5 (2017-07-18)

  • Update default granularities

0.1.4 (2017-07-12)

  • Add float value capabilities
  • Add increase() and decrease() methods
  • Move get_hits() -> get_buckets() and get_total_hits() -> get_total()

0.1.3 (2017-03-30)

  • Remove six package
  • Clean up source file

0.1.2 (2017-03-30)

  • Make Python 3 compatible
  • Fix tox to make PyPy work

0.1.1 (2017-03-30)

  • Minor project file updates

0.1.0 (2017-03-30)

  • First release on PyPI.
