========
Overview
========
A distributed redis cache library, which solves the Thundering Herd problem.
* Free software: BSD license
Installation
============
::

    pip install thundercache
Documentation
=============
https://python-thundercache.readthedocs.io/
Development
===========
To run all the tests, run::

    tox
Note: to combine the coverage data from all the tox environments, run:

.. list-table::
    :widths: 10 90
    :stub-columns: 1

    - - Windows
      - ::

            set PYTEST_ADDOPTS=--cov-append
            tox

    - - Other
      - ::

            PYTEST_ADDOPTS=--cov-append tox
Usage
=====
.. code-block:: python

    # Distributed Lock
    from thundercache import LockFactory, retry_command
    from redis.sentinel import Sentinel

    sentinel = Sentinel()
    redis_sentinel_master_instance = retry_command(
        sentinel.master_for, "your_sentinel_service_name", socket_timeout=20)
    locks = LockFactory(expires=720, timeout=10,
                        redis=redis_sentinel_master_instance)

    with locks('my_lock'):
        # do stuff with a distributed redis lock across
        # different processes and networks
        # pretty cool, right?
        pass

    # Local Redis Cache
    from thundercache import SmartLocalRedisCacheFactory, BaseCacheMixin
    import time

    lcached = SmartLocalRedisCacheFactory()

    class MyClass(BaseCacheMixin):

        @lcached("method", max_age=10, critical=2)
        def method(self, n):
            time.sleep(n)
            return n * n

    @lcached("somefunc", max_age=10, critical=2)
    def somefunc(n):
        time.sleep(n)
        return n * n

    mc = MyClass()

    print(mc.method(3))
    # prints "9" after three seconds
    print(mc.method(3))
    # prints "9" instantly
    print(somefunc(2))
    # prints "4" after two seconds
    print(somefunc(2))
    # prints "4" instantly

    # Distributed Redis Cache
    from thundercache import SmartRedisCacheFactory, retry_command
    from redis.sentinel import Sentinel

    sentinel = Sentinel()
    cached = SmartRedisCacheFactory(sentinel, "your_sentinel_service_name")
    # you can now use the @cached decorator in the same way you use @lcached

    # Per-process cache
    from thundercache import BaseCache

    class MyClass(BaseCacheMixin):

        @BaseCache("mymethod", max_age=10)
        def mymethod(self, n):
            time.sleep(n)
            return n * n

    @BaseCache("otherfunc", max_age=10)
    def otherfunc(n):
        time.sleep(n)
        return n * n

    # You can also chain these decorators
    @BaseCache('x', 10)
    @cached('y', 60)
    def funct_or_method(*args, **kwargs):
        return None
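To make the caching semantics above concrete, here is a rough, standard-library-only sketch of the TTL-memoization idea behind these decorators. This is not thundercache's actual implementation; the ``ttl_cache`` name and the dict-based store are assumptions for illustration only.

```python
import functools
import time

def ttl_cache(max_age):
    """Minimal per-process memoizer: a result is reused until it is
    max_age seconds old, then recomputed on the next call."""
    def decorator(func):
        store = {}  # maps positional args -> (timestamp, value)

        @functools.wraps(func)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit is not None and now - hit[0] < max_age:
                return hit[1]  # still fresh: serve the cached value
            value = func(*args)  # stale or missing: recompute
            store[args] = (now, value)
            return value

        return wrapper
    return decorator

@ttl_cache(max_age=10)
def square(n):
    time.sleep(n)  # simulate an expensive computation
    return n * n

print(square(1))  # prints 1 after a one-second sleep
print(square(1))  # prints 1 instantly, served from the cache
```

Unlike thundercache's distributed variants, this sketch caches per process only and does nothing to stop several processes from recomputing the same value at once.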
Changelog
=========
0.1.2 (2017-02-23)
------------------
* First release on PyPI.