=============================
Nameko Cache Tools
=============================
.. image:: https://badge.fury.io/py/nameko-cachetools.png
    :target: http://badge.fury.io/py/nameko-cachetools

.. image:: https://travis-ci.org/santiycr/nameko-cachetools.png?branch=master
    :target: https://travis-ci.org/santiycr/nameko-cachetools
A few tools to cache interactions between your nameko services, increasing
resiliency and performance at the expense of consistency, when it makes sense.
To use Nameko Cache Tools in a project::
    from nameko.rpc import rpc
    from nameko_cachetools import CachedRpcProxy


    class Service(object):
        name = "demo"

        other_service = CachedRpcProxy('other_service')

        @rpc
        def do_something(self, request):
            # this rpc response will be cached first, then reused
            # according to the cache strategy of CachedRpcProxy or
            # CacheFirstRpcProxy
            return self.other_service.do_something('hi')
Caching strategies
------------------
CachedRpcProxy
^^^^^^^^^^^^^^
If a cached version of this request exists, a response from the cache is
sent instead of hanging forever or raising an exception.

If a cached version doesn't exist, it will behave like a normal rpc
and wait indefinitely for a reply. All successful replies are cached.

**WARNING**: Do NOT use this for setters, i.e. RPCs meant to modify state
in the target service.
Arguments:

cache
    The cache to use. This should resemble a dict but can be more
    sophisticated, like the caches provided by the cachetools package.

failover_timeout
    If a cached version of this query exists, how long (in seconds)
    your original request should wait before it deems the target
    service unresponsive and falls back to a cached response.
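The failover behaviour can be sketched in plain Python. This is an
illustrative model of the semantics described above, not the library's
actual implementation; ``cached_call``, ``call_service`` and
``ServiceUnresponsive`` are hypothetical names::

    class ServiceUnresponsive(Exception):
        """Raised when the target service does not reply in time."""


    def cached_call(cache, key, call_service, failover_timeout=None):
        """Model of CachedRpcProxy semantics: wait for the live service,
        but fall back to a cached reply once failover_timeout expires."""
        try:
            # In the real proxy this is an RPC with a reply timeout; here
            # call_service simulates it and may raise ServiceUnresponsive.
            # Without a cached entry there is no failover, so wait forever
            # (timeout=None), as a normal rpc would.
            timeout = failover_timeout if key in cache else None
            reply = call_service(timeout=timeout)
        except ServiceUnresponsive:
            if key in cache:
                # Service unresponsive: serve the (possibly stale) cached reply.
                return cache[key]
            raise
        # Every successful reply refreshes the cache.
        cache[key] = reply
        return reply

With this model, a successful call populates the cache, and a later call
that times out is answered from it instead of failing.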
CacheFirstRpcProxy
^^^^^^^^^^^^^^^^^^
Stores responses from the original service and keeps them cached.

If further requests come in with the same arguments and are found in the
cache, a response from the cache is sent instead of hitting the destination
service.

**WARNING**: Do NOT use this for setters, i.e. RPCs meant to modify state
in the target service.
Arguments:

cache
    The cache to use. This should resemble a dict but can be more
    sophisticated, like the caches provided by the cachetools package.
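The cache-first lookup can likewise be sketched in plain Python. Again an
illustrative model of the semantics, not the library's implementation;
``cache_first_call`` and ``call_service`` are hypothetical names::

    def cache_first_call(cache, key, call_service):
        """Model of CacheFirstRpcProxy semantics: answer from the cache
        when possible; only hit the target service on a cache miss."""
        if key in cache:
            return cache[key]
        reply = call_service()
        cache[key] = reply
        return reply

With a plain dict, entries live forever; passing a bounded cache such as
``cachetools.TTLCache`` instead lets stale entries expire so the target
service is consulted again.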
Documentation
-------------
The full documentation is at http://nameko-cachetools.rtfd.org.
History
-------
0.1.0 (2018-06-10)
++++++++++++++++++
* First release on PyPI.