
multi backend cache

Project description

A cache supporting multiple backends (memory, redis). A synchronous library based on aiocache.


This library aims for simplicity over specialization. All caches share the same minimum interface, which consists of the following functions (a quick tour follows the list):

  • add: Only adds key/value if key does not exist.

  • get: Retrieves the value identified by key.

  • set: Sets key/value.

  • multi_get: Retrieves multiple key/values.

  • multi_set: Sets multiple key/values.

  • exists: Returns True if key exists, False otherwise.

  • increment: Increments the value stored in the given key.

  • delete: Deletes key and returns number of deleted items.

  • clear: Clears the items stored.

  • raw: Executes the specified command using the underlying client.
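
For example, here is a quick tour of that interface against the in-memory backend. This is only a sketch: the method names follow the list above and mirror aiocache's API, so the exact signatures in pycached may differ slightly.

from pycached import Cache

cache = Cache(Cache.MEMORY)

cache.set("a", 1)                         # plain set
cache.add("d", 4)                         # only writes because "d" does not exist yet
cache.multi_set([("b", 2), ("c", 3)])     # several key/value pairs at once
print(cache.multi_get(["a", "b", "c"]))   # [1, 2, 3]
print(cache.exists("a"))                  # True
cache.increment("a", 1)                   # "a" is now 2
cache.delete("a")                         # returns the number of deleted items (1)
cache.clear()                             # wipes everything stored in this namespace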

Installing

  • pip install pycached

  • pip install pycached[redis]

  • pip install pycached[msgpack]

Usage

Using a cache is as simple as

>>> from pycached import Cache
>>> cache = Cache(Cache.MEMORY) # Here you can also use Cache.REDIS; the default is Cache.MEMORY
>>> cache.set('key', 'value')
True
>>> cache.get('key')
'value'

Or as a decorator

import time

from collections import namedtuple

from pycached import cached, Cache, RedisCache
from pycached.serializers import PickleSerializer
# With this we can store python objects in backends like Redis!

Result = namedtuple('Result', "content, status")


@cached(ttl=10, cache=RedisCache, key="key", serializer=PickleSerializer(), port=6379, namespace="main")
def cached_call():
    print("Sleeping for three seconds zzzz.....")
    time.sleep(3)
    return Result("content", 200)


def run():
    cached_call()
    cached_call()
    cached_call()
    cache = Cache(Cache.REDIS, endpoint="127.0.0.1", port=6379, namespace="main")
    cache.delete("key")

if __name__ == "__main__":
    run()

The recommended approach to instantiate a new cache is using the Cache constructor. However, you can also instantiate one directly with pycached.RedisCache or pycached.SimpleMemoryCache.
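
For example, a minimal direct-instantiation sketch (the constructor arguments shown here are assumptions inferred from the alias configuration below, not a verified signature):

from pycached import RedisCache, SimpleMemoryCache
from pycached.serializers import JsonSerializer

memory_cache = SimpleMemoryCache(namespace="main")
redis_cache = RedisCache(endpoint="127.0.0.1", port=6379, namespace="main",
                         serializer=JsonSerializer())

redis_cache.set("key", {"a": 1})
assert redis_cache.get("key") == {"a": 1}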

You can also set up cache aliases so it's easy to reuse configurations:

from pycached import caches

# You can use either classes or strings for referencing classes
caches.set_config({
    'default': {
        'cache': "pycached.SimpleMemoryCache",
        'serializer': {
            'class': "pycached.serializers.StringSerializer"
        }
    },
    'redis_alt': {
        'cache': "pycached.RedisCache",
        'endpoint': "127.0.0.1",
        'port': 6379,
        'timeout': 1,
        'serializer': {
            'class': "pycached.serializers.PickleSerializer"
        },
        'plugins': [
            {'class': "pycached.plugins.HitMissRatioPlugin"},
            {'class': "pycached.plugins.TimingPlugin"}
        ]
    }
})


def default_cache():
    cache = caches.get('default')   # This always returns the SAME instance
    cache.set("key", "value")
    assert cache.get("key") == "value"


def alt_cache():
    cache = caches.create('redis_alt')   # This creates a NEW instance on every call
    cache.set("key", "value")
    assert cache.get("key") == "value"


def test_alias():
    default_cache()
    alt_cache()

    caches.get('redis_alt').delete("key")


if __name__ == "__main__":
    test_alias()

How does it work

Pycached provides 3 main entities:

  • backends: Allow you to specify which backend you want to use for your cache. Currently supported: SimpleMemoryCache, and RedisCache using redis.

  • serializers: Serialize and deserialize the data between your code and the backends. This allows you to save any Python object in your cache. Currently supported: StringSerializer, PickleSerializer, JsonSerializer, and MsgPackSerializer. You can also build custom ones (see the sketch right after this list).

  • plugins: Implement a hooks system that allows you to execute extra behavior before and after each command.
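
As a rough illustration of a custom serializer (assuming pycached keeps aiocache's convention of subclassing BaseSerializer and implementing dumps/loads; the class name here is made up):

import zlib

from pycached import Cache
from pycached.serializers import BaseSerializer


class CompressionSerializer(BaseSerializer):
    # Compress values on their way into the backend, decompress them on the way out.

    def dumps(self, value):
        return zlib.compress(value.encode())

    def loads(self, value):
        if value is None:
            return None
        return zlib.decompress(value).decode()


cache = Cache(Cache.MEMORY, serializer=CompressionSerializer())
cache.set("key", "x" * 1000)
assert cache.get("key") == "x" * 1000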

If you are missing a backend, serializer, or plugin implementation that you think would be interesting for the package, do not hesitate to open a new issue.

Architecture diagram: docs/images/architecture.png

Those 3 entities combine during some of the cache operations to apply the desired command (backend), data transformation (serializer), and pre/post hooks (plugins). To get a better picture of what happens, here you can check how the set function works in pycached:

Set operation flow: docs/images/set_operation_flow.png
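
To make the hook flow concrete, here is a hedged sketch of a plugin that times set calls, assuming pycached keeps aiocache's BasePlugin and its pre_<command>/post_<command> hook naming in a synchronous form (the plugin class here is made up):

import time

from pycached import Cache
from pycached.plugins import BasePlugin


class TimingLogPlugin(BasePlugin):
    # pre_set runs right before the backend command, post_set right after it.

    def pre_set(self, client, key, value, *args, **kwargs):
        self._start = time.monotonic()

    def post_set(self, client, key, value, *args, **kwargs):
        print("set(%r) took %.6f seconds" % (key, time.monotonic() - self._start))


cache = Cache(Cache.MEMORY, plugins=[TimingLogPlugin()])
cache.set("key", "value")   # the plugin prints timing around this call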

Amazing examples

In the examples folder you can check different use cases.


