
Caching tool for Python

Project description

Caching tool for Python that works with both a single Redis instance and Redis Cluster mode.


Installation

 $ pip install cache-house 

or with Poetry:

poetry add cache-house

Quick Start


The cache decorator works with both async and sync functions:

from cache_house.backends import RedisFactory
from cache_house.cache import cache
import asyncio

RedisFactory.init()

@cache() # default expire time is 180 seconds
async def test_cache(a: int,b: int):
    print("async cached")
    return [a,b]

@cache()
def test_cache_1(a: int, b: int):
    print("cached")
    return [a, b]


if __name__ == "__main__":
    print(test_cache_1(3,4))
    print(asyncio.run(test_cache(1,2)))

Check the stored cache keys:

 $ rdcli KEYS "*"
1) cachehouse:main:8f65aed1010f0062a783c83eb430aca0
2) cachehouse:main:f665833ea64e4fc32653df794257ca06
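The keys follow the pattern `<key_prefix>:<namespace>:<hash>` (by default, `cachehouse:main:<hash>`). As an illustration only, not the library's actual implementation, a key builder of this shape can be sketched by hashing the function's identity and arguments; the helper name `build_key` is hypothetical:

```python
import hashlib

def build_key(func_module: str, func_name: str, args: tuple, kwargs: dict,
              key_prefix: str = "cachehouse", namespace: str = "main") -> str:
    """Hypothetical sketch: hash the call signature into a fixed-size key."""
    raw = f"{func_module}.{func_name}:{args}:{sorted(kwargs.items())}"
    digest = hashlib.md5(raw.encode()).hexdigest()
    return f"{key_prefix}:{namespace}:{digest}"

key = build_key("main", "test_cache", (1, 2), {})
print(key)  # cachehouse:main:<32-char hex digest>
```

Because the key is a digest of the call arguments, each distinct argument combination gets its own cache entry.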

Setup Redis cache instance


You can pass any redis-py argument to the RedisFactory.init method, along with these additional arguments:

RedisFactory.init(
        host: str = "localhost",
        port: int = 6379,
        encoder: Callable[..., Any] = ...,
        decoder: Callable[..., Any] = ...,
        namespace: str = ...,
        key_prefix: str = ...,
        key_builder: Callable[..., Any] = ...,
        password: str = ...,
        db: int = ...,
        cluster_mode: bool = False,
        **redis_kwargs,
    )

Or you can set your own encoder and decoder functions:

import json

from cache_house.backends import RedisFactory

def custom_encoder(data):
    return json.dumps(data)

def custom_decoder(data):
    return json.loads(data)

RedisFactory.init(encoder=custom_encoder, decoder=custom_decoder)

The default encoder and decoder are based on the pickle module.
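A minimal sketch of what pickle-based encoding and decoding looks like (illustrative, not the library's exact code):

```python
import pickle

def pickle_encoder(data):
    # Serialize any picklable Python object to bytes for storage in Redis.
    return pickle.dumps(data)

def pickle_decoder(data):
    # Restore the original Python object from the stored bytes.
    return pickle.loads(data)

value = {"a": 1, "b": [2, 3]}
assert pickle_decoder(pickle_encoder(value)) == value
```

Pickle handles most Python objects out of the box, which is why it makes a convenient default; swap in JSON-based functions when the cached values must be readable by non-Python clients.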


Setup Redis Cluster cache instance


A Redis Cluster instance is used the same way as a single Redis instance:

from cache_house.backends import RedisFactory
from cache_house.cache import cache

RedisFactory.init(cluster_mode=True)

@cache()
async def test_cache(a: int,b: int):
    print("cached")
    return [a,b]
The full RedisFactory.init signature, including the cluster-specific arguments:

RedisFactory.init(
        host="localhost",
        port=6379,
        encoder: Callable[..., Any] = pickle_encoder,
        decoder: Callable[..., Any] = pickle_decoder,
        namespace: str = DEFAULT_NAMESPACE,
        key_prefix: str = DEFAULT_PREFIX,
        key_builder: Callable[..., Any] = key_builder,
        cluster_mode: bool = False,
        startup_nodes=None,
        cluster_error_retry_attempts: int = 3,
        require_full_coverage: bool = True,
        skip_full_coverage_check: bool = False,
        reinitialize_steps: int = 10,
        read_from_replicas: bool = False,
    )

Setup cache instance with FastAPI


import logging
import uvicorn
from fastapi.applications import FastAPI
from cache_house.backends import RedisFactory
from cache_house.cache import cache

app = FastAPI()


@app.on_event("startup")
async def startup():
    print("app started")
    RedisFactory.init()


@app.on_event("shutdown")
async def shutdown():
    print("SHUTDOWN")
    RedisFactory.close_connections()

@app.get("/notcached")
async def test_route():
    print("notcached")
    return {"hello": "world"}


@app.get("/cached")
@cache()
async def test_route_cached():
    print("cached")  # on repeat requests this print does not run: the cached response is returned
    return {"hello": "world"}

if __name__ == "__main__":
    uvicorn.run(app, port=8033)

You can set the expire time (in seconds), namespace, and key prefix in the cache decorator:


@cache(expire=30, namespace="app", key_prefix="test") 
async def test_cache(a: int,b: int):
    print("cached")
    return [a,b]


if __name__ == "__main__":
    print(asyncio.run(test_cache(1,2)))

Check the stored cache keys:

rdcli KEYS "*"
1) test:app:f665833ea64e4fc32653df794257ca06

If your function works with non-standard data types, you can pass custom encoder and decoder functions to the cache decorator.


import asyncio
import json
from cache_house.backends import RedisFactory
from cache_house.cache import cache

RedisFactory.init()

def custom_encoder(data):
    return json.dumps(data)

def custom_decoder(data):
    return json.loads(data)

@cache(expire=30, encoder=custom_encoder, decoder=custom_decoder, namespace="custom")
async def test_cache(a: int, b: int):
    print("async cached")
    return {"a": a, "b": b}


@cache(expire=30)
def test_cache_1(a: int, b: int):
    print("cached")
    return [a, b]


if __name__ == "__main__":
    print(asyncio.run(test_cache(1, 2)))
    print(test_cache_1(3, 4))

Check the stored cache keys:

rdcli KEYS "*"
1) cachehouse:main:8f65aed1010f0062a783c83eb430aca0
2) cachehouse:custom:f665833ea64e4fc32653df794257ca06
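For example, if a cached function returns values the default JSON round-trip cannot handle, such as datetime objects, an encoder along these lines could be passed to the decorator. The function names here are illustrative, not part of the library:

```python
import json
from datetime import datetime

def datetime_encoder(data):
    # Serialize, converting datetime objects to ISO 8601 strings.
    return json.dumps(
        data,
        default=lambda o: o.isoformat() if isinstance(o, datetime) else str(o),
    )

def datetime_decoder(data):
    return json.loads(data)

payload = {"created": datetime(2022, 1, 1, 12, 0), "n": 1}
print(datetime_decoder(datetime_encoder(payload)))
# {'created': '2022-01-01T12:00:00', 'n': 1}
```

Note that decoding returns the ISO string rather than a datetime object; if the caller needs a real datetime back, the decoder would also have to parse known fields.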

All of the examples work with both a single Redis instance and Redis Cluster.


Contributing

Feel free to open issues and send pull requests.

cache-house supports Python >= 3.7

Project details


Download files

Download the file for your platform.

Source Distribution

cache_house-0.2.2.tar.gz (7.0 kB)

Uploaded Source

Built Distribution

cache_house-0.2.2-py3-none-any.whl (7.6 kB)

Uploaded Python 3

File details

Details for the file cache_house-0.2.2.tar.gz.

File metadata

  • Download URL: cache_house-0.2.2.tar.gz
  • Upload date:
  • Size: 7.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.11 CPython/3.8.9 Darwin/21.2.0

File hashes

Hashes for cache_house-0.2.2.tar.gz
Algorithm Hash digest
SHA256 28b0d7cda39b034f18f607459b8033fe72e578d3d1c9ebb9bb931bda1113ceba
MD5 6a35d1cb9263a1bfd3d821d31fb185f7
BLAKE2b-256 73a8c5e6aa461ee215529fbebe40f9465f22c7ad9965d107ff025182dc1601cf


File details

Details for the file cache_house-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: cache_house-0.2.2-py3-none-any.whl
  • Upload date:
  • Size: 7.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.11 CPython/3.8.9 Darwin/21.2.0

File hashes

Hashes for cache_house-0.2.2-py3-none-any.whl
Algorithm Hash digest
SHA256 86282d0f5bba5ea073e3c80f954ba45d91070bfb731b99e1abd7d96f11988698
MD5 c527163bfaaa0bd801075b7ebbce8c2e
BLAKE2b-256 8802879f42859ba7b40c0763ce1435930dd724880aa18fe58e8f4d8e22326ca0

