# dcache

Distributed Cache for Humans

Documentation: https://dcache.readthedocs.io.
## Installation

```shell
pip install dcache
```
## How to Use

TODO
Contributing
Contributions are welcome, feel free to open an Issue or Pull Request.
Pull requests must be for the develop branch.
git clone https://github.com/HBN3tw0rk/dcache
cd dcache
git checkout develop
python -m venv .venv
pip install -r requirements_dev.txt
pre-commit install
pytest
## Pitch (Portuguese)

### What it is

- distributed cache for humans
- simple API like `lru_cache`
- multiple backends
- easy to switch backends
- good documentation
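The "simple API like `lru_cache`" goal refers to Python's built-in `functools.lru_cache`, which has the same decorator shape but only caches inside the local process. A minimal sketch of that baseline for comparison:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def slow_function(n):
    # Expensive computation; the decorator caches the result per argument.
    return n ** 1000

slow_function(2)  # computed
slow_function(2)  # served from the in-process cache
```

dcache aims to keep this decorator ergonomics while letting the cached values live in a shared backend instead of one process's memory.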
### API

#### dcache

```python
from dcache import dcache

@dcache
def slow_function(n):
    return n ** 1000
```
#### dcache vs redis

Caching by hand with Redis means repeating the same get/compute/set boilerplate in every function:

```python
import redis

client = redis.Redis(host='localhost', port=6379, db=0)

def slow_function(n):
    cached = client.get(n)
    if cached is not None:
        return int(cached)  # Redis returns bytes
    value = n ** 1000
    client.set(n, value)
    return value

def slow_function2(n):
    cached = client.get(n)
    if cached is not None:
        return int(cached)
    value = n ** 1000
    client.set(n, value)
    return value
```
With dcache, the backend is configured once and each function only needs the decorator:

```python
from dcache import dcache
from dcache.backends import RedisBackend

cache = dcache(RedisBackend(host='localhost', port=6379, db=0))

@cache
def slow_function(n):
    return n ** 1000

@cache
def slow_function2(n):
    return n ** 1000
```
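What such a decorator does can be sketched in a few lines — a hypothetical illustration using a plain dict in place of Redis, not dcache's actual internals:

```python
import functools

def cached(store):
    """Return a decorator that checks `store` before calling the function."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(n):
            key = (func.__name__, n)
            if key in store:           # cache hit: skip the computation
                return store[key]
            value = func(n)            # cache miss: compute and remember
            store[key] = value
            return value
        return wrapper
    return decorator

backend = {}  # stand-in for a Redis connection

@cached(backend)
def slow_function(n):
    return n ** 1000
```

Swapping the dict for a Redis client is then only a change in `store` — the switch that dcache's backend objects make explicit.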
#### real example

Caching by hand:

```python
def process(id, input):
    cache_path = get_content_cache_path(id, input)
    if resource.file_exist(cache_path):
        return resource.get_json(cache_path)
    response = slow_function(id, input)
    resource.put_json(body=response, file_path=cache_path)
    return response
```

With dcache:

```python
from dcache import dcache
from dcache.backends import S3Backend

@dcache(S3Backend())
def process(id, input):
    return slow_function(id, input)
```
### Ideas

- integration tests using containers

#### multiple backends

```python
from dcache import dcache
from dcache.backends import InMemoryBackend, RedisBackend

@dcache(multiple=[
    InMemoryBackend(),
    RedisBackend(host='localhost', port=6379, db=0),
])
def slow_function(n):
    return n ** 1000
```

Lookup order:

1. Search the in-memory cache; if the value exists, return it.
2. If not, search Redis; if the value exists there, save it in memory and return it.
3. If it is not in Redis either, run `slow_function`, save the result in Redis and in memory, and return it.

Each step returns as soon as it finds a value, so `slow_function` never runs when any backend already holds the result.
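The lookup order above amounts to a tiered cache. A hypothetical sketch of the proposed `multiple=` behavior, with two dicts standing in for the in-memory and Redis backends:

```python
def tiered_get(key, compute, tiers):
    """Check each tier in order; on a hit, backfill the faster tiers.
    On a total miss, compute the value once and write it to every tier."""
    for i, tier in enumerate(tiers):
        if key in tier:
            value = tier[key]
            for faster in tiers[:i]:   # save in the faster tiers
                faster[key] = value
            return value
    value = compute()                  # run the slow function once
    for tier in tiers:
        tier[key] = value
    return value

memory, redis_like = {}, {}  # stand-ins for InMemoryBackend and RedisBackend
result = tiered_get("slow:10", lambda: 10 ** 1000, [memory, redis_like])
```

After the call, both tiers hold the value; a later process with an empty `memory` but a warm `redis_like` would hit tier two and backfill tier one without recomputing.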
### MVP

- in memory

### Roadmap

- backends: Redis, Memcached, Filesystem, database, S3, etc.
- multiple backends
- plugins
## Changelog

### (unreleased)

- Add InMemory backend

### 0.0.1 (2022-07-30)

- First release on PyPI.