
dcache

Distributed Cache for Humans

Installation

pip install dcache

How to Use

  • TODO
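
A minimal usage sketch, based on the decorator API shown in the API section below (the in-memory backend listed in the changelog is assumed here to be the default):

from dcache import dcache

@dcache
def slow_function(n):
    return n ** 1000

slow_function(10)  # first call: computed and stored
slow_function(10)  # second call: served from the cache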

Contributing

Contributions are welcome, feel free to open an Issue or Pull Request.

Pull requests must target the develop branch.

git clone https://github.com/HBN3tw0rk/dcache
cd dcache
git checkout develop
python -m venv .venv
source .venv/bin/activate
pip install -r requirements_dev.txt
pre-commit install
pytest

Pitch (Portuguese)

What it is

  • distributed cache for humans

  • simple API like lru_cache

  • multiple backends

  • easy to switch the backend

  • good documentation

API

dcache

from dcache import dcache

@dcache
def slow_function(n):
    return n ** 1000

dcache vs redis

import redis

client = redis.Redis(host='localhost', port=6379, db=0)

def slow_function(n):
    cached = client.get(n)
    if cached is not None:
        return int(cached)  # redis returns bytes, so convert back
    value = n ** 1000
    client.set(n, value)
    return value

def slow_function2(n):
    cached = client.get(n)
    if cached is not None:
        return int(cached)
    value = n ** 1000
    client.set(n, value)
    return value

from dcache import dcache
from dcache.backends import RedisBackend

cache = dcache(RedisBackend(host='localhost', port=6379, db=0))

@cache
def slow_function(n):
    return n ** 1000

@cache
def slow_function2(n):
    return n ** 1000
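
The comparison is the pitch in miniature: with dcache the get/set boilerplate lives in the backend, so each cached function keeps only its own computation, and switching backends does not touch the function bodies.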

real example

def process(id, input):
    cache_path = get_content_cache_path(id, input)

    if resource.file_exist(cache_path):
        return resource.get_json(cache_path)

    response = slow_function(id, input)
    resource.put_json(body=response, file_path=cache_path)
    return response

from dcache import dcache
from dcache.backends import S3Backend

@dcache(S3Backend())
def process(id, input):
    return slow_function(id, input)

Ideas

  • integration tests using containers

multiple backends

from dcache import dcache
from dcache.backends import InMemoryBackend, RedisBackend

@dcache(multiple=[
    InMemoryBackend(),
    RedisBackend(host='localhost', port=6379, db=0),
])
def slow_function(n):
    return n ** 1000

  1. look the key up in the in-memory cache;

  2. if it is there, return it; if not, look it up on Redis;

    • if it is on Redis, save it in memory and return it;

    • if it is not on Redis either, run slow_function, save the result on Redis and in memory, and return it (see the sketch below);

  • slow_function itself is not run when a cached value is returned.
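
A conceptual sketch of that lookup order, assuming only that a backend exposes get and set operations (this is not dcache's implementation; the helper name and interface are illustrative):

def lookup(key, backends, compute):
    # Try each backend in order; earlier backends are the faster caches.
    missed = []
    for backend in backends:
        value = backend.get(key)
        if value is not None:
            # Backfill the faster caches that missed, then return.
            for faster in missed:
                faster.set(key, value)
            return value
        missed.append(backend)
    # Nothing cached anywhere: run the slow function once, store it everywhere.
    value = compute()
    for backend in backends:
        backend.set(key, value)
    return value

For the example above, backends would be the in-memory and Redis backends in that order, and compute would be the wrapped slow_function call.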

MVP

  • in memory

Roadmap

  • backends: Redis, Memcached, Filesystem, database, S3, etc.

  • multiple backends

  • plugins
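
To make the plugin idea concrete, here is a sketch of what a custom filesystem backend could look like, assuming a backend only needs get and set methods (the real backend interface is not documented here, so the class and method names are illustrative):

import hashlib
import json
from pathlib import Path

class FileBackend:
    # Illustrative filesystem backend: one JSON file per cache key.

    def __init__(self, directory="~/.dcache"):
        self.directory = Path(directory).expanduser()
        self.directory.mkdir(parents=True, exist_ok=True)

    def _path(self, key):
        digest = hashlib.sha256(repr(key).encode()).hexdigest()
        return self.directory / f"{digest}.json"

    def get(self, key):
        path = self._path(key)
        if path.exists():
            return json.loads(path.read_text())
        return None

    def set(self, key, value):
        self._path(key).write_text(json.dumps(value))

A backend like this could then be passed to the decorator in the same way as the Redis and S3 examples above.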

Changelog

(unreleased)

  • Add InMemory backend

0.0.1 (2022-07-30)

  • First release on PyPI.
