Project description

Status

WORK IN PROGRESS

Caching

Python utils and decorators for caching with TTL, maxsize and file-based storage.

Installation

pip install caching

Usage

from caching import Cache

# File-based cache with unlimited ttl and maximum of 128 cached results
@Cache(ttl=-1, maxsize=128, filepath='/tmp/mycache')
def long_running_function(a, b, *args, c=None, **kwargs):
    pass

# Memory-based cache with limited ttl and maxsize and "least recently used"
# cache replacement policy.
@Cache(ttl=60, maxsize=128, policy='LRU')
def long_running_function(a, b, *args, c=None, **kwargs):
    pass

Advanced usage

from caching import Cache

# One cache for many functions

cache = Cache(filepath='/tmp/mycache', ttl=3600, maxsize=1024)

@cache
def pow(x, y):
    return x**y

@cache
def factorial(n):
    if n == 0:
        return 1
    return n * factorial(n-1)


# Caching the last result and returning it only in case of errors

@Cache(maxsize=1, only_on_errors=(ConnectionError, TimeoutError))
def api_request():
    """Request some remote resource which sometimes become unavailable.
    If this functions raises ConnectionError or TimeoutError, then the
    last cached result will be returned, if available."""
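
# A minimal, hypothetical sketch of the fallback behaviour described in the
# docstring above. It assumes only_on_errors works as documented: when the
# decorated function raises one of the listed exceptions, the last cached
# result is returned instead of the error propagating.
request_count = 0

@Cache(maxsize=1, only_on_errors=(ConnectionError,))
def flaky_request():
    global request_count
    request_count += 1
    if request_count > 1:
        raise ConnectionError('remote unavailable')
    return 'payload'

assert flaky_request() == 'payload'  # first call succeeds and its result is cached
assert flaky_request() == 'payload'  # the second call raises; the cached result is returned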


# Custom cache key function

@Cache(key=lambda x: x[0])
def toupper(a):
    global call_count
    call_count += 1
    return str(a).upper()

call_count = 0

# The key function returns the same result for both 'aaa' and 'azz'
# so the cached result from the first call is returned in the second call
assert toupper('aaa') == toupper('azz') == 'AAA'
assert call_count == 1


# Using cache as a key-value store

cache = Cache()

try:
    result = cache[1]
except KeyError:
    result = calculate_result(1)  # calculate_result() is a placeholder for any expensive computation
    cache[1] = result
    assert 1 in cache
    assert cache[1] == result
    assert cache.get(1, None) == result
    assert cache.get(2, None) is None

# Cleanup

import os

cache = Cache(filepath='/tmp/mycache')
cache[1] = 'one'
assert 1 in cache
cache.clear()  # empty the cache
assert 1 not in cache
assert list(cache.items()) == []
assert os.path.isfile('/tmp/mycache')
cache.remove()  # Empty the cache and remove the underlying file
assert not os.path.isfile('/tmp/mycache')

Features

  • [x] Memory and file based cache.

  • [x] TTL and maxsize.

  • [x] Works with *args, **kwargs.

  • [x] Works with mutable function arguments of the following types: dict, list, set (see the sketch after this list).

  • [x] FIFO, LRU and LFU cache replacement policies.

  • [x] Customizable cache key function.

  • [ ] Multiprocessing- and thread-safe.

  • [ ] Pluggable external caching backends (see Redis example).
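
A minimal sketch of two features from the list above: caching a function that takes a mutable dict argument, and the "least frequently used" replacement policy. It mirrors the decorator usage shown earlier; treating policy='LFU' and dict hashing exactly this way is an assumption based on the feature list, not verified output.

from caching import Cache

call_count = 0

# Hypothetical example: the dict argument becomes part of the cache key, and
# the least frequently used entry is evicted once maxsize is reached.
@Cache(ttl=300, maxsize=2, policy='LFU')
def summarize(options):
    global call_count
    call_count += 1
    return sorted(options.items())

assert summarize({'a': 1, 'b': 2}) == [('a', 1), ('b', 2)]
assert summarize({'a': 1, 'b': 2}) == [('a', 1), ('b', 2)]
# If dict arguments are hashed into the cache key as the feature list says,
# the second call is served from the cache and the function body runs once.
assert call_count == 1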
