Persistent cache for Python cachetools.
Project description
Shelved Cache
Persistent cache implementation for Python cachetools.
Behaves like any Cache implementation, but entries are persisted to disk.
Original repository: https://github.com/mariushelf/shelved_cache
Usage example
from shelved_cache import PersistentCache
from cachetools import LRUCache
filename = 'mycache'
# wrap an LRUCache with persistence
pc = PersistentCache(LRUCache, filename=filename, maxsize=2)
# we can now use the cache like a normal LRUCache.
# But: the cache is persisted to disk.
pc["a"] = 42
pc["b"] = 43
assert pc["a"] == 42
assert pc["b"] == 43
# close the file
pc.close()
# Now in the same script or in another script, we can re-load the cache:
pc2 = PersistentCache(LRUCache, filename=filename, maxsize=2)
assert pc2["a"] == 42
assert pc2["b"] == 43
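The persistence shown above rests on a simple idea: a dict-like store whose entries are written to disk and survive process restarts. The concept can be sketched with the stdlib shelve module; this is only an illustration of the idea, not the package's actual implementation, and the path name is made up:

```python
import os
import shelve
import tempfile

# Hypothetical illustration using only the stdlib: a shelve file behaves
# like a dict whose entries are written to disk, similar in spirit to how
# PersistentCache keeps cache entries across runs.
path = os.path.join(tempfile.mkdtemp(), "demo_cache")

with shelve.open(path) as db:
    db["a"] = 42          # stored on disk, not only in memory

# "Another script": re-open the same file and the entry is still there.
with shelve.open(path) as db:
    restored = db["a"]

assert restored == 42
```

PersistentCache layers a cachetools eviction policy on top of this kind of on-disk store, which plain shelve does not provide.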
Use as a decorator
Just like a regular cachetools.Cache, a PersistentCache can be used with cachetools' cached decorator:
import cachetools
from shelved_cache import PersistentCache
from cachetools import LRUCache
filename = 'mycache'
pc = PersistentCache(LRUCache, filename, maxsize=2)
@cachetools.cached(pc)
def square(x):
    print("called")
    return x * x
assert square(3) == 9
# outputs "called"
assert square(3) == 9
# no output because the cache is used
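The mechanism behind such a decorator is straightforward: look the call up in the cache, and compute and store the result only on a miss. A minimal hand-rolled sketch (the name cached_sketch and the simplistic key are made up for illustration; cachetools builds keys with its hashkey function):

```python
import functools

def cached_sketch(cache):
    # Minimal sketch of what a cachetools-style `cached` decorator does:
    # look the call up in `cache`, compute and store on a miss.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args):
            key = args  # simplistic key for illustration only
            if key in cache:
                return cache[key]
            result = func(*args)
            cache[key] = result
            return result
        return wrapper
    return decorator

calls = []

@cached_sketch({})
def square(x):
    calls.append(x)  # record real invocations
    return x * x

assert square(3) == 9
assert square(3) == 9   # second call is served from the cache
assert calls == [3]     # the underlying function ran only once
```

With PersistentCache as the cache argument, the stored results additionally survive across program runs.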
Features
persistent cache
See usage examples above.
Async decorators
The package contains equivalents for cachetools' cached and cachedmethod decorators which support wrapping async methods. You can find them in the decorators submodule.
They support both synchronous and asynchronous functions and methods.
Examples:
from shelved_cache import cachedasyncmethod
from cachetools import LRUCache
class A:
    # decorate an async method:
    @cachedasyncmethod(lambda self: LRUCache(2))
    async def asum(self, a, b):
        return a + b

a = A()
# inside an async context (e.g. under asyncio.run):
assert await a.asum(1, 2) == 3

class S:
    # decorate a synchronous method:
    @cachedasyncmethod(lambda self: LRUCache(2))
    def sum(self, a, b):
        return a + b

s = S()
assert s.sum(1, 2) == 3
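Memoizing a coroutine follows the same pattern as the synchronous case, except that the wrapper must await the wrapped function before storing the result. A stdlib-only sketch (the name async_cached is made up for illustration and is not the package's API):

```python
import asyncio
import functools

def async_cached(cache):
    # Sketch of async memoization: await the coroutine on a miss,
    # store the result, then serve later calls from the cache.
    def decorator(func):
        @functools.wraps(func)
        async def wrapper(*args):
            if args in cache:
                return cache[args]
            result = await func(*args)
            cache[args] = result
            return result
        return wrapper
    return decorator

calls = []

@async_cached({})
async def asum(a, b):
    calls.append((a, b))
    return a + b

assert asyncio.run(asum(1, 2)) == 3
assert asyncio.run(asum(1, 2)) == 3
assert calls == [(1, 2)]   # the coroutine body ran only once
```

The package's decorators additionally handle the synchronous case and, via cachedmethod-style usage, per-instance caches.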
Support for lists as function arguments
Using the autotuple_hashkey function, list arguments are automatically converted to tuples so that they become hashable.
Example:
from cachetools import cached, LRUCache
from shelved_cache.keys import autotuple_hashkey
@cached(LRUCache(2), key=autotuple_hashkey)
def sum(values):
    return values[0] + values[1]
# fill cache
assert sum([1, 2]) == 3
# access cache
assert sum([1, 2]) == 3
Changelog
0.3.0
- add support for Python 3.10 and 3.11
- better error message when trying to use the same file for multiple caches
- CI/CD pipeline
- fixes for documentation
0.2.1
- improved error handling
Acknowledgements
- cachetools by Thomas Kemmer
- asyncache by hephex
License
Author: Marius Helf (helfsmarius@gmail.com)
License: MIT -- see LICENSE
File details
Details for the file shelved_cache-0.3.0.tar.gz.
File metadata
- Download URL: shelved_cache-0.3.0.tar.gz
- Upload date:
- Size: 6.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.11.2
File hashes
- SHA256: 0695a696a83509f4b64868c3416a3d9e7106e900c973fae94277387105b95ae0
- MD5: 0b4dba4777e4394a56bfaca71fc1be9b
- BLAKE2b-256: 0a1ed265e8e7b379822fb759996d539140630d7d58f629c8db23659d33b8ac73
File details
Details for the file shelved_cache-0.3.0-py3-none-any.whl.
File metadata
- Download URL: shelved_cache-0.3.0-py3-none-any.whl
- Upload date:
- Size: 7.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.11.2
File hashes
- SHA256: bc90a9f7752ef673e1734a2d95b9a3a7cd44c8c68ccdf724b1506b16d60099a9
- MD5: 9386e388eaafc289c0ea086626e2fadf
- BLAKE2b-256: a06f749df508d50b36c50baba31ccbe831a42ecda33ecb707b95667c6041ced1