A caching library for Python
Links
Project: https://github.com/dgilland/cacheout
Documentation: https://cacheout.readthedocs.io
TravisCI: https://travis-ci.org/dgilland/cacheout
Features
In-memory caching using dictionary backend
Cache manager for easily accessing multiple cache objects
Reconfigurable cache settings for runtime setup when using module-level cache objects
Maximum cache size enforcement
Default cache TTL (time-to-live) as well as custom TTLs per cache entry
Bulk set, get, and delete operations
Memoization decorators
Thread safe
Multiple cache implementations:
FIFO (First In, First Out)
LIFO (Last In, First Out)
LRU (Least Recently Used)
MRU (Most Recently Used)
LFU (Least Frequently Used)
RR (Random Replacement)
Roadmap
Layered caching (multi-level caching)
Cache event listener support (e.g. on-get, on-set, on-delete)
Regular expression support in cache get
Set-on-missing callback support in cache get
Cache statistics (e.g. cache hits/misses, cache frequency, etc.)
Requirements
Python >= 3.4
Quickstart
Install using pip:
pip install cacheout
Let’s start with some basic caching by creating a cache object:
from cacheout import Cache
cache = Cache()
By default, the cache object has a maximum size of 256 entries and TTL expiration turned off. These defaults can be set explicitly with:
import time

cache = Cache(maxsize=256, ttl=0, timer=time.time)  # defaults
Set a cache key using cache.set():
cache.set(1, 'foobar')
Get the value of a cache key with cache.get():
assert cache.get(1) == 'foobar'
Set the TTL (time-to-live) expiration per entry:
cache.set(3, {'data': {}}, ttl=1)
assert cache.get(3) == {'data': {}}
time.sleep(1)
assert cache.get(3) is None
Memoize a function where cache keys are generated from the called function parameters:
@cache.memoize()
def func(a, b):
pass
Provide a TTL for the memoized function and incorporate argument types into generated cache keys:
@cache.memoize(ttl=5, typed=True)
def func(a, b):
pass
# func(1, 2) has a different cache key than func(1.0, 2.0), whereas
# with "typed=False" (the default), they would have the same key.
Access the original memoized function:
@cache.memoize()
def func(a, b):
pass
func.uncached(1, 2)
Get a copy of the entire cache with cache.copy():
assert cache.copy() == {1: 'foobar'}
Delete a cache key with cache.delete():
cache.delete(1)
assert cache.get(1) is None
Clear the entire cache with cache.clear():
cache.clear()
assert len(cache) == 0
Perform bulk operations with cache.set_many(), cache.get_many(), and cache.delete_many():
cache.set_many({'a': 1, 'b': 2, 'c': 3})
assert cache.get_many(['a', 'b', 'c']) == {'a': 1, 'b': 2, 'c': 3}
cache.delete_many(['a', 'b', 'c'])
assert cache.size() == 0
Reconfigure the cache object after creation with cache.configure():
cache.configure(maxsize=1000, ttl=5 * 60)
Get keys, values, and items from the cache with cache.keys(), cache.values(), and cache.items():
cache.set_many({'a': 1, 'b': 2, 'c': 3})
assert list(cache.keys()) == ['a', 'b', 'c']
assert list(cache.values()) == [1, 2, 3]
assert list(cache.items()) == [('a', 1), ('b', 2), ('c', 3)]
Iterate over cache keys:
for key in cache:
print(key, cache.get(key))
# 'a' 1
# 'b' 2
# 'c' 3
Check whether a key exists with cache.has() or key in cache:
assert cache.has('a')
assert 'a' in cache
Manage multiple caches using CacheManager:
from cacheout import CacheManager
cacheman = CacheManager({'a': {'maxsize': 100},
                         'b': {'maxsize': 200, 'ttl': 900},
                         'c': {}})
cacheman['a'].set('key1', 'value1')
value = cacheman['a'].get('key1')
cacheman['b'].set('key2', 'value2')
assert cacheman['b'].maxsize == 200
assert cacheman['b'].ttl == 900
cacheman['c'].set('key3', 'value3')
cacheman.clear_all()
for name, cache in cacheman:
assert name in cacheman
assert len(cache) == 0
For more details, see the full documentation at https://cacheout.readthedocs.io.
Changelog
v0.7.0 (2018-02-22)
Changed default cache maxsize from 300 to 256. (breaking change)
Add Cache.memoize() decorator.
Add standalone memoization decorators:
memoize
fifo_memoize
lfu_memoize
lifo_memoize
lru_memoize
mru_memoize
rr_memoize
v0.6.0 (2018-02-05)
Add LIFOCache
Add FIFOCache as an alias of Cache.
v0.5.0 (2018-02-04)
Add LFUCache
Delete expired items before popping an item in Cache.popitem().
v0.4.0 (2018-02-02)
Add MRUCache
Add RRCache
Add Cache.popitem().
Rename Cache.expirations() to Cache.expire_times(). (breaking change)
Rename Cache.count() to Cache.size(). (breaking change)
Remove minimum argument from Cache.evict(). (breaking change)
v0.3.0 (2018-01-31)
Add LRUCache.
Add CacheManager.__repr__().
Make threading lock usage in Cache more fine-grained and eliminate redundant locking.
Fix missing thread-safety in Cache.__len__() and Cache.__contains__().
v0.2.0 (2018-01-30)
Rename Cache.setup() to Cache.configure(). (breaking change)
Add CacheManager class.
v0.1.0 (2018-01-28)
Add Cache class.