OneCache
Python cache for sync and async code.
The cache uses an LRU eviction algorithm and can optionally have a TTL.
Tested on Python 3.6 and 3.9 on Windows, macOS, and Linux (see the GitHub status badge); it should also work on the versions in between.
Usage
```python
import pytest

from onecache import CacheDecorator
from onecache import AsyncCacheDecorator


class Counter:
    def __init__(self, count=0):
        self.count = count


@pytest.mark.asyncio
async def test_async_cache_counter():
    """Test async cache, counter case."""
    counter = Counter()

    @AsyncCacheDecorator()
    async def mycoro(counter: Counter):
        counter.count += 1
        return counter.count

    assert 1 == (await mycoro(counter))
    assert 1 == (await mycoro(counter))


def test_cache_counter():
    """Test sync cache, counter case."""
    counter = Counter()

    @CacheDecorator()
    def mycoro(counter: Counter):
        counter.count += 1
        return counter.count

    assert 1 == mycoro(counter)
    assert 1 == mycoro(counter)
```
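The counter assertions above pass because the second call returns the memoized result instead of re-running the function body. A minimal sketch of that memoization idea (this is an illustration only, not OneCache's actual implementation):

```python
import functools


def simple_cache(func):
    # Minimal memoizing decorator keyed on the positional arguments.
    # Illustration of the caching idea only -- not OneCache's internals.
    cache = {}

    @functools.wraps(func)
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)  # body runs only on a cache miss
        return cache[args]

    return wrapper


class Counter:
    def __init__(self, count=0):
        self.count = count


@simple_cache
def increment(counter):
    counter.count += 1
    return counter.count


c = Counter()
assert increment(c) == 1
assert increment(c) == 1  # cached: the body ran only once
```

OneCache additionally handles coroutines, eviction, and TTLs; this sketch only shows why repeated calls with the same argument return the same value.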
The decorator classes support the following arguments:
- maxsize (int): maximum size of the cache. Default: 512
- ttl (int): time to expire in milliseconds; if None, entries do not expire. Default: None
- skip_args (bool): apply the cache as if the function had no arguments. Default: False
- cache_class (class): class to use for the cache instance. Default: LRUCache
- refresh_ttl (bool): when caching with a TTL, refresh the key's expiration timestamp on each access. Default: False

If the number of records exceeds maxsize, the cache drops the oldest (least recently used) entry.
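The eviction rules above can be sketched with a toy LRU cache built on `OrderedDict`. This is an assumption-laden illustration of the described behavior (`maxsize`, `ttl` in milliseconds, `refresh_ttl`), not OneCache's actual `LRUCache`:

```python
import time
from collections import OrderedDict


class TinyLRUCache:
    # Toy LRU cache with an optional TTL in milliseconds, illustrating
    # the parameters described above; not OneCache's actual LRUCache.
    def __init__(self, maxsize=512, ttl=None, refresh_ttl=False):
        self.maxsize = maxsize
        self.ttl = ttl
        self.refresh_ttl = refresh_ttl
        self._data = OrderedDict()  # key -> (value, expires_at)

    def _expires_at(self):
        return None if self.ttl is None else time.monotonic() + self.ttl / 1000

    def get(self, key, default=None):
        if key not in self._data:
            return default
        value, expires_at = self._data[key]
        if expires_at is not None and time.monotonic() >= expires_at:
            del self._data[key]  # entry expired: drop it
            return default
        self._data.move_to_end(key)  # mark as most recently used
        if self.refresh_ttl:
            self._data[key] = (value, self._expires_at())
        return value

    def set(self, key, value):
        self._data[key] = (value, self._expires_at())
        self._data.move_to_end(key)
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # evict the least recently used


cache = TinyLRUCache(maxsize=2)
cache.set("a", 1)
cache.set("b", 2)
cache.get("a")     # touch "a" so "b" becomes the oldest entry
cache.set("c", 3)  # exceeds maxsize: "b" is evicted
assert cache.get("b") is None
assert cache.get("a") == 1
```

Note that without `refresh_ttl=True`, an entry expires at a fixed time after it was written, no matter how often it is read.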
TODO
- LRU cache
Development
Install packages with pip-tools:

```shell
pip install pip-tools
pip-compile
pip-compile dev-requirements.in
pip-sync requirements.txt dev-requirements.txt
```
Contribute
- Fork
- Create a branch: `feature/your_feature`
- Commit, push, and open a pull request
Thanks :)