
C implementation of Python 3 functools.lru_cache

Project description

C implementation of Python 3's functools.lru_cache. Provides a 10-30x speedup over the standard library implementation and passes the standard library's test suite for lru_cache.

Provides two Least Recently Used (LRU) caching function decorators:

clru_cache - built-in (faster)
>>> from fastcache import clru_cache
>>> @clru_cache(maxsize=128, typed=False, state=None)
... def f(a, b):
...     return (a,) + (b,)
>>> type(f)
<class '_lrucache.cache'>
lru_cache - Python wrapper around clru_cache (slower)
>>> from fastcache import lru_cache
>>> @lru_cache(maxsize=128, typed=False, state=None)
... def f(a, b):
...     return (a,) + (b,)
>>> type(f)
<class 'function'>

(c)lru_cache(maxsize=128, typed=False, state=None)

Least-recently-used cache decorator.

If maxsize is set to None, the LRU features are disabled and the cache can grow without bound.
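The bounded-cache behaviour can be seen with the standard library decorator, whose semantics fastcache mirrors (the package passes the stdlib test suite); here maxsize=2, so the cache never holds more than two results:

```python
from functools import lru_cache

# Stdlib illustration of LRU eviction; clru_cache takes the same
# maxsize argument and behaves the same way.
@lru_cache(maxsize=2)
def square(x):
    return x * x

square(1)
square(2)
square(3)   # the entry for 1 is evicted once 2 and 3 fill the cache

info = square.cache_info()
print(info.currsize)   # currsize never exceeds maxsize
```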

If typed is True, arguments of different types will be cached separately. For example, f(3.0) and f(3) will be treated as distinct calls with distinct results.
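With the stdlib decorator (same semantics as clru_cache), typed=True makes the int and float calls miss separately, which cache_info confirms:

```python
from functools import lru_cache

# typed=True: 3 and 3.0 hash to distinct cache entries.
@lru_cache(maxsize=128, typed=True)
def identity(x):
    return x

identity(3)
identity(3.0)   # a second miss; with typed=False this would be a hit
print(identity.cache_info().misses)
```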

If state is a list, the items in the list will be incorporated into the argument hash.
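Conceptually, the state items join the key under which the arguments are cached, so mutating the list invalidates previously cached entries. The sketch below illustrates that idea in pure Python; `cached_with_state` is a hypothetical name for illustration only, not part of the fastcache API:

```python
# Hypothetical sketch: fold a mutable `state` list into the cache key.
# Changing the list's contents changes the key, so old entries no
# longer match and the function is recomputed.
def cached_with_state(state):
    def decorator(func):
        cache = {}
        def wrapper(*args):
            key = args + tuple(state)   # state items join the argument hash
            if key not in cache:
                cache[key] = func(*args)
            return cache[key]
        return wrapper
    return decorator

mode = ["fast"]

@cached_with_state(mode)
def compute(x):
    return (x, mode[0])

print(compute(1))     # computed with state "fast"
mode[0] = "slow"
print(compute(1))     # new key, so recomputed with state "slow"
```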

Arguments to the cached function must be hashable.
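This requirement exists because the arguments serve as dictionary keys; the stdlib decorator (same behaviour) raises TypeError for unhashable arguments:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def total(xs):
    return sum(xs)

try:
    total([1, 2, 3])        # a list is unhashable -> TypeError
except TypeError:
    print("lists cannot be cache keys")

print(total((1, 2, 3)))     # tuples are hashable, so this caches fine
```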

View the cache statistics named tuple (hits, misses, maxsize, currsize) with f.cache_info(). Clear the cache and statistics with f.cache_clear(). Access the underlying function with f.__wrapped__.
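Since fastcache passes the stdlib test suite, the introspection API matches functools.lru_cache, shown here with the stdlib decorator:

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def add(a, b):
    return a + b

add(1, 2)
add(1, 2)                   # second call is served from the cache

hits, misses, maxsize, currsize = add.cache_info()
print(hits, misses)         # 1 hit, 1 miss

add.cache_clear()           # empty the cache and reset statistics
print(add.cache_info().currsize)

print(add.__wrapped__(1, 2))  # bypass the cache via the raw function
```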



Download files

Download the file for your platform.

Source Distribution

fastcache-0.3.3.tar.gz (12.3 kB)

